Accelerating Stochastic Gradient Descent for Least Squares Regression.

COLT (2018)

Cited 141 | Views 138
Abstract
There is widespread sentiment that fast gradient methods (e.g. Nesterov's acceleration, conjugate gradient, heavy ball) are not effective for the purposes of stochastic optimization due to their instability and error accumulation. Numerous works have attempted to quantify these instabilities in the face of either statistical or non-statistical errors [Paige71, Proakis74, Polyak87, Greenbaum89, Roy and Shynk90, Sharma et al. 98, d'Aspremont08, Devolder et al. 13,14, Yuan et al. 16]. This work considers these issues for the special case of stochastic approximation for the least squares regression problem, and our main result refutes this conventional wisdom by showing that acceleration can be made robust to statistical errors. In particular, this work introduces an accelerated stochastic gradient method that provably achieves the minimax optimal statistical risk faster than stochastic gradient descent. Critical to the analysis is a sharp characterization of accelerated stochastic gradient descent as a stochastic process. We hope this characterization gives insights towards the broader question of designing simple and effective accelerated stochastic methods for more general convex and non-convex optimization problems.
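To make the streaming least squares setting concrete, the sketch below compares plain SGD against a generic Nesterov-momentum stochastic iteration on synthetic data. It is only an illustration of the problem class and of momentum-based stochastic updates; the dimensions, step sizes, momentum value, and noise level are assumptions, and this is not the specific accelerated estimator or parameter schedule analyzed in the paper.

```python
import numpy as np

# Minimal sketch: streaming least squares regression with plain SGD and a
# generic Nesterov-momentum stochastic iteration. All hyperparameters below
# are illustrative assumptions, not values prescribed by the paper.

rng = np.random.default_rng(0)
d, n = 20, 50_000
w_star = rng.normal(size=d)        # true regression vector (synthetic)
noise_std = 0.5                    # observation noise level (assumed)

def sample():
    """Draw one streaming example (x, y) with y = <w*, x> + noise."""
    x = rng.normal(size=d)
    y = x @ w_star + noise_std * rng.normal()
    return x, y

def sgd(lr=0.01):
    """Plain stochastic gradient descent on 0.5 * (<w, x> - y)^2."""
    w = np.zeros(d)
    for _ in range(n):
        x, y = sample()
        w -= lr * (x @ w - y) * x  # one-sample stochastic gradient step
    return w

def nesterov_sgd(lr=0.005, momentum=0.8):
    """Generic Nesterov-momentum stochastic iteration (illustrative only)."""
    w = np.zeros(d)
    v = np.zeros(d)                        # momentum buffer
    for _ in range(n):
        x, y = sample()
        lookahead = w + momentum * v       # gradient taken at lookahead point
        grad = (x @ lookahead - y) * x
        v = momentum * v - lr * grad
        w = w + v
    return w

if __name__ == "__main__":
    for name, w_hat in [("sgd", sgd()), ("nesterov sgd", nesterov_sgd())]:
        # For isotropic Gaussian features, excess risk is 0.5 * ||w_hat - w*||^2.
        print(f"{name:13s} excess risk ~ {0.5 * np.sum((w_hat - w_star) ** 2):.4f}")
```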