Analysis of the Variance Reduction in SVRG and a New Acceleration Method

IEEE Access (2018)

Abstract
Stochastic gradient descent is a popular method in large-scale optimization for machine learning, but it suffers from slow convergence. In recent years, the stochastic variance reduced gradient (SVRG) method was proposed to remedy this problem. Although many variants of SVRG have been studied, the analysis of the variance has not been thoroughly discussed. In this paper, we propose a general framework, denoted epoch-update-identification (EUI), which is an abstraction of the existing variants of SVRG. Under this framework, i.e., EUI, we then provide a general analysis of the variance reduction technique from a new perspective. Additionally, the previous variants of SVRG have to keep a snapshot of the full gradient for each epoch, which is computationally expensive. In this paper, we also propose a new variant of SVRG, named sampleVR, which estimates the snapshot of the full gradient using a sampling strategy, thus significantly decreasing the gradient complexity. Both the theoretical analysis and extensive empirical studies show that sampleVR achieves a good tradeoff between convergence performance and gradient complexity, making the training loss converge faster than that of its counterparts.
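The following minimal Python sketch illustrates the variance-reduced update used by SVRG together with the sampled-snapshot idea described for sampleVR, assuming a simple least-squares objective; the function names and the snapshot sample size m are illustrative assumptions, not the paper's exact algorithm. Setting m = n recovers the standard full-gradient snapshot, while m < n trades snapshot accuracy for lower gradient complexity.

import numpy as np

def grad_i(w, X, y, i):
    # Gradient of the i-th squared-error term (x_i . w - y_i)^2 at w.
    return 2.0 * (X[i] @ w - y[i]) * X[i]

def svrg_sampled_snapshot(X, y, lr=0.01, epochs=20, inner=None, m=None, seed=0):
    # SVRG in which the per-epoch snapshot gradient is estimated on a
    # random subsample of size m; m = n gives the standard full snapshot.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    inner = inner or n
    m = m or n
    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        # Estimate the snapshot gradient on m sampled terms instead of all n,
        # which is the source of the reduced gradient complexity.
        idx = rng.choice(n, size=m, replace=False)
        g_snap = np.mean([grad_i(w_snap, X, y, i) for i in idx], axis=0)
        for _ in range(inner):
            i = int(rng.integers(n))
            # Variance-reduced stochastic gradient: subtracting the snapshot
            # term and adding g_snap keeps the estimate unbiased (when m = n)
            # while shrinking its variance as w approaches w_snap.
            v = grad_i(w, X, y, i) - grad_i(w_snap, X, y, i) + g_snap
            w -= lr * v
    return w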
Keywords
Variance reduction, stochastic gradient descent, SVRG, sample