Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences

Computational Optimization and Applications (2021)

Abstract
In this work we introduce the concept of an Underestimate Sequence (UES), motivated by Nesterov’s estimate sequence. Our definition of a UES utilizes three sequences, one of which is a lower bound (or underestimator) of the objective function. We address the question of how to construct an appropriate sequence of lower bounds, and we present lower bounds for strongly convex smooth functions and for strongly convex composite functions that adhere to the UES framework. Further, we propose several first-order methods for minimizing strongly convex functions in both the smooth and composite cases. The algorithms, based on efficiently updating lower bounds on the objective function, have natural stopping conditions that provide the user with a certificate of optimality. Convergence of all algorithms is guaranteed through the UES framework, and we show that all presented algorithms converge linearly, with the accelerated variants enjoying the optimal linear rate of convergence.
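The certificate mechanism is straightforward to illustrate. For a μ-strongly convex function f, the classical bound f(y) ≥ f(x) + ⟨∇f(x), y − x⟩ + (μ/2)‖y − x‖² yields, after minimizing over y, the valid lower bound f(x) − ‖∇f(x)‖²/(2μ) ≤ f*. The sketch below uses plain gradient descent with this lower bound, not the paper's accelerated variants or its UES construction; all function names and the test problem are illustrative assumptions.

```python
# Hedged sketch: gradient descent on an L-smooth, mu-strongly convex function
# that maintains the classical quadratic lower bound on f* and stops with a
# certified optimality gap. Illustrative only, not the paper's algorithm.
import numpy as np

def certified_gradient_descent(f, grad, x0, mu, L, eps=1e-8, max_iter=10_000):
    """At each iterate x, strong convexity gives, for all y,
        f(y) >= f(x) + <g, y - x> + (mu/2) * ||y - x||^2,
    whose minimum over y is f(x) - ||g||^2 / (2 * mu), a valid lower bound
    on f*. Stop once (upper bound) - (lower bound) <= eps."""
    x = np.asarray(x0, dtype=float)
    lower = -np.inf
    for k in range(max_iter):
        g = grad(x)
        fx = f(x)
        # Keep the best lower bound seen so far (the paper maintains a
        # whole sequence of such underestimators).
        lower = max(lower, fx - g.dot(g) / (2.0 * mu))
        gap = fx - lower  # certified gap: f(x) - f* <= gap
        if gap <= eps:
            return x, fx, lower, k
        x = x - g / L  # standard 1/L gradient step
    return x, f(x), lower, max_iter

# Usage on a toy quadratic f(x) = 0.5 x^T A x, with mu/L the extreme
# eigenvalues of A.
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x, fx, lb, iters = certified_gradient_descent(
    f, grad, np.array([3.0, -2.0]), mu=1.0, L=10.0)
print(f"iters={iters}, f(x)={fx:.2e}, certified gap={fx - lb:.2e}")
```

The returned gap is a computable certificate: the user knows f(x) − f* is at most that value without knowing f*, which is the practical payoff of the stopping conditions described in the abstract.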
Keywords
Underestimate sequence, Estimate sequence, Quadratic averaging, Lower bounds, Strongly convex, Smooth minimization, Composite minimization, Accelerated algorithms