Nesterov's Accelerated Gradient Descent: The Controlled Contraction Approach

IEEE Control Systems Letters (2024)

Abstract
Nesterov's Accelerated Gradient (NAG) algorithm is a popular algorithm that provides faster convergence to the optimal solution of an optimization problem. Despite its popularity, the origin of this algorithm remains a conceptual mystery, which motivates the proposed control-theoretic perspective. This letter derives the second-order ODE for Nesterov's Accelerated Gradient algorithm for strongly convex functions (NAG-SC) through the notions of manifold stabilization, achieved with the recently introduced P&I approach, and the persistence of an invariant manifold. Furthermore, the contraction of Nesterov's flows under the control actions (i.e., Controlled Contraction (CC)) is also proved. The contraction of Nesterov's flows not only ensures stable numerical integration but also reveals multiple potentials of the NAG-SC method that motivate its usefulness in machine learning and deep learning applications.
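For context, the discrete NAG-SC iteration that the letter's ODE analysis concerns can be sketched as follows. This is a minimal sketch of the standard NAG-SC update from the optimization literature, not the letter's P&I derivation; the step size `1/L` and the momentum coefficient built from the strong-convexity parameter `mu` are the conventional choices, and the quadratic test function is an illustrative assumption.

```python
import numpy as np

def nag_sc(grad, x0, mu, L, iters=200):
    """Standard NAG-SC iteration for a mu-strongly convex, L-smooth
    function (illustrative sketch, not the letter's derivation)."""
    s = 1.0 / L                           # step size
    q = np.sqrt(mu * s)
    beta = (1 - q) / (1 + q)              # momentum coefficient for NAG-SC
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for _ in range(iters):
        x_next = y - s * grad(y)          # gradient step at the look-ahead point
        y = x_next + beta * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

# Example: f(x) = 0.5 * x^T A x with eigenvalues in [mu, L] = [1, 10]
A = np.diag([1.0, 10.0])
x_star = nag_sc(lambda x: A @ x, np.array([5.0, -3.0]), mu=1.0, L=10.0)
```

For this quadratic, the minimizer is the origin, and the accelerated rate drives the iterate there well within 200 steps.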
Keywords
Contraction theory,Nesterov's Accelerated Gradient Descent,optimization,P&I approach