Nesterov's Accelerated Gradient Descent: The Controlled Contraction Approach.

IEEE Control Systems Letters (2024)

Abstract
Nesterov’s Accelerated Gradient (NAG) algorithm is a popular method that converges to the optimal solution of an optimization problem faster than plain gradient descent. Despite its popularity, the origin of the algorithm remains a conceptual mystery, which motivates the control-theoretic perspective proposed here. The paper derives the second-order ODE for Nesterov’s Accelerated Gradient algorithm for strongly convex functions (NAG-SC) through the notion of manifold stabilization, achieved with the recently introduced P&I approach, together with the persistence of an invariant manifold. Furthermore, the contraction of Nesterov’s flows under control actions, i.e., Controlled Contraction (CC), is proved. This contraction not only ensures stable numerical integration but also reveals several strengths of the NAG-SC method that motivate its use in machine learning and deep learning applications.
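For context, a commonly cited form of the NAG-SC iteration and its continuous-time limit from the literature is sketched below; this is a reference sketch, not necessarily the exact ODE the paper derives via the P&I approach. Here s denotes the step size and \mu the strong-convexity modulus of the objective f; both symbols are assumed for illustration and do not appear in the abstract.

\[
x_{k+1} = y_k - s\,\nabla f(y_k), \qquad
y_{k+1} = x_{k+1} + \frac{1-\sqrt{\mu s}}{1+\sqrt{\mu s}}\,\bigl(x_{k+1}-x_k\bigr),
\]
whose low-resolution limit as \(s \to 0\) is the second-order ODE
\[
\ddot{X}(t) + 2\sqrt{\mu}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0.
\]

The constant damping coefficient \(2\sqrt{\mu}\) is what distinguishes the strongly convex case from the vanishing \(3/t\) damping in the convex-case ODE of Su, Boyd, and Candès.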
Keywords
Contraction theory, Nesterov’s Accelerated Gradient Descent, Optimization, P&I Approach