
Provable non-accelerations of the heavy-ball method

arXiv (Cornell University), 2023

Abstract
In this work, we show that the heavy-ball (HB) method provably does not reach an accelerated convergence rate on smooth strongly convex problems. More specifically, we show that for any condition number and any choice of algorithmic parameters, either the worst-case convergence rate of HB on the class of $L$-smooth and $\mu$-strongly convex \textit{quadratic} functions is not accelerated (that is, slower than $1 - \mathcal{O}(\kappa)$), or there exists an $L$-smooth $\mu$-strongly convex function and an initialization such that the method does not converge. To the best of our knowledge, this result closes a simple yet open question on one of the most used and iconic first-order optimization techniques. Our approach builds on finding functions for which HB fails to converge and instead cycles over finitely many iterates. We analytically describe all parametrizations of HB that exhibit this cycling behavior on a particular cycle shape, whose choice is supported by a systematic and constructive approach to the study of cycling behaviors of first-order methods. We show the robustness of our results to perturbations of the cycle, and extend them to classes of functions that also satisfy higher-order regularity conditions.
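For context, the HB update on a differentiable function $f$ reads $x_{k+1} = x_k - \alpha \nabla f(x_k) + \beta (x_k - x_{k-1})$, with step size $\alpha$ and momentum $\beta$. The sketch below runs this iteration on a simple two-dimensional quadratic using Polyak's classical tuning; the quadratic and the parameter choices are illustrative assumptions, not the paper's cycling construction.

```python
# Minimal sketch of the heavy-ball (HB) iteration on an L-smooth,
# mu-strongly convex quadratic f(x) = 0.5 * x.T @ A @ x.
# Parameter choices are illustrative, not the paper's construction.
import numpy as np

L, mu = 100.0, 1.0                 # smoothness / strong convexity constants
A = np.diag([mu, L])               # f(x) = 0.5 * x.T @ A @ x, minimized at 0
grad = lambda x: A @ x

# Polyak's classical tuning for quadratics (assumed here for illustration)
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

x_prev = x = np.array([1.0, 1.0])
for _ in range(500):
    # HB update: gradient step plus momentum on the last displacement
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x

print(np.linalg.norm(x))           # distance to the minimizer
```

On quadratics this tuning is known to converge at an accelerated rate; the paper's point is that no fixed tuning of HB achieves acceleration over the full class of $L$-smooth $\mu$-strongly convex functions.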
Key words
non-accelerations, heavy-ball