General Hölder Smooth Convergence Rates Follow from Specialized Rates Assuming Growth Bounds

Journal of Optimization Theory and Applications (2023)

Abstract
In the analysis of first-order methods for both smooth and nonsmooth optimization, assuming the existence of a growth/error bound or a Kurdyka–Łojasiewicz (KL) condition often enables much stronger convergence guarantees. Hence, separate analyses are typically needed for the general case and for growth-bounded cases. We give meta-theorems for deriving general convergence rates from rates that assume a growth lower bound. Applying this simple but conceptually powerful tool to the proximal point, subgradient, bundle, dual averaging, gradient descent, Frank–Wolfe, and universal accelerated methods immediately recovers their known convergence rates for general convex optimization problems from their specialized rates. New convergence results follow for bundle methods, dual averaging, and Frank–Wolfe. Our results can lift any rate based on Hölder continuous gradients and Hölder growth bounds. Moreover, our theory provides simple proofs of optimal convergence lower bounds under Hölder growth from textbook examples without growth bounds.
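The restarting schemes listed in the keywords are one standard mechanism by which growth bounds yield faster rates, and they give a concrete feel for what the specialized analyses assume. The following minimal Python sketch is not the paper's meta-theorem; it is an illustration, with hypothetical names such as restarted_method, of a geometric-restart wrapper around a basic subgradient method, the kind of scheme whose growth-bounded guarantee the paper's results would lift to the general case.

```python
# Illustrative sketch only (not the paper's construction): restarting a basic
# subgradient method with geometrically shrinking step sizes, the standard way
# a growth bound around the minimizer is exploited in specialized analyses.

import numpy as np

def run_base_method(f, subgrad, x0, n_iters, step):
    """Run fixed-step subgradient descent for a budget; return the best iterate."""
    x, best_x, best_f = x0, x0, f(x0)
    for _ in range(n_iters):
        x = x - step * subgrad(x)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x

def restarted_method(f, subgrad, x0, n_rounds, iters_per_round, step0):
    """Restart the base method with step sizes halved each round.

    Restart analyses for growth-bounded problems argue that each round shrinks
    the distance to the minimizer by a constant factor, giving a faster rate
    than the general worst-case guarantee.
    """
    x = x0
    for k in range(n_rounds):
        step = step0 / 2.0 ** k  # shrink the step size geometrically
        x = run_base_method(f, subgrad, x, iters_per_round, step)
    return x

# Toy usage: f(x) = |x| has sharp (Hölder, exponent 1) growth around x* = 0.
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)
x_final = restarted_method(f, subgrad, x0=np.array([5.0]),
                           n_rounds=10, iters_per_round=50, step0=1.0)
print(f"final objective: {f(x_final):.2e}")
```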
Keywords
First-order methods, Convex optimization, Convergence rates, Growth/error bounds, Restarting schemes