Accelerated Parameter-Free Stochastic Optimization
arXiv (2024)
Abstract
We propose a method that achieves near-optimal rates for smooth stochastic
convex optimization and requires essentially no prior knowledge of problem
parameters. This improves on prior work which requires knowing at least the
initial distance to optimality d0. Our method, U-DoG, combines UniXGrad (Kavis
et al., 2019) and DoG (Ivgi et al., 2023) with novel iterate stabilization
techniques. It requires only loose bounds on d0 and the noise magnitude,
provides high probability guarantees under sub-Gaussian noise, and is also
near-optimal in the non-smooth case. Our experiments show consistent, strong
performance on convex problems and mixed results on neural network training.
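For intuition about one of the building blocks, here is a minimal sketch of the DoG (Distance over Gradients) step-size rule of Ivgi et al. (2023), which U-DoG builds on. This is not the paper's U-DoG method itself (which adds UniXGrad-style extrapolation and the novel iterate stabilization); the function name `dog`, the stochastic gradient oracle `grad_fn`, and the `r_eps` default are illustrative assumptions.

```python
import numpy as np

def dog(grad_fn, x0, steps=1000, r_eps=1e-4):
    """Sketch of the DoG update (Ivgi et al., 2023):
    eta_t = rbar_t / sqrt(sum_{i<=t} ||g_i||^2), where rbar_t is the
    largest distance of any iterate from x0 observed so far,
    initialized to a small movement parameter r_eps (a loose lower
    bound standing in for the unknown initial distance d0)."""
    x = x0.copy()
    rbar = r_eps              # running max distance from x0
    grad_sq_sum = 0.0         # running sum of squared gradient norms
    for _ in range(steps):
        g = grad_fn(x)        # stochastic gradient at the current iterate
        grad_sq_sum += np.dot(g, g)
        rbar = max(rbar, np.linalg.norm(x - x0))
        # Parameter-free step size; tiny constant guards against
        # division by zero if all gradients so far are zero.
        eta = rbar / np.sqrt(grad_sq_sum + 1e-12)
        x = x - eta * g
    return x
```

The key point the abstract alludes to is that this step size adapts to the distance to optimality on the fly, so only a loose bound on d0 (via r_eps) is needed rather than its exact value.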