Fast Optimistic Gradient Descent Ascent (OGDA) Method in Continuous and Discrete Time

Foundations of Computational Mathematics (2023)

Abstract
In the framework of real Hilbert spaces, we study continuous-time dynamics as well as numerical algorithms for the problem of approaching the set of zeros of a single-valued monotone and continuous operator V. The starting point of our investigations is a second-order dynamical system that combines a vanishing damping term with the time derivative of V along the trajectory, which can be seen as an analogue of the Hessian-driven damping when the operator originates from a potential. Our method exhibits fast convergence rates of order o(1/(tβ(t))) for ‖V(z(t))‖, where z(·) denotes the generated trajectory and β(·) is a positive nondecreasing function satisfying a growth condition, and also for the restricted gap function, which is a measure of optimality for variational inequalities. We also prove the weak convergence of the trajectory to a zero of V. Temporal discretizations of the dynamical system generate implicit and explicit numerical algorithms, both of which can be seen as accelerated versions of the Optimistic Gradient Descent Ascent (OGDA) method for monotone operators, and for which we prove that the generated sequence of iterates (z^k)_{k≥0} shares the asymptotic features of the continuous dynamics. In particular, for the implicit numerical algorithm we show convergence rates of order o(1/(kβ_k)) for ‖V(z^k)‖ and the restricted gap function, where (β_k)_{k≥0} is a positive nondecreasing sequence satisfying a growth condition. For the explicit numerical algorithm, assuming in addition that the operator V is Lipschitz continuous, we show convergence rates of order o(1/k) for ‖V(z^k)‖ and the restricted gap function. All convergence rate statements are last-iterate convergence results; in addition to these, we prove for both algorithms the convergence of the iterates to a zero of V. To our knowledge, our study exhibits the best-known convergence rate results for monotone equations. Numerical experiments indicate the overwhelming superiority of our explicit numerical algorithm over other methods designed to solve monotone equations governed by monotone and Lipschitz continuous operators.
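For orientation, the sketch below shows the classical (non-accelerated) OGDA update z^{k+1} = z^k - 2s·V(z^k) + s·V(z^{k-1}) for a monotone, Lipschitz continuous operator V, which is the baseline that the paper's algorithms accelerate. The operator, the bilinear test problem, and the step size are illustrative assumptions; the paper's accelerated explicit scheme (with its Nesterov-type vanishing damping and the o(1/k) rate) is not reproduced here.

```python
import numpy as np

def ogda(V, z0, step=0.05, iters=2000):
    """Classical OGDA iteration for finding a zero of a monotone,
    Lipschitz continuous operator V (baseline, not the accelerated method).

    Update: z^{k+1} = z^k - 2*step*V(z^k) + step*V(z^{k-1}).
    The step size should be small relative to the Lipschitz constant of V.
    """
    z_prev = np.asarray(z0, dtype=float)
    z = z_prev.copy()
    v_prev = V(z_prev)
    residuals = []
    for _ in range(iters):
        v = V(z)
        z_next = z - 2.0 * step * v + step * v_prev
        z_prev, z, v_prev = z, z_next, v
        residuals.append(np.linalg.norm(v))  # track ||V(z^k)||
    return z, residuals

if __name__ == "__main__":
    # Hypothetical example: bilinear saddle-point problem min_x max_y x^T A y,
    # whose operator V(x, y) = (A y, -A^T x) is monotone (skew-symmetric) and
    # Lipschitz; plain gradient descent-ascent diverges here, OGDA converges.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))

    def V(z):
        x, y = z[:5], z[5:]
        return np.concatenate([A @ y, -A.T @ x])

    z_final, res = ogda(V, rng.standard_normal(10), step=0.05, iters=2000)
    print("final ||V(z^k)|| =", res[-1])
```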
Keywords
Monotone equation, Variational inequality, Optimistic Gradient Descent Ascent (OGDA) method, Extragradient method, Nesterov's accelerated gradient method, Lyapunov analysis, Convergence rates, Convergence of trajectories, Convergence of iterates