An Inexact Halpern Iteration with Application to Distributionally Robust Optimization
CoRR (2024)

Abstract
The Halpern iteration for solving monotone inclusion problems has gained
increasing interest in recent years due to its simple form and appealing
convergence properties. In this paper, we investigate inexact variants of
the scheme in both deterministic and stochastic settings. We conduct an
extensive convergence analysis and show that, by choosing the inexactness
tolerances appropriately, the inexact schemes achieve an O(k^-1) convergence
rate in terms of the (expected) residual norm. Our results relax the
state-of-the-art inexactness conditions employed in the literature while
retaining the same competitive convergence properties. We then demonstrate
how the proposed methods can be applied to solve two classes of data-driven
Wasserstein distributionally robust optimization problems that admit
convex-concave min-max reformulations. We highlight their capability of
performing inexact computations for distributionally robust learning with
stochastic first-order methods.
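For readers unfamiliar with the scheme, the classical (exact) Halpern iteration anchors each step to the starting point x0 with a vanishing weight beta_k = 1/(k+2), driving the fixed-point residual ||x_k - T(x_k)|| to zero at an O(1/k) rate for a nonexpansive operator T. The sketch below illustrates this on a toy operator (averaging with a projection onto the unit ball, chosen here purely for illustration); it is not the paper's inexact scheme, which additionally tolerates errors in evaluating T.

```python
import numpy as np

def halpern(T, x0, iters=1000):
    """Exact Halpern iteration: x_{k+1} = beta_k * x0 + (1 - beta_k) * T(x_k),
    with the classical anchoring weights beta_k = 1/(k+2)."""
    x = x0.copy()
    for k in range(iters):
        beta = 1.0 / (k + 2)
        x = beta * x0 + (1.0 - beta) * T(x)
    return x

# Toy nonexpansive operator (an assumption for illustration): averaging with
# the projection onto the unit ball; its fixed-point set is the unit ball.
T = lambda x: 0.5 * (x + x / max(1.0, np.linalg.norm(x)))

x0 = np.array([3.0, -4.0])
x = halpern(T, x0)
res = np.linalg.norm(x - T(x))  # residual norm, decays like O(1/k)
```

For this operator the iterates approach the projection of x0 onto the unit ball, and the residual obeys the known bound ||x_k - T(x_k)|| <= 2 ||x0 - x*|| / (k + 1). The inexact variants studied in the paper replace T(x_k) with an approximate evaluation whose error is controlled by a summable tolerance sequence.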