An inexact accelerated stochastic ADMM for separable convex optimization
Computational Optimization and Applications (2022)

Abstract
An inexact accelerated stochastic Alternating Direction Method of Multipliers (AS-ADMM) scheme is developed for solving structured separable convex optimization problems with linear constraints. The objective function is the sum of a possibly nonsmooth convex function and a smooth function which is an average of many component convex functions. Problems having this structure often arise in machine learning and data mining applications. AS-ADMM combines the ideas of both ADMM and stochastic gradient methods using variance reduction techniques. One of the ADMM subproblems employs a linearization technique, and a similar linearization can be introduced for the other subproblem. For a specified choice of the algorithm parameters, it is shown that the objective error and the constraint violation are O(1/k) relative to the number of outer iterations k. Under a strong convexity assumption, the expected iterate error converges to zero linearly. A linearized variant of AS-ADMM and incremental sampling strategies are also discussed. Numerical experiments with both stochastic and deterministic ADMM algorithms show that AS-ADMM can be particularly effective for structured optimization arising in big data applications.
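The combination the abstract describes — an ADMM splitting in which the smooth finite-sum subproblem is handled by a linearized, variance-reduced stochastic gradient step — can be illustrated with a minimal sketch. This is not the authors' AS-ADMM scheme or its parameter choices; it applies an SVRG-style variance-reduced gradient inside a linearized ADMM iteration to a hypothetical lasso-type problem min (1/2n)||Dx − c||² + λ||y||₁ subject to x − y = 0, with all problem data and step sizes chosen only for illustration.

```python
import numpy as np

# Hypothetical problem data: D, c, and all parameters are illustrative.
rng = np.random.default_rng(0)
n, d = 200, 20
D = rng.standard_normal((n, d))
c = D @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)
lam, rho, eta = 0.1, 1.0, 0.02  # l1 weight, ADMM penalty, gradient step size

def grad_i(x, i):
    # Gradient of the i-th smooth component f_i(x) = (1/2)(D_i x - c_i)^2.
    return D[i] * (D[i] @ x - c[i])

def full_grad(x):
    # Full gradient of the average f(x) = (1/2n)||Dx - c||^2.
    return D.T @ (D @ x - c) / n

x, y, u = np.zeros(d), np.zeros(d), np.zeros(d)  # u is the scaled dual variable
for epoch in range(30):
    x_ref, g_ref = x.copy(), full_grad(x)        # SVRG snapshot for variance reduction
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced stochastic gradient of the smooth term.
        g = grad_i(x, i) - grad_i(x_ref, i) + g_ref
        # Linearized x-subproblem: one gradient step on f plus the
        # augmented-Lagrangian term rho*(x - y + u).
        x = x - eta * (g + rho * (x - y + u))
    # Exact y-subproblem: prox of lam*||.||_1, i.e. soft-thresholding.
    v = x + u
    y = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
    u = u + x - y                                # dual ascent on the constraint x = y
```

With these settings the iterates should approximately satisfy the constraint x = y while driving the lasso objective below its value at the origin; the outer loop plays the role of the "outer iterations" whose count k appears in the O(1/k) rate above.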
Keywords
Convex optimization, Separable structure, Accelerated stochastic ADMM, Inexact stochastic ADMM, AS-ADMM, Accelerated gradient method, Complexity, Big data