Local Linear Convergence of the ADMM/Douglas-Rachford Algorithms without Strong Convexity and Application to Statistical Imaging.
SIAM Journal on Imaging Sciences (2016)
Abstract
We consider the problem of minimizing the sum of a convex function and a convex function composed with an injective linear mapping. For such problems, subject to a coercivity condition at fixed points of the corresponding Picard iteration, iterates of the alternating directions method of multipliers converge locally linearly to points from which the solution to the original problem can be computed. Our proof strategy uses duality and strong metric subregularity of the Douglas-Rachford fixed point mapping. Our analysis does not require strong convexity and yields error bounds to the set of model solutions. We show in particular that convex piecewise linear-quadratic functions naturally satisfy the requirements of the theory, guaranteeing eventual linear convergence of both the Douglas-Rachford algorithm and the alternating directions method of multipliers for this class of objectives under mild assumptions on the set of fixed points. We demonstrate this result on quantitative image deconvolution and denoising with multiresolution statistical constraints.
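The Douglas-Rachford iteration analyzed in the abstract can be sketched on a toy instance of the convex piecewise linear-quadratic class the paper covers: a quadratic data fit plus an l1 penalty. The problem, step size, and parameter values below are illustrative assumptions, not taken from the paper; the iteration itself is the standard Picard iteration on the Douglas-Rachford fixed-point map.

```python
import numpy as np

# Minimal Douglas-Rachford sketch for min_x f(x) + g(x) with
#   f(x) = 0.5 * ||x - b||^2   (quadratic data fit)
#   g(x) = lam * ||x||_1       (l1 penalty)
# Both terms are convex piecewise linear-quadratic, the class for which the
# paper guarantees eventual linear convergence. Parameters are hypothetical.

def prox_f(v, b, t):
    # prox of t*f for f(x) = 0.5||x - b||^2
    return (v + t * b) / (1.0 + t)

def prox_g(v, lam, t):
    # soft-thresholding: prox of t*lam*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

def douglas_rachford(b, lam=0.5, t=1.0, iters=200):
    z = np.zeros_like(b)
    for _ in range(iters):
        x = prox_f(z, b, t)
        y = prox_g(2.0 * x - z, lam, t)
        z = z + y - x           # Picard iteration on the DR fixed-point map
    return prox_f(z, b, t)      # the "shadow" sequence tends to a minimizer

b = np.array([2.0, -0.3, 0.8])
x_star = douglas_rachford(b)
# For this separable problem the minimizer is soft-thresholding of b at lam:
# sign(b) * max(|b| - lam, 0) = [1.5, 0.0, 0.3]
```

Because f here is strongly convex, the fixed-point map is a contraction and the iterates converge linearly; the point of the paper is that the same linear rate eventually holds for piecewise linear-quadratic objectives even without strong convexity.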
Keywords
augmented Lagrangian, ADMM, Douglas-Rachford, exact penalization, fixed point theory, image processing, inverse problems, metric regularity, statistical multiscale analysis, piecewise linear-quadratic, linear convergence