An Almost Sure Convergence Analysis of Zeroth-Order Mirror Descent Algorithm

2023 American Control Conference (ACC)

Abstract
In this paper, we show almost sure convergence of the zeroth-order mirror descent algorithm. The algorithm admits non-smooth convex functions and assumes that only an estimate of the gradient is available, obtained using Nesterov's Gaussian Approximation (NGA) technique. We establish that, under suitable conditions on the step size, the function values of the iterates converge almost surely to a neighborhood of the optimal function value. We extend the analysis to a distributed implementation of the zeroth-order mirror descent algorithm.
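To make the setting concrete, the following is a minimal sketch of a zeroth-order mirror descent step that uses a two-point NGA gradient estimate. The entropic mirror map over the probability simplex, the step-size schedule, the smoothing parameter, and the test objective are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def nga_gradient(f, x, mu, rng):
    """Two-point NGA estimate: g = (f(x + mu*u) - f(x)) / mu * u, with u ~ N(0, I)."""
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

def zo_mirror_descent(f, x0, steps=5000, mu=1e-3, seed=0):
    """Zeroth-order mirror descent on the simplex with the entropic mirror map (assumed geometry)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        eta = 1.0 / np.sqrt(k)              # diminishing step size (assumed schedule)
        g = nga_gradient(f, x, mu, rng)
        x = x * np.exp(-eta * g)            # exponentiated-gradient (entropic mirror) update
        x = x / x.sum()                     # project back onto the simplex
    return x

if __name__ == "__main__":
    # Example: a non-smooth convex objective over the probability simplex.
    c = np.array([3.0, 1.0, 2.0])
    f = lambda x: np.max(c * x) + np.abs(x - 1.0 / len(x)).sum()
    x_hat = zo_mirror_descent(f, np.ones(3) / 3)
    print("approximate minimizer:", x_hat, "f:", f(x_hat))
```

With a diminishing step size, the iterates' function values settle near the optimum up to an error controlled by the smoothing parameter mu, which matches the "neighborhood of the optimal function value" statement in the abstract.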
Keywords
derivative-free optimization, distributed optimization, almost sure convergence