Convergence Rates of Zeroth Order Gradient Descent for Łojasiewicz Functions

INFORMS JOURNAL ON COMPUTING (2024)

Abstract
We prove convergence rates of Zeroth-order Gradient Descent (ZGD) algorithms for Łojasiewicz functions. Our results show that for smooth Łojasiewicz functions with Łojasiewicz exponent larger than 0.5 and smaller than 1, the function values can converge much faster than the (zeroth-order) gradient descent trajectory. Similar results hold for convex nonsmooth Łojasiewicz functions.
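The abstract refers to zeroth-order gradient descent, i.e., gradient descent driven by a gradient estimate built purely from function evaluations. A minimal sketch of one standard such scheme, a two-point Gaussian-smoothing estimator, is below; the paper's exact estimator, step sizes, and parameter choices may differ, and the names `zo_gradient_estimate` and `zgd` are illustrative.

```python
import numpy as np

def zo_gradient_estimate(f, x, mu=1e-4, n_samples=20, rng=None):
    """Two-point Gaussian-smoothing gradient estimator: averages
    (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u over random directions u.
    This is one common zeroth-order scheme, not necessarily the paper's."""
    rng = np.random.default_rng(rng)
    d = x.size
    g = np.zeros(d)
    for _ in range(n_samples):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n_samples

def zgd(f, x0, step=0.1, iters=200, **estimator_kwargs):
    """Zeroth-order gradient descent: plain gradient descent where the
    true gradient is replaced by the finite-difference estimate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * zo_gradient_estimate(f, x, **estimator_kwargs)
    return x

# Usage on a simple smooth test function f(x) = ||x||^2
# (Lojasiewicz exponent 0.5), starting from (1, 1):
f = lambda x: float(np.dot(x, x))
x_final = zgd(f, [1.0, 1.0], step=0.1, iters=200, rng=0)
```

Only `f(x)` evaluations are used, never analytic derivatives, which is the defining feature of the zeroth-order setting studied in the paper.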
Keywords
optimization, zeroth-order optimization, Łojasiewicz functions