Analysis of the BFGS Method with Errors

SIAM Journal on Optimization (2020)

Abstract
The classical convergence analysis of quasi-Newton methods assumes that function and gradient evaluations are exact. In this paper, we consider the case when there are (bounded) errors in both computations and establish conditions under which a slight modification of the BFGS algorithm with an Armijo-Wolfe line search converges to a neighborhood of the solution whose size is determined by the size of the errors. One of our results is an extension of the analysis presented in [R. H. Byrd and J. Nocedal, SIAM J. Numer. Anal., 26 (1989), pp. 727-739], which establishes that, for strongly convex functions, a fraction of the BFGS iterates are good iterates. We present numerical results illustrating the performance of the new BFGS method in the presence of noise.
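The paper's modified algorithm is not reproduced in the abstract, but the setting it studies can be sketched: a BFGS loop where every function and gradient evaluation carries a bounded error, paired with a bisection-style Armijo-Wolfe line search, and with the inverse-Hessian update skipped when the curvature pair is unreliable. Everything below is an illustrative assumption, not the authors' method: the noise bound `EPS`, the line-search constants, the skip threshold, and the strongly convex test quadratic are all made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
EPS = 1e-3  # assumed bound on evaluation errors (illustrative, not from the paper)

# Strongly convex test problem: f(x) = 0.5 x^T A x, minimizer at x = 0
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])

def f_noisy(x):
    """Function value corrupted by a bounded error."""
    return 0.5 * x @ A @ x + EPS * rng.uniform(-1.0, 1.0)

def g_noisy(x):
    """Gradient corrupted by a bounded, componentwise error."""
    return A @ x + EPS * rng.uniform(-1.0, 1.0, size=x.shape)

def armijo_wolfe(x, p, g, c1=1e-4, c2=0.9, max_iter=30):
    """Bisection-style Armijo-Wolfe line search on the noisy oracle."""
    lo, hi, t = 0.0, np.inf, 1.0
    fx, gTp = f_noisy(x), g @ p
    for _ in range(max_iter):
        if f_noisy(x + t * p) > fx + c1 * t * gTp:    # Armijo condition fails
            hi = t
        elif g_noisy(x + t * p) @ p < c2 * gTp:       # curvature condition fails
            lo = t
        else:
            return t
        t = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return t

def bfgs_noisy(x0, iters=60):
    """BFGS with noisy evaluations; the update is skipped if s^T y is tiny."""
    x, H = x0.astype(float), np.eye(len(x0))
    for _ in range(iters):
        g = g_noisy(x)
        p = -H @ g                    # quasi-Newton search direction
        t = armijo_wolfe(x, p, g)
        s = t * p
        x_new = x + s
        y = g_noisy(x_new) - g
        sy = s @ y
        if sy > 1e-10:                # skip update when curvature is unreliable
            rho = 1.0 / sy
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x = x_new
    return x

x_final = bfgs_noisy(np.array([5.0, -3.0]))
print(np.linalg.norm(x_final))  # small: iterates settle near the minimizer
```

Consistent with the abstract's claim, the iterates do not converge to the exact minimizer; they enter and remain in a neighborhood of it whose radius scales with the error bound `EPS`.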
Keywords
nonlinear optimization, quasi-Newton method, stochastic optimization