AdaSub: Stochastic Optimization Using Second-Order Information in Low-Dimensional Subspaces

João Victor Galvão da Mata, Martin S. Andersen

CoRR (2023)

Abstract
We introduce AdaSub, a stochastic optimization algorithm that computes a search direction based on second-order information in a low-dimensional subspace defined adaptively from current and past information. Compared to first-order methods, second-order methods exhibit better convergence characteristics, but the need to compute the Hessian matrix at each iteration incurs excessive computational cost, making them impractical. To address this issue, our approach allows the computational cost and efficiency of the algorithm to be controlled through the choice of the subspace dimension. Our code is freely available on GitHub, and our preliminary numerical results demonstrate that AdaSub surpasses popular stochastic optimizers in terms of the time and number of iterations required to reach a given accuracy.
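To make the core idea concrete, the following is a minimal sketch (not the authors' AdaSub implementation) of a single second-order step restricted to a low-dimensional subspace: the Hessian is never formed explicitly; instead, a small reduced Hessian is built from a handful of Hessian-vector products, so the per-step cost scales with the subspace dimension rather than the full problem size. The function names and the toy quadratic below are illustrative assumptions.

```python
import numpy as np

def subspace_newton_step(grad_f, hess_vec, x, V, damping=1e-4):
    """One Newton-type step confined to the column span of V (n-by-k).

    Illustrative sketch only: the reduced Hessian V^T H V is assembled
    from k Hessian-vector products, so the dominant cost grows with the
    subspace dimension k, not with the ambient dimension n.
    """
    g = grad_f(x)                                   # full gradient, shape (n,)
    HV = np.column_stack([hess_vec(x, V[:, i]) for i in range(V.shape[1])])
    H_sub = V.T @ HV                                # k-by-k reduced Hessian
    H_sub += damping * np.eye(V.shape[1])           # regularize for definiteness
    y = np.linalg.solve(H_sub, -V.T @ g)            # Newton step in reduced coords
    return V @ y                                    # lift back to R^n

# Toy quadratic f(x) = 0.5 x^T A x with a random 2-dimensional subspace.
rng = np.random.default_rng(0)
A = np.diag(np.arange(1.0, 11.0))                   # SPD Hessian, n = 10
grad = lambda x: A @ x
hvp = lambda x, v: A @ v
x = rng.standard_normal(10)
V, _ = np.linalg.qr(rng.standard_normal((10, 2)))   # orthonormal basis
x_new = x + subspace_newton_step(grad, hvp, x, V)
```

For a quadratic objective, this step (with small damping) approximately minimizes f over the affine set x + span(V), so the objective decreases whenever the projected gradient V^T g is nonzero; AdaSub's distinguishing feature, per the abstract, is choosing V adaptively from current and past information.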
Keywords
Stochastic optimization,subspace optimization,low-dimensional optimization,stochastic quasi-Newton methods