An optimally fast objective-function-free minimization algorithm using random subspaces

arXiv (2023)

Abstract
An algorithm for unconstrained non-convex optimization is described, which does not evaluate the objective function and in which minimization is carried out, at each iteration, within a randomly selected subspace. It is shown that this random approximation technique does not affect the method's convergence nor its evaluation complexity for the search of an $\epsilon$-approximate first-order critical point, which is $\mathcal{O}(\epsilon^{-(p+1)/p})$, where $p$ is the order of derivatives used. A variant of the algorithm using approximate Hessian matrices is also analyzed and shown to require at most $\mathcal{O}(\epsilon^{-2})$ evaluations. Preliminary numerical tests show that the random-subspace technique can significantly improve performance on some problems, albeit, unsurprisingly, not for all.
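To make the core idea concrete, the following is a minimal illustrative sketch of a gradient step restricted to a randomly drawn subspace. It is not the paper's algorithm (which uses adaptive regularization and avoids function evaluations in its step-acceptance rule); the function names, the fixed step size, and the quadratic test problem are all assumptions chosen for illustration. It only shows how each iteration can confine the update to a random low-dimensional subspace while calling the gradient, never the objective value.

```python
import numpy as np

def random_subspace_descent(grad, x0, dim_sub=2, step=0.1, iters=200, seed=0):
    """Illustrative sketch: at each iteration, take a gradient step
    restricted to a randomly drawn subspace. Only derivative (gradient)
    information is used; the objective value is never evaluated."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(iters):
        # Draw an n x dim_sub Gaussian matrix and orthonormalize its
        # columns to obtain a basis of a random subspace.
        P, _ = np.linalg.qr(rng.standard_normal((n, dim_sub)))
        g = grad(x)
        # Project the gradient onto the subspace and step within it.
        x -= step * P @ (P.T @ g)
    return x

# Example (hypothetical test problem): minimize f(x) = ||x||^2, grad = 2x.
x_star = random_subspace_descent(lambda x: 2.0 * x, np.ones(10))
```

Because the subspace is redrawn each iteration, every direction is eventually explored in expectation, which is the intuition behind the paper's result that the random restriction leaves the worst-case evaluation complexity unchanged.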