Faster Riemannian Newton-type optimization by subsampling and cubic regularization

arXiv (2023)

Abstract
This work addresses constrained large-scale non-convex optimization where the constraint set implies a manifold structure. Solving such problems is important in a multitude of fundamental machine learning tasks. Recent advances in Riemannian optimization have enabled the convenient recovery of solutions by adapting unconstrained optimization algorithms to manifolds. However, it remains challenging to scale up while maintaining stable convergence rates and handling saddle points. We propose a new second-order Riemannian optimization algorithm that aims to improve the convergence rate and reduce the computational cost. It enhances the Riemannian trust-region algorithm, which exploits curvature information to escape saddle points, through a mixture of subsampling and cubic regularization techniques. We conduct a rigorous analysis of the convergence behavior of the proposed algorithm. We also perform extensive experiments evaluating it on two general machine learning tasks over multiple datasets. Compared to a large set of state-of-the-art Riemannian optimization algorithms, the proposed algorithm exhibits improved computational speed, e.g., speedups ranging from 12% to 227%, and improved convergence behavior, e.g., an iteration-complexity reduction from $\mathcal{O}(\max(\epsilon_g^{-2}\epsilon_H^{-1}, \epsilon_H^{-3}))$ to $\mathcal{O}(\max(\epsilon_g^{-2}, \epsilon_H^{-3}))$.
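The abstract summarizes, rather than specifies, the method, but the loop it describes (subsample gradients and Hessian-vector products from a finite-sum objective, approximately minimize a cubic-regularized second-order model in the tangent space, then retract to the manifold) can be illustrated. Below is a minimal sketch on the unit sphere, not the authors' implementation: the projection retraction, the inner gradient-descent subproblem solver, the batch sizes b_g and b_h, and the regularization weight sigma are all illustrative assumptions, and the toy leading-eigenvector objective is chosen only because its per-sample derivatives are simple.

```python
# Minimal sketch (illustrative only) of one iteration of subsampled
# cubic-regularized Riemannian Newton on the unit sphere
# S^{d-1} = {x : ||x|| = 1}, for f(x) = (1/n) * sum_i f_i(x).
import numpy as np

rng = np.random.default_rng(0)

def tangent_project(x, v):
    """Orthogonal projection of a Euclidean vector v onto T_x S^{d-1}."""
    return v - np.dot(x, v) * x

def retract(x, eta):
    """Projection retraction on the sphere: normalize x + eta."""
    y = x + eta
    return y / np.linalg.norm(y)

def subsampled_cubic_step(x, grad_i, hess_vec_i, n, b_g, b_h, sigma,
                          inner_iters=50, lr=0.05):
    """One outer iteration: subsample gradient and Hessian-vector products,
    approximately minimize the cubic-regularized model on the tangent
    space, then retract back to the manifold."""
    # Subsampled Riemannian gradient: average a random batch, then project.
    idx_g = rng.choice(n, size=b_g, replace=False)
    g = tangent_project(x, np.mean([grad_i(x, i) for i in idx_g], axis=0))

    # Subsampled Riemannian Hessian-vector product on the sphere: projected
    # Euclidean Hessian-vector product minus the curvature correction
    # <x, Euclidean gradient> * eta.
    idx_h = rng.choice(n, size=b_h, replace=False)
    euc_g = np.mean([grad_i(x, i) for i in idx_h], axis=0)

    def Hv(eta):
        euc_hv = np.mean([hess_vec_i(x, eta, i) for i in idx_h], axis=0)
        return tangent_project(x, euc_hv) - np.dot(x, euc_g) * eta

    # Approximately minimize the cubic model
    #   m(eta) = <g, eta> + 0.5 * <Hv(eta), eta> + (sigma / 3) * ||eta||^3
    # by gradient descent, keeping eta in the tangent space.
    eta = np.zeros_like(x)
    for _ in range(inner_iters):
        m_grad = g + Hv(eta) + sigma * np.linalg.norm(eta) * eta
        eta = tangent_project(x, eta - lr * m_grad)

    return retract(x, eta)

# Toy usage: leading-eigenvector problem on the sphere,
# f(x) = -(1/n) * sum_i (a_i^T x)^2, with simple per-sample
# gradients and Hessian-vector products.
d, n = 20, 500
A = rng.standard_normal((n, d))
grad_i = lambda x, i: -2.0 * A[i] * np.dot(A[i], x)
hess_vec_i = lambda x, eta, i: -2.0 * A[i] * np.dot(A[i], eta)

x = rng.standard_normal(d)
x /= np.linalg.norm(x)
for _ in range(30):
    x = subsampled_cubic_step(x, grad_i, hess_vec_i, n,
                              b_g=100, b_h=50, sigma=10.0)
# Should approach the top eigenvalue of (1/n) * A^T A.
print("Rayleigh quotient:", x @ (A.T @ A / n) @ x)
```

The cubic term (sigma / 3) * ||eta||^3 is what replaces the trust-region radius: it keeps the model minimizer bounded even when the subsampled Hessian has negative curvature, which is how cubic-regularized methods retain the saddle-escaping behavior the abstract attributes to the trust-region baseline.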
Keywords
Optimization, Cubic regularization, Riemannian manifolds, Subsampling