Convergence analysis of the transformed gradient projection algorithms on compact matrix manifolds
arXiv (2024)
Abstract
In this paper, to address the optimization problem on a compact matrix
manifold, we introduce a novel algorithmic framework called the Transformed
Gradient Projection (TGP) algorithm, using the projection onto this compact
matrix manifold. Compared with the existing algorithms, the key innovation in
our approach lies in the utilization of a new class of search directions and
various stepsizes, including the Armijo, nonmonotone Armijo, and fixed
stepsizes, to guide the selection of the next iterate. Our framework offers
flexibility by encompassing the classical gradient projection algorithms as
special cases while intersecting with the class of retraction-based line-search
algorithms. Notably, when the manifold is the Stiefel or Grassmann manifold,
many existing algorithms in the literature can be seen as specific instances of
our proposed framework, which also induces several new
special cases. Then, we conduct a thorough exploration of the convergence
properties of these algorithms, considering various search directions and
stepsizes. To achieve this, we extensively analyze the geometric properties of
the projection onto compact matrix manifolds, allowing us to extend classical
inequalities related to retractions from the literature. Building upon these
insights, we establish the weak convergence, convergence rate, and global
convergence of TGP algorithms under three distinct stepsizes. In cases where
the compact matrix manifold is the Stiefel or Grassmann manifold, our
convergence results either encompass or surpass those found in the literature.
Finally, through a series of numerical experiments, we observe that the TGP
algorithms, owing to their increased flexibility in choosing search directions,
outperform classical gradient projection and retraction-based line-search
algorithms in several scenarios.
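To make the ingredients of the abstract concrete, the sketch below shows a basic gradient projection step with an Armijo backtracking stepsize on the Stiefel manifold, where the projection onto the manifold is computed from the thin SVD. This is a minimal illustration of the classical gradient projection scheme that the TGP framework generalizes, not the paper's full algorithm: the function names (`proj_stiefel`, `tgp_step`), the parameter defaults, and the test objective (Frobenius distance to a fixed matrix) are all assumptions chosen for the example.

```python
import numpy as np

def proj_stiefel(Y):
    """Projection onto the Stiefel manifold St(n, p) = {X : X^T X = I}.

    The nearest matrix with orthonormal columns is U @ Vt, where
    Y = U @ diag(s) @ Vt is the thin SVD of Y.
    """
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def tgp_step(f, grad_f, X, alpha0=1.0, beta=0.5, sigma=1e-4, max_backtracks=30):
    """One gradient projection step with an Armijo backtracking stepsize.

    Backtracks alpha <- beta * alpha until the sufficient-decrease test
    along the projected path is satisfied.
    """
    G = grad_f(X)
    fX = f(X)
    alpha = alpha0
    for _ in range(max_backtracks):
        X_new = proj_stiefel(X - alpha * G)
        # Armijo test: f(X_new) <= f(X) + sigma * <grad f(X), X_new - X>
        if f(X_new) <= fX + sigma * np.sum(G * (X_new - X)):
            return X_new
        alpha *= beta
    return X_new  # fall back to the smallest trial stepsize
```

As a sanity check, minimizing f(X) = (1/2)||X - A||_F^2 over the Stiefel manifold has the closed-form solution proj_stiefel(A), so the iterates should converge to it while every iterate keeps exactly orthonormal columns.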