
A Linearly Convergent Algorithm for Rotationally Invariant $\ell_1$-Norm Principal Component Analysis

arXiv (Cornell University), 2022

Abstract
For dimensionality reduction on datasets with outliers, $\ell_1$-norm principal component analysis (L1-PCA), a robust alternative to conventional PCA, has enjoyed great popularity in recent years. In this work, we consider a rotationally invariant L1-PCA, which has received little attention in the literature. To tackle it, we propose a proximal alternating linearized minimization method with nonlinear extrapolation for solving its two-block reformulation. Moreover, we show that the proposed method converges at least linearly to a limiting critical point of the reformulated problem. Such a point is proved to be a critical point of the original problem under a condition imposed on the step size. Finally, we conduct numerical experiments on both synthetic and real datasets to support our theoretical developments and demonstrate the efficacy of our approach.
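The abstract states only the high-level approach; the exact two-block reformulation and the extrapolated update rules appear in the paper itself. For orientation, one common rotationally invariant L1-PCA objective is $\min_{U^\top U = I_k} \sum_{i=1}^{n} \lVert x_i - U U^\top x_i \rVert_2$, and the sketch below solves it with the classic iteratively reweighted subspace iteration (in the spirit of R1-PCA), not the authors' proximal alternating linearized minimization with nonlinear extrapolation. The function name `rotational_l1_pca` and all parameter defaults are illustrative assumptions.

```python
import numpy as np

def rotational_l1_pca(X, k, iters=200, tol=1e-10, eps=1e-8):
    """Iteratively reweighted subspace iteration (assumed objective, not the
    paper's PALM method) for
        min_{U^T U = I_k}  sum_i ||x_i - U U^T x_i||_2,
    where X is a d x n data matrix with one sample per column."""
    U = np.linalg.svd(X, full_matrices=False)[0][:, :k]  # warm start: ordinary PCA
    for _ in range(iters):
        R = X - U @ (U.T @ X)                   # per-sample residuals (columns)
        w = 1.0 / np.maximum(np.linalg.norm(R, axis=0), eps)  # IRLS weights
        C = (X * w) @ X.T                       # reweighted covariance, d x d
        vecs = np.linalg.eigh(C)[1]             # eigenvectors, ascending eigenvalues
        U_new = vecs[:, -k:]                    # top-k subspace of C
        if np.linalg.norm(U_new @ U_new.T - U @ U.T) < tol:  # projector converged
            return U_new
        U = U_new
    return U
```

Each iteration majorizes the $\ell_2$ residual norms by quadratics weighted by the previous residuals, so the weighted-covariance step monotonically decreases the objective; downweighting large-residual samples is what gives the formulation its robustness to outliers.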
Key words
principal component analysis, linearly convergent algorithm, rotationally invariant