Robust Principal Component Analysis Using Alpha Divergence

2020 IEEE International Conference on Image Processing (ICIP), 2020

Abstract
In this paper, a new robust principal component analysis (RPCA) method is proposed that extracts the principal components of data corrupted by non-Gaussian outliers. The method is based on the $\alpha$-divergence, a parametric measure from information geometry. It is adjustable through a hyperparameter $\alpha$ and reduces to classical PCA as a particular case. To derive the principal components, the $\alpha$-divergence between the empirical data distribution and the assumed distribution model is minimized with respect to the unknown parameters. The singular value decomposition (SVD) of the estimated covariance matrix is then used to extract the principal directions of the data. The proposed method is applied to several video and signal processing tasks, and the results show its superiority over classical PCA and other existing robust methods.
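The abstract does not spell out the estimation procedure, so the following is only a minimal sketch of the general idea it describes: fit a Gaussian model by down-weighting samples that are unlikely under the current model (a reweighting of the kind that $\alpha$-divergence-based estimators typically produce), then take the SVD of the resulting covariance estimate to obtain the principal directions. The function name `robust_pca_alpha`, the iteration scheme, and the specific weight formula are illustrative assumptions, not the authors' algorithm; with `alpha = 1` the weights become uniform and the sketch reduces to classical PCA, mirroring the special case mentioned in the abstract.

```python
import numpy as np

def robust_pca_alpha(X, alpha=0.8, n_components=2, n_iter=50, eps=1e-12):
    """Hedged sketch of alpha-divergence-flavoured robust PCA.

    X            : (n_samples, n_features) data matrix.
    alpha        : hyperparameter in (0, 1]; alpha = 1 gives uniform
                   weights, i.e. classical PCA in this sketch.
    n_components : number of principal directions to return.
    """
    n, d = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + eps * np.eye(d)

    for _ in range(n_iter):
        # Squared Mahalanobis distances under the current Gaussian model.
        diff = X - mu
        cov_inv = np.linalg.pinv(cov)
        maha = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

        # Down-weight samples that are unlikely under the model; the
        # exponent (1 - alpha) controls how strongly outliers are
        # suppressed (assumed form, not taken from the paper).
        w = np.exp(-0.5 * (1.0 - alpha) * maha)
        w = w / (w.sum() + eps)

        # Re-estimate mean and covariance with the normalized weights.
        mu = w @ X
        diff = X - mu
        cov = (diff * w[:, None]).T @ diff + eps * np.eye(d)

    # Principal directions from the SVD of the robust covariance estimate.
    U, s, _ = np.linalg.svd(cov)
    return U[:, :n_components], s[:n_components]

if __name__ == "__main__":
    # Toy example: low-dimensional data plus a few gross outliers.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))
    X[:10] += 50.0 * rng.normal(size=(10, 5))   # non-Gaussian outliers
    directions, spectrum = robust_pca_alpha(X, alpha=0.8, n_components=2)
    print(directions.shape, spectrum)
```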
Keywords
Robustness, Principal component analysis, Covariance matrices, Estimation, Loading, Singular value decomposition, Sparse matrices