Low Rank Matrix Differential Equations as Deep Kernel Principal Component Analysis

Shimin Shan, Yanbo Wang, Wenyu Chen, Fangshu Cui

2023 China Automation Congress (CAC)

Abstract
The overall goal of this paper is to develop a deep kernel principal component analysis (KPCA) for time-dependent data that are nonlinearly distributed in high dimensions. Instead of treating a temporal, high-dimensional dataset as a collection of stationary snapshots, we propose a continuous method in which a group of nonlinear differential equations is derived from the original KPCA (by projecting the KPCA onto a tangent space), making it suitable for numerical integration. To validate our approach, we perform an error analysis comparing our method, which uses continuous information, with standard KPCA applied to individual snapshots. Specifically, we demonstrate that the errors that naturally occur during any kernelization are not significantly magnified by our dynamic low-rank approximation. In addition, our approach satisfies local quasi-optimality. Finally, we show the excellent performance of our method compared with traditional ones on both synthetic and real datasets. Overall, we develop a novel, differential-equation-based framework aimed at capturing mainly the temporal changes that occur during a dynamic process, making it well suited for dimensionality reduction and classification.
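To make the idea of evolving a low-rank KPCA by numerical integration concrete, the following is a minimal sketch only: it uses a standard Koch–Lubich-style symmetric dynamical low-rank ODE for a time-dependent RBF kernel matrix, integrated with explicit Euler steps. The helper names (`rbf_kernel`, `dlra_step`), the rank, the step size, and the rotating-point-cloud data are illustrative assumptions and are not taken from the paper's actual equations or experiments.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix for the rows of X.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def dlra_step(U, S, dK, dt):
    """One explicit-Euler step of a symmetric dynamical low-rank
    approximation K(t) ~ U(t) S(t) U(t)^T, obtained by projecting the
    kernel's time derivative dK onto the tangent space of the rank-r
    manifold at the current factors (a stand-in for the paper's ODEs)."""
    dS = U.T @ dK @ U                          # evolution of the core
    dU = (dK @ U - U @ dS) @ np.linalg.inv(S)  # (I - U U^T) dK U S^{-1}
    S = S + dt * dS
    U = U + dt * dU
    # Re-orthonormalize U and absorb the correction into S.
    U, R = np.linalg.qr(U)
    S = R @ S @ R.T
    return U, S

# Hypothetical time-dependent data: a slowly rotating point cloud.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
r, dt, gamma = 5, 0.05, 0.5

K = rbf_kernel(X, gamma)
w, V = np.linalg.eigh(K)
U, S = V[:, -r:], np.diag(w[-r:])      # rank-r start from the first snapshot

for step in range(20):
    theta = dt
    R3 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
    X_new = X @ R3.T                   # next snapshot of the data
    K_new = rbf_kernel(X_new, gamma)
    dK = (K_new - K) / dt              # finite-difference kernel derivative
    U, S = dlra_step(U, S, dK, dt)
    K, X = K_new, X_new

# The columns of U track the leading kernel principal directions over time.
```

The point of the tangent-space projection is that the evolving approximation stays on the fixed-rank manifold, so between snapshots only the low-rank factors are integrated rather than re-running a full eigendecomposition of the kernel matrix at every time step.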