Supervised Dimensionality Reduction Methods via Recursive Regression

IEEE Transactions on Neural Networks and Learning Systems (2020)

Abstract
In this article, the recursive problems of both orthogonal linear discriminant analysis (OLDA) and orthogonal least squares regression (OLSR) are investigated. Unlike prior work, the associated recursive problems are addressed via a novel recursive regression method, which performs dimensionality reduction in the orthogonal complement space heuristically. For the OLDA, an efficient method is developed to obtain the associated optimal subspace, which is closely related to the orthonormal basis of the optimal solution to ridge regression. For the OLSR, a scalable subspace is introduced to build an OLSR with optimal scaling (OS). By further relaxing the proposed problem into a convex parameterized orthogonal quadratic problem, an effective approach is derived such that not only the optimal subspace but also the OS can be obtained automatically. Accordingly, two supervised dimensionality reduction methods are proposed by obtaining heuristic solutions to the recursive problems of the OLDA and the OLSR.
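The abstract states that the OLDA subspace is closely related to the orthonormal basis of the optimal ridge regression solution. As a rough illustration of that connection only (the paper's actual recursive algorithm is not given in the abstract), one can solve the ridge problem in closed form and orthonormalize the result via QR; the function name, matrix shapes, and regularization parameter below are our assumptions, not the authors' specification.

```python
import numpy as np

def ridge_orthonormal_subspace(X, Y, lam=1.0):
    """Illustrative sketch (not the paper's exact method).

    X   : (d, n) data matrix, features x samples.
    Y   : (n, c) class indicator / target matrix.
    lam : ridge regularization parameter (assumed hyperparameter).

    Solves the ridge problem W = (X X^T + lam I)^{-1} X Y, then returns
    an orthonormal basis of span(W) via QR, giving an orthogonal
    projection usable for supervised dimensionality reduction.
    """
    d = X.shape[0]
    W = np.linalg.solve(X @ X.T + lam * np.eye(d), X @ Y)
    Q, _ = np.linalg.qr(W)  # Q: (d, c), columns orthonormal
    return Q
```

Projecting the data as `Q.T @ X` then yields a c-dimensional supervised embedding; the columns of `Q` satisfy `Q.T @ Q = I` by construction.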
Keywords
Optimal scaling (OS), orthogonal least squares regression (OLSR), orthogonal linear discriminant analysis (OLDA), recursive regression, supervised dimensionality reduction