Improved Estimation of High-dimensional Additive Models Using Subspace Learning

JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS (2022)

Abstract
Additive models have been widely used as a flexible nonparametric regression framework that can overcome the curse of dimensionality. Using sparsity-inducing penalties for variable selection, several methods have been developed for fitting additive models when the number of predictors is very large, sometimes even larger than the sample size. However, despite good asymptotic properties, the finite-sample performance of these methods may deteriorate considerably when the number of relevant predictors becomes moderately large. This article proposes a new method that reduces the number of unknown functions to be estimated nonparametrically by learning a predictive subspace representation shared by the additive component functions. The subspace learning is integrated with sparsity-inducing penalization in a penalized least squares formulation, and an efficient algorithm, combining optimization on the Stiefel matrix manifold with proximal thresholding operators on matrices, is developed for the computation. Theoretical convergence properties of the algorithm are studied. The proposed method is shown to be competitive with existing methods in simulation studies and a real data example. Supplementary materials for this article are available online.
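To fix ideas, the following is one plausible spline-based formulation consistent with the abstract and the keywords; the notation (basis dimension d, subspace rank r, score matrix A) is ours and need not match the paper's. With a common d-dimensional polynomial spline basis ψ(·), each component function is approximated by f_j(x) ≈ ψ(x)^⊤β_j, and the coefficients are collected into B = (β_1, …, β_p)^⊤ ∈ ℝ^{p×d}. A subspace shared by the component functions amounts to a low-rank factorization B = A U^⊤ with U ∈ St(d, r) (orthonormal columns, r ≪ d), so that only the r learned basis functions g_k(x) = (U^⊤ψ(x))_k need to be estimated nonparametrically. A penalized least squares formulation combining this factorization with adaptive group lasso selection could read

\[
\min_{A \in \mathbb{R}^{p \times r},\; U^\top U = I_r}\;
\frac{1}{2n}\sum_{i=1}^{n}\Big(y_i - \sum_{j=1}^{p} a_j^\top U^\top \psi(x_{ij})\Big)^{2}
+ \lambda \sum_{j=1}^{p} w_j \,\lVert a_j \rVert_2,
\]

where a_j^⊤ is the j-th row of A and the weights w_j make the group penalty adaptive; a row shrunk exactly to zero removes predictor j from the model.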
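Below is a minimal NumPy sketch of the two computational primitives the abstract names: a QR retraction for Stiefel-manifold updates of U, and a row-wise proximal (group soft-) thresholding operator for A. It is an illustration under the formulation above, not the paper's implementation; the gradients grad_A and grad_U are assumed to come from the least squares loss.

```python
import numpy as np

def qr_retraction(U):
    """Map a d x r matrix back onto the Stiefel manifold St(d, r)
    via the QR decomposition, with signs fixed for uniqueness."""
    Q, R = np.linalg.qr(U)
    return Q * np.sign(np.diag(R))

def group_soft_threshold(A, tau):
    """Proximal operator of tau * sum_j ||a_j||_2, applied row-wise:
    shrinks each row of A toward zero and sets weak rows exactly to 0."""
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return A * scale

def alternating_step(A, U, grad_A, grad_U, step, lam):
    """One illustrative alternating update (not the paper's exact scheme):
    a proximal gradient step on the score matrix A enforces row sparsity;
    a projected gradient step with QR retraction keeps U orthonormal."""
    # Sparsity step: forward gradient step, then group soft-thresholding.
    A_new = group_soft_threshold(A - step * grad_A, step * lam)
    # Stiefel step: project the Euclidean gradient onto the tangent
    # space at U, i.e. G - U * sym(U^T G), then step and retract.
    sym = (U.T @ grad_U + grad_U.T @ U) / 2.0
    U_new = qr_retraction(U - step * (grad_U - U @ sym))
    return A_new, U_new
```

In a full algorithm these two updates would alternate until the penalized objective stabilizes, with λ chosen by a criterion such as cross-validation.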
Keywords
Adaptive group Lasso, Dimensionality reduction, Low rank approximation, Polynomial splines, Sparsity, Variable selection