Group Sparse Additive Machine with Average Top-k Loss

Neurocomputing (2020)

Abstract
Sparse additive models have shown competitive performance in high-dimensional variable selection and prediction due to their representational flexibility and interpretability. Although their theoretical properties have been studied extensively, few works have addressed the robustness of sparse additive models. In this paper, we employ the robust average top-k (ATk) loss as the classification error measure and propose a new sparse algorithm, named the ATk group sparse additive machine (ATk-GSAM). Beyond its robustness, ATk-GSAM adapts well to the data by integrating a data-dependent hypothesis space with a group sparse regularizer. A generalization error bound is established via concentration estimates with empirical covering numbers. In particular, our error analysis shows that ATk-GSAM can achieve the learning rate O(n^{-1/2}) under appropriate conditions. We further analyze the robustness of ATk-GSAM through a sample-weighted procedure interpretation, and provide theoretical guarantees on grouped variable selection. Experimental evaluations on both simulated and benchmark datasets validate the effectiveness and robustness of the new algorithm.
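For background, the ATk loss referenced in the abstract is, in its standard form, the average of the k largest per-sample losses; it interpolates between the ordinary average loss (k = n) and the maximum loss (k = 1), which is what limits the influence of outliers. Below is a minimal NumPy sketch of this idea (an illustrative helper for the loss itself, not the paper's ATk-GSAM optimization algorithm; the function name and toy data are ours):

import numpy as np

def average_top_k_loss(losses, k):
    # ATk loss: mean of the k largest per-sample losses.
    # k = n recovers the usual average loss; k = 1 the maximum loss.
    sorted_desc = np.sort(np.asarray(losses))[::-1]
    return sorted_desc[:k].mean()

# Toy usage with per-sample hinge losses for a classifier score f(x).
y = np.array([1, -1, 1, 1, -1])                  # labels
scores = np.array([0.8, 0.3, -0.2, 1.5, -1.0])   # f(x) values
hinge = np.maximum(0.0, 1.0 - y * scores)        # per-sample hinge loss
print(average_top_k_loss(hinge, k=2))            # mean of the 2 largest losses

Minimizing this quantity over a group-sparse additive hypothesis space is, at a high level, what distinguishes ATk-GSAM from least-squares or average-loss variants of the group sparse additive machine.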
Keywords
Average top-k loss, Additive models, Generalization error, Data-dependent hypothesis space, Robustness