Feature Selection Boosted by Unselected Features

IEEE Transactions on Neural Networks and Learning Systems (2022)

Cited by 21 | Viewed 54
Abstract
Feature selection aims to select strongly relevant features and discard the rest. Recently, embedded feature selection methods, which incorporate feature-weight learning into the training process of a classifier, have attracted much attention. However, traditional embedded methods focus only on the combinatorial optimality of the selected features as a whole. They sometimes select weakly relevant features that happen to combine well and leave out strongly relevant ones, thereby degrading generalization performance. To address this issue, we propose a novel embedded framework for feature selection, termed feature selection boosted by unselected features (FSBUF). Specifically, we introduce an extra classifier for the unselected features into the traditional embedded model and jointly learn the feature weights so as to maximize the classification loss of the unselected features. As a result, the extra classifier recycles unselected strongly relevant features to replace weakly relevant features in the selected feature subset. The final objective can be formulated as a minimax optimization problem, for which we design an effective gradient-based algorithm. Furthermore, we theoretically prove that FSBUF improves the generalization ability of traditional embedded feature selection methods. Extensive experiments on synthetic and real-world data sets demonstrate the comprehensibility and superior performance of FSBUF.
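The abstract describes the core mechanism concretely enough for a rough illustration. The sketch below is our reading of that minimax idea, not the paper's algorithm: the soft sigmoid mask over feature weights, the logistic losses, the alternating gradient updates, and the trade-off weight lam are all assumptions we introduce; FSBUF's actual formulation and solver are defined in the paper itself.

    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)

    # Hypothetical toy data: 256 samples, 20 features, binary labels.
    n, d = 256, 20
    X = torch.randn(n, d)
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).float()

    w = torch.zeros(d, requires_grad=True)   # feature-weight logits (learned)
    f = torch.nn.Linear(d, 1)                # classifier on selected features
    g = torch.nn.Linear(d, 1)                # extra classifier on unselected features

    opt_fw = torch.optim.Adam([w] + list(f.parameters()), lr=1e-2)
    opt_g = torch.optim.Adam(g.parameters(), lr=1e-2)
    lam = 1.0                                # assumed trade-off hyperparameter

    for step in range(500):
        mask = torch.sigmoid(w)              # soft selection weights in (0, 1)

        # Inner step: the extra classifier fits the unselected features
        # (mask is detached so this step only trains g).
        loss_g = F.binary_cross_entropy_with_logits(
            g(X * (1.0 - mask.detach())).squeeze(-1), y)
        opt_g.zero_grad()
        loss_g.backward()
        opt_g.step()

        # Outer step: minimize the loss on selected features while
        # maximizing the extra classifier's loss, which pushes strongly
        # relevant features out of the unselected set and into the mask.
        loss_f = F.binary_cross_entropy_with_logits(
            f(X * mask).squeeze(-1), y)
        loss_u = F.binary_cross_entropy_with_logits(
            g(X * (1.0 - mask)).squeeze(-1), y)
        opt_fw.zero_grad()
        (loss_f - lam * loss_u).backward()
        opt_fw.step()

    print("learned feature weights:", torch.sigmoid(w).detach())

The alternating updates are the standard gradient treatment of a minimax objective: the inner loop keeps the extra classifier a strong adversary on the unselected features, so the outer loop can only drive its loss up by moving the relevant features into the selected set.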
Keywords
Feature selection, generalization ability, joint learning, minimax optimization, unselected features