PAFS - An Efficient Method for Classifier-Specific Feature Selection

2016 IEEE Symposium Series on Computational Intelligence (SSCI)(2016)

Abstract
An optimal classification model for a given problem should comprise a classifier, a proper feature subset, and a parameter set such that the classifier attains prediction performance as high as possible. Many recent feature selection methods are either too exhaustive or too greedy. In addition, many classification approaches conduct the parameter search after the feature selection stage, resulting in classification results that are not as optimal as they could be. In this study, we propose a new greedy selection method, called Parallel Apriori-like Feature Selection (PAFS), which searches for an optimal classification model in the combined space of features and parameters. Moreover, its greedy search behavior is controllable by running options, so it is flexible for different problems. We also devised a Tree-based Classifier Model (TCM) algorithm which wraps PAFS to solve multi-class problems. Our methods achieved excellent results when applied to two multi-class datasets. In particular, on a breast cancer dataset consisting of 5 classes and 13,582 features, our methods selected feature subsets of no more than 10 features, each with a prediction accuracy of at least 94%.
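The abstract describes two ideas that combine naturally: Apriori-like growth of candidate feature subsets (larger subsets are joined from surviving smaller ones) and scoring each subset jointly over a parameter grid rather than after selection. The sketch below is a hypothetical, minimal illustration of that combination, not the authors' actual PAFS algorithm: the toy nearest-centroid classifier, its `shrink` parameter, the beam width, and the synthetic data generator are all illustrative assumptions.

```python
# Hedged sketch of an Apriori-like wrapper feature selection with a joint
# parameter grid search. NOT the published PAFS algorithm: classifier,
# parameter, and pruning choices here are illustrative assumptions.
import itertools
import random

random.seed(0)

def make_data(n=40, d=6):
    """Toy binary dataset: features 0 and 3 carry the class signal."""
    X, y = [], []
    for i in range(n):
        label = i % 2
        row = [random.gauss(0, 1) for _ in range(d)]
        row[0] += 2.0 * label   # informative feature
        row[3] -= 2.0 * label   # informative feature
        X.append(row)
        y.append(label)
    return X, y

def loo_accuracy(X, y, feats, shrink):
    """Leave-one-out accuracy of a nearest-centroid classifier restricted
    to `feats`; `shrink` is the (hypothetical) parameter searched jointly
    with the subset, pulling centroids toward the origin."""
    correct = 0
    for i in range(len(X)):
        cents = {}
        for c in set(y):
            rows = [X[j] for j in range(len(X)) if j != i and y[j] == c]
            cents[c] = [sum(r[f] for r in rows) / len(rows) * (1 - shrink)
                        for f in feats]
        pred = min(cents, key=lambda c: sum(
            (X[i][f] - m) ** 2 for f, m in zip(feats, cents[c])))
        correct += (pred == y[i])
    return correct / len(X)

def pafs_like(X, y, grid=(0.0, 0.1), beam=3, max_size=3):
    """Grow feature subsets Apriori-style, scoring each candidate over the
    full parameter grid; keep only the top `beam` subsets per level."""
    d = len(X[0])
    # Level 1: evaluate every singleton subset over the parameter grid.
    frontier = [((f,), max(loo_accuracy(X, y, (f,), s) for s in grid))
                for f in range(d)]
    best = max(frontier, key=lambda t: t[1])
    for size in range(2, max_size + 1):
        frontier.sort(key=lambda t: -t[1])
        survivors = [s for s, _ in frontier[:beam]]   # greedy pruning
        # Apriori-style join: union pairs of surviving subsets into
        # candidates of the next size.
        cands = {tuple(sorted(set(a) | set(b)))
                 for a, b in itertools.combinations(survivors, 2)
                 if len(set(a) | set(b)) == size}
        if not cands:
            break
        frontier = [(c, max(loo_accuracy(X, y, c, s) for s in grid))
                    for c in cands]
        top = max(frontier, key=lambda t: t[1])
        if top[1] > best[1]:
            best = top
    return best

X, y = make_data()
feats, acc = pafs_like(X, y)
print(feats, round(acc, 2))
```

Because each candidate subset is scored with its best parameter setting from the grid, the subset ranking already reflects the parameter search, which is the motivation the abstract gives for searching the combined space instead of tuning parameters after selection.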
Keywords
Feature selection, grid search, Parallel Apriori-like Feature Selection, PAFS, Tree-based Model for Multi-class Problem, optimal classification model search