Simultaneous feature selection and weighting – An evolutionary multi-objective optimization approach
Pattern Recognition Letters (2015)
Abstract
Feature subset selection is a preprocessing step in computational learning that serves several purposes: reducing the dimensionality of a dataset, decreasing the computational time required for classification, and enhancing the classification accuracy of a classifier by removing redundant, misleading, or erroneous features. This paper presents a new feature selection and weighting method aided by the decomposition-based evolutionary multi-objective algorithm MOEA/D. Features are selected and weighted (scaled) simultaneously so as to project the data points into a space where the distance between data points of non-identical classes is increased, making them easier to classify. The inter-class and intra-class distances are optimized simultaneously using MOEA/D to obtain the optimal features and the scaling factors associated with them. Finally, k-NN (k-Nearest Neighbor) is used to classify the data points using the reduced and weighted feature set. The proposed algorithm is tested on several practical datasets from well-known data repositories such as UCI and LIBSVM, and the results are compared with those obtained by state-of-the-art algorithms to demonstrate its superiority.

Highlights:
- Presents a simultaneous feature selection and weighting method.
- Uses a penalty to reduce the number of selected features.
- Uses the highly competitive MOEA/D as the core optimizer.
- Selects the best compromise solution to obtain the final feature selection and weighting vector.
- Evaluated on UCI and LIBSVM datasets.
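The two objectives the abstract describes (intra-class distance to be minimized, inter-class distance to be maximized under a candidate feature-weight vector, where a weight of zero deselects a feature) can be sketched as below. This is a minimal illustration of the fitness evaluation, not the paper's implementation; the function name and the choice of mean pairwise Euclidean distance are assumptions.

```python
import numpy as np

def class_distances(X, y, w):
    """Evaluate one candidate solution: scale each feature of X by its
    weight in w (weight 0 deselects the feature), then return the mean
    pairwise Euclidean distance within classes (intra) and between
    classes (inter). MOEA/D would minimize intra and maximize inter,
    typically with an added penalty on the number of selected features.
    Illustrative sketch; the paper's exact distance definitions may differ."""
    Z = X * w                          # project data by the weight vector
    intra, inter = [], []
    n = len(Z)
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(Z[i] - Z[j])
            (intra if y[i] == y[j] else inter).append(d)
    return float(np.mean(intra)), float(np.mean(inter))
```

On a toy two-class dataset, a weight vector that keeps a discriminative feature yields an inter-class distance much larger than the intra-class distance, which is exactly the separation the optimizer seeks.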
Keywords
Feature selection, Feature weighting, Evolutionary multi-objective optimization, MOEA/D, Inter- and intra-class distances