An Efficient Chaotic Gradient-Based Optimizer for Feature Selection

IEEE Access (2022)

Abstract
In many applications, selecting the optimal features is a difficult task. Numerous optimization problems, e.g., the feature selection (FS) problem, have been solved using optimization algorithms. In this paper, the most discriminating features are chosen using a new chaotic gradient-based optimizer (CGBO) that combines chaotic maps with the search iterations of the gradient-based optimizer (GBO). Ten chaotic maps were used to update the parameters, escape local optima and premature convergence, accelerate convergence, and enhance the efficiency of GBO. FS approaches employ a classifier to evaluate candidate feature subsets; the proposed CGBO uses the k-nearest neighbors classifier in its objective function. Ten datasets from the UCI machine learning repository were used to validate CGBO. In the experiments, CGBO outperformed five other metaheuristic algorithms: particle swarm optimization (PSO), the moth-flame optimizer (MFO), the sine cosine algorithm (SCA), the salp swarm algorithm (SSA), and GBO. The results demonstrate the capability of CGBO to find the optimal feature subset, maximizing classification performance while minimizing the number of selected features compared with the other metaheuristic algorithms.
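The core mechanism described above (a chaotic map replacing uniform random draws inside a GBO-style search, scored by a k-NN wrapper objective) can be illustrated with a minimal Python sketch. This is not the authors' CGBO implementation: the logistic map (one of many common chaotic maps; the paper uses ten), the bit-flip move, the 0.99/0.01 fitness weighting, and the breast-cancer dataset standing in for the UCI benchmarks are all assumptions made here for illustration.

# Minimal sketch of the chaotic-parameter idea from the abstract;
# NOT the authors' exact CGBO. The logistic map, the alpha = 0.99
# fitness weighting, and the binary-mask representation are common
# wrapper-FS conventions and are assumptions in this sketch.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)  # stand-in for a UCI dataset
n_features = X.shape[1]

def logistic_map(x):
    # One common chaotic map (the paper uses ten different maps).
    return 4.0 * x * (1.0 - x)

def fitness(mask, alpha=0.99):
    # Weighted sum of k-NN error and subset size: a standard FS objective.
    if not mask.any():
        return 1.0  # treat the empty subset as worst-case
    error = 1.0 - cross_val_score(
        KNeighborsClassifier(n_neighbors=5), X[:, mask], y, cv=3
    ).mean()
    return alpha * error + (1.0 - alpha) * mask.sum() / n_features

# Chaotic search loop: the chaotic value, not a uniform random draw,
# modulates the move each iteration (the core CGBO idea).
best_mask = rng.random(n_features) < 0.5
best_fit = fitness(best_mask)
chaos = 0.7  # initial chaotic state (assumption)
for _ in range(30):
    chaos = logistic_map(chaos)
    # Flip each feature bit with a chaos-driven probability (illustrative move).
    flip = rng.random(n_features) < 0.1 + 0.2 * chaos
    candidate = np.where(flip, ~best_mask, best_mask)
    cand_fit = fitness(candidate)
    if cand_fit < best_fit:
        best_mask, best_fit = candidate, cand_fit

print(f"selected {best_mask.sum()}/{n_features} features, fitness={best_fit:.4f}")

The design point the sketch tries to capture is that the chaotic sequence, rather than an independent uniform draw, drives the per-iteration perturbation strength; this ergodic, non-repeating behavior is what the paper credits for avoiding premature convergence.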
Keywords
Optimization, classification algorithms, feature extraction, genetic algorithms, metaheuristic algorithms, heuristic algorithms, convergence, feature selection, chaos theory, gradient-based optimizer (GBO), k-nearest neighbors