
Feature selection for high-dimensional regression via sparse LSSVR based on Lp-norm

International Journal of Intelligent Systems (2021)

Abstract
Many regression problems involve a large number of input features, yet not all of them are relevant to the regression at hand, and including irrelevant features can degrade learning performance. Selecting the most relevant features is therefore essential, especially for high-dimensional regression. Feature selection is an effective way to address this problem: it represents the original data through the relevant features that carry useful information. In this paper, to effectively select useful features in least squares support vector regression (LSSVR), we propose a novel sparse LSSVR based on the Lp-norm (SLSSVR), 0 < p <= 1. Unlike the existing L1-norm LSSVR (L1-LSSVR) and Lp-norm LSSVR (Lp-LSSVR), SLSSVR uses a smooth approximation of the nonsmooth, nonconvex Lp-norm term together with an effective solving algorithm. The proposed algorithm avoids the singularity issue that may be encountered in Lp-LSSVR, and its convergence is also guaranteed. Experimental results support the effectiveness of SLSSVR in both feature selection ability and regression performance.
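The abstract does not give the paper's exact algorithm, but the idea of smoothing a nonsmooth, nonconvex Lp-norm penalty can be sketched with a common technique: replace |w_i|^p by (w_i^2 + eps)^(p/2) and solve the resulting problem by iterative reweighting. The function below is a hypothetical linear-LSSVR illustration under that assumption (the names, parameters, and solver are not from the paper); the eps term plays the role of avoiding singular reweighting matrices.

```python
import numpy as np

def sparse_lssvr_lp(X, y, p=0.5, C=10.0, eps=1e-6, n_iter=50):
    """Illustrative linear sparse LSSVR: smoothed Lp penalty,
    |w_i|^p ~ (w_i^2 + eps)^(p/2), solved by iterative reweighting.
    NOT the paper's algorithm -- a generic sketch of the technique."""
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])          # absorb bias into augmented weights
    w = np.linalg.lstsq(Xb, y, rcond=None)[0]     # ordinary least-squares start
    for _ in range(n_iter):
        # gradient weight of the smoothed penalty; eps > 0 keeps it finite (nonsingular)
        dvec = p * (w[:d] ** 2 + eps) ** (p / 2 - 1)
        D = np.diag(np.append(dvec, 0.0))         # bias term is not penalized
        # weighted ridge step: solve (D + C * Xb^T Xb) w = C * Xb^T y
        w_new = np.linalg.solve(D + C * Xb.T @ Xb, C * Xb.T @ y)
        if np.linalg.norm(w_new - w) < 1e-8:
            w = w_new
            break
        w = w_new
    return w[:d], w[d]

# tiny demo: only the first of 10 features is relevant
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 2.0 * X[:, 0] + 0.01 * rng.normal(size=100)
w, b = sparse_lssvr_lp(X, y)
```

In this setup the reweighting drives the coefficients of the nine irrelevant features toward zero while keeping the relevant one near its true value, which is the feature-selection behavior the abstract attributes to sparse Lp-norm regularization.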
Key words
feature selection, least squares support vector regression, sparseness, support vector regression