A Novel Two-step Sparse Learning Approach for Variable Selection and Optimal Predictive Modeling

IFAC-PapersOnLine (2022)

Abstract
In this paper, a two-step sparse learning approach is proposed for variable selection and model parameter estimation, with optimally tuned hyperparameters in each step. In Step one, a sparse learning algorithm is applied to all data to produce a sequence of candidate subsets of selected variables by varying the hyperparameter value. In Step two, for each subset of selected variables from Step one, Lasso, ridge regression, elastic-net, or adaptive Lasso is employed to find the optimal hyperparameters with the best cross-validation error. Among all subsets, the one with the overall minimum cross-validation error is selected as globally optimal. The effectiveness of the proposed approach is demonstrated using an industrial NOx emission dataset and the Dow challenge dataset to predict product impurity.
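The following is a minimal sketch of the two-step procedure described in the abstract, assuming a scikit-learn style implementation. The function name two_step_sparse_selection, the hyperparameter grid, and the particular cross-validated estimators used in Step two are illustrative assumptions, not the authors' code.

```python
# Sketch of the two-step approach: Step 1 varies a Lasso hyperparameter to
# generate candidate variable subsets; Step 2 tunes a regularized model on
# each subset by cross-validation and keeps the subset with the lowest
# cross-validation error.
import numpy as np
from sklearn.linear_model import Lasso, LassoCV, RidgeCV, ElasticNetCV
from sklearn.model_selection import cross_val_score

def two_step_sparse_selection(X, y, alphas=np.logspace(-3, 1, 30)):
    # Step 1: sweep the regularization strength to obtain candidate subsets.
    subsets = []
    for alpha in alphas:
        coef = Lasso(alpha=alpha, max_iter=10000).fit(X, y).coef_
        support = np.flatnonzero(coef)
        if support.size and not any(np.array_equal(support, s) for s in subsets):
            subsets.append(support)

    # Step 2: for each candidate subset, tune Lasso / ridge / elastic-net by
    # cross-validation and track the overall best cross-validation error.
    best = None
    for support in subsets:
        Xs = X[:, support]
        for est in (LassoCV(cv=5, max_iter=10000),
                    RidgeCV(),
                    ElasticNetCV(cv=5, max_iter=10000)):
            score = cross_val_score(est, Xs, y, cv=5,
                                    scoring="neg_mean_squared_error").mean()
            if best is None or score > best[0]:
                best = (score, support, est)

    if best is None:
        raise ValueError("No non-empty variable subset was found in Step 1.")
    cv_score, support, estimator = best
    # Refit the winning estimator on the selected variables; return the
    # selected indices, the fitted model, and the cross-validation MSE.
    return support, estimator.fit(X[:, support], y), -cv_score
```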
Key words
Inferential modeling, sparse statistical learning, variable selection, regularization, industrial applications