Quantile universal threshold

ELECTRONIC JOURNAL OF STATISTICS (2017)

Abstract
Efficient recovery of a low-dimensional structure from high-dimensional data has been pursued in various settings including wavelet denoising, generalized linear models and low-rank matrix estimation. By thresholding some parameters to zero, estimators such as lasso, elastic net and subset selection perform variable selection. One crucial step challenges all these estimators: choosing the amount of thresholding, governed by a threshold parameter λ. If too large, important features are missing; if too small, incorrect features are included. Within a unified framework, we propose a selection of λ at the detection edge. To that aim, we introduce the concepts of a zero-thresholding function and of a null-thresholding statistic, which we explicitly derive for a large class of estimators. The new approach has the great advantage of transforming the selection of λ from an unknown scale to a probabilistic scale. Numerical results show the effectiveness of our approach in terms of model selection and prediction.
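
The abstract does not spell out the procedure, but the idea can be illustrated with a small sketch. The Python snippet below is a minimal illustration, assuming the lasso formulation min_β ½‖y − Xβ‖² + λ‖β‖₁, whose smallest all-zeroing penalty is ‖Xᵀy‖∞ (the smallest λ for which the lasso estimate is identically zero); the noise level sigma, the level alpha and the Monte Carlo size n_mc are illustrative inputs, not values prescribed by the paper.

```python
# Minimal sketch: threshold selection as an upper quantile of the
# null-thresholding statistic under the pure-noise (null) model.
import numpy as np

def null_thresholding_statistic(X, y):
    """Smallest lambda that sets all lasso coefficients to zero,
    for the objective 0.5*||y - X b||^2 + lambda*||b||_1."""
    return np.max(np.abs(X.T @ y))

def quantile_universal_threshold(X, sigma=1.0, alpha=0.05, n_mc=1000, seed=None):
    """Monte Carlo estimate of the (1 - alpha) quantile of the
    null-thresholding statistic when the response is pure Gaussian noise.
    sigma, alpha and n_mc are illustrative choices."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    stats = np.empty(n_mc)
    for b in range(n_mc):
        eps = sigma * rng.standard_normal(n)          # null model: y = noise
        stats[b] = null_thresholding_statistic(X, eps)
    return np.quantile(stats, 1.0 - alpha)
```

The returned threshold can then be passed as the regularization parameter to any lasso solver written for the same penalized objective; the point of the construction is that λ is chosen on a probabilistic scale (a quantile under the null) rather than on the unknown scale of the data.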
Keywords
Convex optimization, high-dimensionality, sparsity, regularization, thresholding