
On the Robustness of Minimum Norm Interpolators and Regularized Empirical Risk Minimizers

Annals of Statistics (2022)

Abstract
This article develops a general theory for minimum norm interpolating estimators and regularized empirical risk minimizers (RERM) in linear models in the presence of additive, potentially adversarial, errors. In particular, no conditions on the errors are imposed. A quantitative bound for the prediction error is given, relating it to the Rademacher complexity of the covariates, the norm of the minimum norm interpolator of the errors, and the size of the subdifferential around the true parameter. The general theory is illustrated for Gaussian features and several norms: the ℓ1, ℓ2, group Lasso, and nuclear norms. In the case of sparsity- or low-rank-inducing norms, minimum norm interpolators and RERM yield a prediction error of the order of the average noise level, provided that the overparameterization exceeds the number of samples by at least a logarithmic factor and that, in the case of RERM, the regularization parameter is small enough. Lower bounds showing near-optimality of these results complement the analysis.
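The two estimators the abstract compares are easy to state concretely. Below is a minimal Python sketch, not taken from the paper, of an overparameterized linear model y = Xβ* + ξ with p ≫ n: the minimum ℓ2-norm interpolator (the least-norm solution of Xb = y, computed via the pseudoinverse) and an ℓ1-penalized RERM with a small regularization parameter, solved here by plain proximal gradient descent (ISTA). The dimensions, the Gaussian noise, and the solver are illustrative assumptions; the paper's theory also covers adversarial errors and other norms.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, s = 50, 2000, 5                      # overparameterized: p >> n (illustrative sizes)
X = rng.standard_normal((n, p))            # Gaussian features, as in the paper's examples
beta_star = np.zeros(p)
beta_star[:s] = 1.0                        # s-sparse ground truth
xi = 0.1 * rng.standard_normal(n)          # additive errors (Gaussian here; the theory
                                           # allows adversarial xi with no conditions)
y = X @ beta_star + xi

# Minimum l2-norm interpolator: the least-norm solution of X b = y.
# With n < p and Gaussian X, X has full row rank a.s., so this interpolates exactly.
beta_mn = np.linalg.pinv(X) @ y

# RERM with an l1 penalty: minimize (1/(2n))||y - Xb||^2 + lam * ||b||_1,
# solved by ISTA with a small regularization parameter, per the abstract.
lam = 0.01
L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the smooth part's gradient
step = 1.0 / L
b = np.zeros(p)
for _ in range(2000):
    grad = X.T @ (X @ b - y) / n           # gradient of the quadratic term
    z = b - step * grad
    b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-thresholding prox

# In-sample prediction error ||X(b - beta*)||_2^2 / n for both estimators.
for name, est in [("min-l2-norm interpolator", beta_mn), ("l1-RERM (ISTA)", b)]:
    print(name, float(np.mean((X @ (est - beta_star)) ** 2)))
```

Per the abstract's bounds, when β* is sparse and the overparameterization exceeds n by more than a logarithmic factor, the prediction error of both estimators should be of the order of the average noise level.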
Key words
Sparse linear regression, regularization, basis pursuit, trace regression, interpolation, minimum norm interpolation