Asymptotically free sketched ridge ensembles: Risks, cross-validation, and tuning
arXiv (2023)
Abstract
We employ random matrix theory to establish consistency of generalized cross
validation (GCV) for estimating prediction risks of sketched ridge regression
ensembles, enabling efficient and consistent tuning of regularization and
sketching parameters. Our results hold for a broad class of asymptotically free
sketches under very mild data assumptions. For squared prediction risk, we
provide a decomposition into an unsketched equivalent implicit ridge bias and a
sketching-based variance, and prove that the risk can be globally optimized by
only tuning sketch size in infinite ensembles. For general subquadratic
prediction risk functionals, we extend GCV to construct consistent risk
estimators, and thereby obtain distributional convergence of the GCV-corrected
predictions in Wasserstein-2 metric. This in particular allows construction of
prediction intervals with asymptotically correct coverage conditional on the
training data. We also propose an "ensemble trick" whereby the risk for
unsketched ridge regression can be efficiently estimated via GCV using small
sketched ridge ensembles. We empirically validate our theoretical results using
both synthetic and real large-scale datasets with practical sketches including
CountSketch and subsampled randomized discrete cosine transforms.
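To make the objects in the abstract concrete, the following is a minimal, illustrative sketch (not the paper's estimator): it fits an ensemble of sketched ridge regressors using independent CountSketches applied to the rows of the data, averages the coefficients, and computes the classical GCV estimate for unsketched ridge as a baseline. All dimensions, the penalty `lam`, and the ensemble size `K` are arbitrary choices for illustration; the paper's contribution is a corrected GCV that is consistent for the sketched ensemble itself, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, m, K = 500, 50, 200, 10   # samples, features, sketch size, ensemble size
lam = 1.0                        # ridge penalty (illustrative value)

# Synthetic linear-model data.
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p) / np.sqrt(p)
y = X @ beta + 0.5 * rng.standard_normal(n)

def countsketch(A, m, rng):
    """Apply an m-row CountSketch to the rows of A: each row is hashed
    to one of m buckets with a random sign and the buckets are summed."""
    h = rng.integers(0, m, size=A.shape[0])
    s = rng.choice([-1.0, 1.0], size=A.shape[0])
    S = np.zeros((m, A.shape[0]))
    S[h, np.arange(A.shape[0])] = s
    return S @ A

# Ensemble of sketched ridge estimators: fit ridge on (S X, S y) for
# K independent sketches (the same S applied to X and y) and average.
betas = []
for _ in range(K):
    sketched = countsketch(np.hstack([X, y[:, None]]), m, rng)
    Xs, ys = sketched[:, :p], sketched[:, p]
    betas.append(np.linalg.solve(Xs.T @ Xs + lam * np.eye(p), Xs.T @ ys))
beta_ens = np.mean(betas, axis=0)

# Classical GCV for unsketched ridge: training error / (1 - df/n)^2,
# where df is the trace of the ridge smoother matrix.
beta_full = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
gcv = np.mean((y - X @ beta_full) ** 2) / (1 - np.trace(H) / n) ** 2
print(gcv > 0, np.all(np.isfinite(beta_ens)))
```

With the penalty and sketch size both tuned, the abstract's "ensemble trick" goes in the opposite direction: small sketched ensembles can be used to estimate the risk of the unsketched ridge estimator cheaply.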