Inference for Projection Parameters in Linear Regression: beyond d = o(n^1/2)

arXiv (Cornell University), 2023

Abstract
We consider the problem of inference for projection parameters in linear regression with increasing dimensions. This problem has been studied under a variety of assumptions in the literature. The classical asymptotic normality result for the least squares estimator of the projection parameter only holds when the dimension d of the covariates is of smaller order than n^{1/2}, where n is the sample size. Traditional sandwich-estimator-based Wald intervals are asymptotically valid in this regime. In this work, we propose a bias correction for the least squares estimator and prove the asymptotic normality of the resulting debiased estimator. Precisely, we provide an explicit finite-sample Berry-Esseen bound on the normal approximation to the law of linear contrasts of the proposed estimator, normalized by the sandwich standard error estimate. Our bound, which requires only finite moment conditions on covariates and errors, tends to 0 as long as d = o(n^{2/3}) up to polylogarithmic factors. Furthermore, we leverage recent methods of statistical inference that do not require an estimator of the variance to perform asymptotically valid inference, and that lead to sharper miscoverage control than Wald intervals. We discuss how our techniques can be generalized to increase the allowable range of d even further.
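For context, the baseline procedure the abstract refers to is the classical sandwich-estimator-based Wald interval for a linear contrast of the least squares projection parameter. The sketch below is a minimal illustration of that baseline only; it does not implement the paper's bias correction or variance-free inference methods, whose exact forms are not given here, and the function name and toy data are purely illustrative.

```python
import numpy as np
from scipy import stats

def sandwich_wald_interval(X, y, c, alpha=0.05):
    """HC0 sandwich-based Wald interval for the contrast c' beta,
    where beta is the projection parameter argmin_b E[(y - x'b)^2].
    Illustrative baseline only, not the paper's debiased estimator."""
    XtX_inv = np.linalg.inv(X.T @ X)            # (X'X)^{-1}, the "bread"
    beta_hat = XtX_inv @ (X.T @ y)              # least squares estimator
    resid = y - X @ beta_hat                    # residuals
    meat = X.T @ (X * resid[:, None] ** 2)      # X' diag(e^2) X, the "meat"
    V_hat = XtX_inv @ meat @ XtX_inv            # sandwich covariance estimate
    se = np.sqrt(c @ V_hat @ c)                 # standard error of c' beta_hat
    z = stats.norm.ppf(1 - alpha / 2)           # normal critical value
    est = c @ beta_hat
    return est, (est - z * se, est + z * se)

# Toy example: interval for the first coordinate of the projection parameter
rng = np.random.default_rng(0)
n, d = 500, 20
X = rng.standard_normal((n, d))
y = X[:, 0] + rng.standard_normal(n)
est, ci = sandwich_wald_interval(X, y, np.eye(d)[0])
print(est, ci)
```

The classical theory guarantees validity of such intervals only when d grows slower than n^{1/2}; the paper's contribution is to push the allowable dimension to d = o(n^{2/3}) (up to polylogarithmic factors) via a debiased estimator normalized by the same sandwich standard error.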
Keywords
projection parameters, linear regression, inference