Stable Distillation and High-Dimensional Hypothesis Testing
arXiv (2022)
Abstract
While powerful methods have been developed for high-dimensional hypothesis
testing assuming orthogonal parameters, current approaches struggle to
generalize to the more common non-orthogonal case. We propose Stable
Distillation (SD), a simple paradigm for iteratively extracting independent
pieces of information from observed data, assuming a parametric model. When
applied to hypothesis testing for large regression models, SD orthogonalizes
the effect estimates of non-orthogonal predictors by judiciously introducing
noise into the observed outcomes vector, yielding mutually independent p-values
across predictors. Simulations and a real regression example using US campaign
contributions show that SD yields a scalable approach for non-orthogonal
designs that exceeds or matches the power of existing methods against sparse
alternatives. While we only present explicit SD algorithms for hypothesis
testing in ordinary least squares and logistic regression, we provide general
guidance for deriving and improving the power of SD procedures.