Optimal post-selection inference for sparse signals: a nonparametric empirical Bayes approach

arXiv (2022)

Citations: 12 | Views: 10
Abstract
Many recently developed Bayesian methods focus on sparse signal detection. However, much less work has been done on the natural follow-up question: how does one make valid inferences about the magnitudes of those signals after selection? Ordinary Bayesian credible intervals suffer from selection bias, as do ordinary frequentist confidence intervals. Existing Bayesian methods for correcting this bias produce credible intervals with poor frequentist properties, and existing frequentist approaches sacrifice the shrinkage typical of Bayesian methods, resulting in confidence intervals that are needlessly wide. We address this gap by proposing a nonparametric empirical Bayes approach to constructing optimal selection-adjusted confidence sets. Our method produces confidence sets that are as short as possible on average while both adjusting for selection and maintaining exact frequentist coverage uniformly over the parameter space. We demonstrate an important consistency property of our procedure: under mild conditions, it asymptotically converges to the results of an oracle-Bayes analysis in which the prior distribution of signal sizes is known exactly. Across a series of examples, the method outperforms existing frequentist techniques for post-selection inference, producing confidence sets that are notably shorter while providing the same coverage guarantee.
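
As a rough illustration of the selection-bias problem the abstract describes (not the paper's method), the short Python simulation below checks the coverage of ordinary confidence intervals in a sparse normal-means model after selecting apparently large effects. The sparsity level (5%), the N(0, 2^2) signal distribution, and the |x| > 3 selection rule are all illustrative assumptions.

# Minimal sketch under assumed settings, not the paper's procedure:
# sparse normal means, selection of large observations, and the
# coverage of naive (unadjusted) 95% intervals among selected units.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                        # number of means
z = 1.959964                       # 97.5% standard-normal quantile

# Illustrative sparse prior: 5% of signals drawn from N(0, 2^2), the rest zero.
theta = np.where(rng.random(n) < 0.05, rng.normal(0.0, 2.0, size=n), 0.0)
x = theta + rng.normal(0.0, 1.0, size=n)   # one N(theta_i, 1) observation each

selected = np.abs(x) > 3.0         # illustrative selection rule
covered = np.abs(x - theta) <= z   # does the interval x_i +/- z cover theta_i?

print(f"coverage over all coordinates:      {covered.mean():.3f}")
print(f"coverage over selected coordinates: {covered[selected].mean():.3f}")

The first number is near the nominal 0.95, while the second falls well below it, because selected observations are biased away from their true means. The paper's proposal replaces these naive intervals with selection-adjusted confidence sets that restore exact coverage among selected coordinates while using an empirical-Bayes estimate of the signal-size prior to keep the sets as short as possible on average.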
Keywords
Biased test, Coverage, Post-selection inference, Selection bias, Shrinkage