How Certain are You of Your Minimum AIC or BIC Values?

I.M.L. Nadeesha Jayaweera, A. Alexandre Trindade

Sankhya A (2024)

Abstract
In choosing a candidate model in likelihood-based inference by minimizing an information criterion, the practitioner is often faced with the difficult task of deciding how far up the ranked list to look. Motivated by this pragmatic necessity, we derive an approximation to the quantiles of a generalized (model selection) information criterion (ZIC), defined as a criterion whose limit in probability is identical to that of the normalized log-likelihood, and which includes common special cases such as AIC and BIC. The method starts from the joint asymptotic normality of the ZIC values, and proceeds by deriving the (asymptotically) exact distribution of the minimum, which can be efficiently (numerically) computed. High quantiles can then be obtained by inverting this distribution function, resulting in what we call a certainty envelope (CE) of plausible models, intended to provide a heuristic upper bound on the location of the actual minimum. The theory is established for three data settings of perennial classical interest: (i) independent and identically distributed, (ii) regression, and (iii) time series. The development in the latter two cases invokes Lindeberg-Feller type conditions for, respectively, normalized sums of conditional distributions and normalized quadratic forms in the observations. The performance of the methodology is examined on simulated data by assessing CE nominal coverage probabilities and comparing them to the bootstrap. Both approaches give coverages close to nominal for large samples, but the bootstrap is on average two orders of magnitude slower. Finally, we hint at the possibility of producing confidence intervals for individual parameters by pivoting the distribution of the minimum ZIC, thus naturally accounting for post-model selection uncertainty.
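The certainty-envelope idea can be illustrated with a simple Monte Carlo sketch: treat the candidate models' ZIC values as jointly normal (per the asymptotic result summarized above), estimate a high quantile of the minimum by simulation rather than by the paper's exact numerical computation, and flag every model whose criterion value falls at or below that quantile. The mean vector `mu` and covariance `Sigma` below are hypothetical placeholders, not values from the paper.

```python
# Monte Carlo sketch of a "certainty envelope" (CE) of plausible models.
# Assumes the K candidate models' ZIC values are jointly normal, as in the
# abstract's asymptotic result; mu and Sigma here are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
K = 5
mu = np.array([10.0, 10.5, 11.0, 12.0, 13.0])  # hypothetical mean ZIC values
Sigma = 0.5 * np.eye(K) + 0.5                  # hypothetical equicorrelated covariance

# Simulate joint ZIC vectors and record the minimum of each draw,
# approximating the distribution of the minimum ZIC.
draws = rng.multivariate_normal(mu, Sigma, size=100_000)
min_draws = draws.min(axis=1)

# Invert the (empirical) distribution of the minimum at a high level 1 - alpha.
alpha = 0.05
q = np.quantile(min_draws, 1 - alpha)

# CE: indices of models whose observed criterion value is at or below q.
zic_observed = mu  # stand-in for the observed ZIC values of the K models
ce = np.flatnonzero(zic_observed <= q)
print(f"upper quantile of minimum ZIC: {q:.3f}; CE models: {ce}")
```

The envelope is interpreted as in the abstract: a heuristic upper bound on where the true minimum could plausibly sit, so any model inside the CE remains a credible choice rather than only the single ranked-first model.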
Keywords
Maximum likelihood, model selection, Kullback-Leibler discrepancy, asymptotic normality, post-model selection inference, Primary 62F12, Secondary 62F40