Maximum entropy derived and generalized under idempotent probability to address Bayes-frequentist uncertainty and model revision uncertainty: An information-theoretic semantics for possibility theory

Fuzzy Sets Syst. (2023)

Abstract
Typical statistical methods of data analysis only handle determinate uncertainty, the type of uncertainty that can be modeled under the Bayesian or confidence theories of inference. An example of indeterminate uncertainty is uncertainty about whether the Bayesian theory or the frequentist theory is better suited to the problem at hand. Another example is uncertainty about how to modify a Bayesian model upon learning that its prior is inadequate. Both problems of indeterminate uncertainty have solutions under the proposed framework. The framework is based on an information-theoretic definition of an incoherence function to be minimized. It generalizes the principle of choosing an estimate that minimizes the reverse relative entropy between it and a previous posterior distribution such as a confidence distribution. The simplest form of the incoherence function, called the incoherence distribution, is a min-plus probability distribution, which is equivalent to a possibility distribution rather than a measure-theoretic probability distribution. A simple case of minimizing the incoherence leads to a generalization of minimizing relative entropy and thus of maximizing entropy. The framework of minimum incoherence is applied to problems of Bayesian-confidence uncertainty and to parallel problems of indeterminate uncertainty about model revision. © 2022 Elsevier B.V. All rights reserved.
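Two of the ideas in the abstract can be sketched concretely: choosing an estimate that minimizes the reverse relative entropy KL(q ‖ p) to a previous posterior p, and passing from a probability distribution to a possibility (min-plus) distribution. The sketch below is illustrative only and does not reproduce the paper's incoherence function; the candidate family and the max-normalization probability-to-possibility transform are common textbook choices, not the paper's own constructions.

```python
import numpy as np

def reverse_kl(q, p):
    """Reverse relative entropy KL(q || p) for discrete distributions."""
    q, p = np.asarray(q, dtype=float), np.asarray(p, dtype=float)
    mask = q > 0  # by convention, 0 * log(0/p) = 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

# A previous posterior over three hypotheses
# (e.g. a confidence distribution), chosen for illustration.
posterior = np.array([0.5, 0.3, 0.2])

# A restricted family of candidate estimates (hypothetical).
candidates = [
    np.array([0.6, 0.3, 0.1]),
    np.array([0.5, 0.25, 0.25]),
    np.array([1 / 3, 1 / 3, 1 / 3]),
]

# Pick the candidate closest to the posterior in reverse relative entropy.
best = min(candidates, key=lambda q: reverse_kl(q, posterior))

# One common probability-to-possibility transform: max-normalization,
# yielding a possibility distribution with supremum 1. The paper's
# min-plus construction may differ.
possibility = posterior / posterior.max()
```

Here `best` is the family member with the smallest KL(q ‖ posterior), and `possibility` is a normalized possibility distribution whose largest value is 1, the idempotent analogue of a probability distribution summing to 1.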
Keywords
Bayes-frequentist continuum, Bayesian model checking, Blended inference, Coding theory, Fiducial inference, Information theory, Kullback-Leibler divergence, Possibility theory