A Comparison of Two Methods for Obtaining a Collective Posterior Distribution

Springer Proceedings in Mathematics & Statistics (2018)

Abstract
Bayesian inference is a powerful method that allows individuals to update their knowledge about a phenomenon as more information about it becomes available. In this paradigm, before data is observed, an individual expresses his uncertainty about the phenomenon of interest through a prior probability distribution. Then, after data is observed, this distribution is updated using Bayes' theorem. In many situations, however, one wishes to evaluate the knowledge of a group rather than that of a single individual. In this case, information from different sources can be combined by mixing their uncertainty. The mixture can be formed in two ways: before or after the data is observed. Although both approaches yield a collective posterior distribution, the results can be substantially different. In this work, we present several comparisons between these two approaches under noninformative priors and use the Kullback-Leibler divergence to quantify the amount of information gained by each collective distribution.
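The abstract contrasts two routes to a collective posterior: mixing the priors before updating, or updating each individual's prior and then mixing the posteriors. The sketch below illustrates the difference in a conjugate Beta-Bernoulli setting; this toy model, the specific prior parameters, the equal weights, and the data are all illustrative assumptions, not the paper's experiments. Mixing priors first re-scales each component's weight by its marginal likelihood of the data, whereas mixing posteriors keeps the original weights; the Kullback-Leibler divergence between the two collective posteriors is then approximated on a grid.

```python
import numpy as np
from scipy.stats import beta
from scipy.special import betaln, comb

# Hypothetical setup: two individuals with Beta priors on a Bernoulli
# success probability theta, mixed with equal weights.
priors = [(1.0, 1.0), (0.5, 0.5)]   # Beta(a, b): uniform and Jeffreys priors
w = np.array([0.5, 0.5])            # mixing weights
n, k = 20, 14                       # observed data: k successes in n trials

# Log marginal likelihood of the data under each prior (beta-binomial).
log_marg = np.array([
    np.log(comb(n, k)) + betaln(a + k, b + n - k) - betaln(a, b)
    for a, b in priors
])

# Approach 1: mix priors, then apply Bayes' theorem. The collective
# posterior is a mixture of updated Betas whose weights are re-scaled
# by each component's marginal likelihood.
w_post = w * np.exp(log_marg - log_marg.max())
w_post /= w_post.sum()

# Approach 2: update each individual's posterior, then mix with the
# original weights.
grid = np.linspace(1e-6, 1 - 1e-6, 20_000)
post_components = np.array([beta.pdf(grid, a + k, b + n - k) for a, b in priors])

p1 = w_post @ post_components   # mix-then-update collective posterior
p2 = w @ post_components        # update-then-mix collective posterior

# KL(p1 || p2) approximated by numerical integration on the grid.
dx = grid[1] - grid[0]
kl = np.sum(p1 * (np.log(p1) - np.log(p2))) * dx
print(f"KL(mix-then-update || update-then-mix) = {kl:.4f}")
```

In this example the two collective posteriors differ only through the component weights: the data reweight the components under mix-then-update, while update-then-mix leaves the weights untouched, which is precisely why the two distributions can diverge.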
Keywords
Collective posterior distributions, Mixing prior distributions, Mixing posterior distributions, Group decision making, Bayesian inference