Model Selection for Mixtures of Factor Analyzers via Hierarchical BIC

semanticscholar (2013)

Cited by 5 | Views 0
Abstract
The Bayesian information criterion (BIC) is a common model selection criterion for mixtures of factor analyzers (MFA). However, BIC implausibly penalizes each factor analyzer using the whole sample size. In this paper, we propose a new criterion for MFA called the hierarchical BIC (H-BIC). Formally, the main difference from BIC is that H-BIC penalizes each factor analyzer using only its own effective sample size. Theoretically, we show that H-BIC is a large-sample approximation of the variational Bayesian (VB) lower bound and that BIC is a further approximation of H-BIC. Additionally, to apply H-BIC efficiently, we propose a novel algorithm that does not use H-BIC to choose among a set of candidate models with different latent dimensions; instead, it integrates the determination of the latent dimensions into the parameter estimation for a given number of components. Consequently, this algorithm only requires choosing from a much smaller set of models with different numbers of components. Experiments on a number of synthetic and real data sets reveal that (i) H-BIC is more accurate than BIC and several existing competing methods, and (ii) the proposed algorithm is much more efficient than the one usually used for BIC.
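The abstract does not spell out the exact form of the criterion, but the key idea is that component-specific parameters should be penalized with that component's effective sample size rather than the full sample size. The following Python sketch illustrates this difference under the assumption that the effective sample size of component k is the sum of its posterior responsibilities; the function names, parameter counts, and the higher-is-better sign convention are illustrative choices, not the paper's definitions.

```python
import numpy as np

def bic(log_likelihood, n_params, n_samples):
    """Standard BIC (higher is better): every parameter is penalized with log(n)."""
    return 2.0 * log_likelihood - n_params * np.log(n_samples)

def hbic(log_likelihood, shared_params, component_params, responsibilities):
    """Hedged sketch of a hierarchical BIC.

    Parameters specific to component k are penalized with log(n_k), where
    n_k = sum_i r_ik is assumed to be that component's effective sample size
    (the column sum of posterior responsibilities). Parameters shared across
    components keep the usual log(n) penalty.
    """
    n = responsibilities.shape[0]
    n_k = responsibilities.sum(axis=0)                      # effective sample size per component
    penalty = shared_params * np.log(n)
    penalty += np.sum(np.asarray(component_params) * np.log(n_k))
    return 2.0 * log_likelihood - penalty

# Hypothetical usage: a 3-component MFA fit with 31 parameters per component
# and 2 shared parameters, evaluated on 500 observations.
rng = np.random.default_rng(0)
resp = rng.dirichlet(np.ones(3), size=500)                  # 500 x 3 posterior responsibilities
print(bic(-1200.0, n_params=95, n_samples=500))
print(hbic(-1200.0, shared_params=2, component_params=[31, 31, 31], responsibilities=resp))
```

Because log(n_k) <= log(n), this per-component penalty is never harsher than BIC's, which is consistent with the abstract's claim that BIC over-penalizes each factor analyzer.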