Learning Thin Junction Trees via Graph Cuts

AISTATS (2009)

Cited 57 | Views 40
Abstract
Structure learning algorithms usually focus on the compactness of the learned model. However, for general compact models, both exact and approximate inference are still NP-hard, so focusing on compactness alone yields models that require approximate inference and therefore lose prediction quality. In this paper, we propose a method for learning an attractive class of models: bounded-treewidth junction trees, which permit both compact representation of probability distributions and efficient exact inference. Using the Bethe approximation of the likelihood, we transform the problem of finding a good junction tree separator into a minimum cut problem on a weighted graph. Building on this graph-cut intuition, we present an efficient algorithm with theoretical guarantees for finding good separators, which we apply recursively to obtain a thin junction tree. Our extensive empirical evaluation demonstrates the benefit of applying exact inference with our models to answer queries. We also extend our technique to learning low-treewidth conditional random fields, and demonstrate significant improvements over state-of-the-art block-L1 regularization techniques.
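
To make the graph-cut intuition concrete, below is a minimal illustrative sketch in Python (using networkx), not the authors' algorithm: it recursively splits a set of variables with a global minimum cut on a dependence-weighted graph, treats the smaller-side endpoints of the cut edges as a candidate separator, and stops once every cluster fits within a treewidth bound k. The Bethe-approximated likelihood weights and the paper's actual separator search and junction-tree assembly are abstracted away; the edge attribute "weight" is assumed to score pairwise dependence (e.g., empirical mutual information).

```python
# Illustrative sketch only (not the authors' exact algorithm): recursively split
# a set of variables via a global minimum cut on a dependence-weighted graph.
# The 'weight' edge attribute is assumed to score pairwise dependence; the paper
# instead derives these scores from a Bethe approximation of the likelihood.
import networkx as nx


def thin_decomposition(G, k):
    """Recursively partition the variables of G into clusters of size <= k + 1."""
    # Disconnected pieces are independent and can be decomposed separately.
    components = list(nx.connected_components(G))
    if len(components) > 1:
        return [cluster
                for comp in components
                for cluster in thin_decomposition(G.subgraph(comp).copy(), k)]

    nodes = set(G.nodes)
    if len(nodes) <= k + 1:
        return [nodes]  # small enough to be a single junction-tree clique

    # Global minimum cut (Stoer-Wagner): the cheapest way to split the
    # variables into two weakly dependent groups.
    _, (side_a, side_b) = nx.stoer_wagner(G, weight="weight")

    # Candidate separator: smaller-side endpoints of the edges crossing the cut
    # (a simplification of the separator search described in the paper).
    crossing = [(u, v) for u, v in G.edges if (u in side_a) != (v in side_a)]
    small = set(side_a) if len(side_a) <= len(side_b) else set(side_b)
    separator = {x for u, v in crossing for x in (u, v) if x in small}

    # Recurse on each side together with the separator so that neighbouring
    # clusters overlap on it, in the spirit of a junction tree's separators.
    clusters = []
    for side in (side_a, side_b):
        part = set(side) | separator
        if part == nodes:  # degenerate split: drop the overlap to guarantee progress
            part = set(side)
        clusters.extend(thin_decomposition(G.subgraph(part).copy(), k))
    return clusters


if __name__ == "__main__":
    # Toy example: a ring of six variables with one weak long-range dependence.
    G = nx.Graph()
    G.add_weighted_edges_from([(i, i + 1, 1.0) for i in range(5)] + [(5, 0, 0.05)])
    print(thin_decomposition(G, k=2))
```

The sketch returns overlapping variable clusters of bounded size rather than a full junction tree with calibrated potentials; it is meant only to show how a global min cut turns "find a weakly dependent split" into a tractable graph problem.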
Keywords
minimum cut, graph cut, conditional random field, probability distribution