Select Lab Publications

Learning Thin Junction Trees via Graph Cuts (2009)

By: Dafna Shahaf, Anton Chechetka, and Carlos Guestrin

Abstract: Structure learning algorithms usually focus on the compactness of the learned model. However, for general compact models, both exact and approximate inference remain NP-hard. Focusing on compactness alone therefore leads to models that require approximate inference techniques, which reduces their prediction quality. In this paper, we propose a method for learning an attractive class of models: bounded-treewidth junction trees, which permit both compact representation of probability distributions and efficient exact inference.

Using the Bethe approximation of the likelihood, we transform the problem of finding a good junction tree separator into a minimum cut problem on a weighted graph. Guided by this graph-cut intuition, we present an efficient algorithm with theoretical guarantees for finding good separators, which we apply recursively to obtain a thin junction tree. Our extensive empirical evaluation demonstrates the benefit of applying exact inference with our models to answer queries. We also extend our technique to learning low-treewidth conditional random fields, and demonstrate significant improvements over state-of-the-art block-L1 regularization techniques.
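The core subroutine the abstract refers to is a minimum cut on a weighted graph. As an illustrative sketch only (not the paper's algorithm: the graph, node names, and edge weights below are invented stand-ins, whereas the paper derives its weights from the Bethe approximation of the likelihood), here is a plain-Python s-t minimum cut computed via Edmonds-Karp max-flow; the nodes on the source side of the cut play the role of one partition, and the crossing edges suggest a candidate separator:

```python
from collections import defaultdict, deque

def min_cut(capacity, source, sink):
    """Edmonds-Karp max-flow on an undirected weighted graph.
    Returns (cut_value, source_side), where source_side is the set of
    nodes reachable from `source` in the final residual graph; edges
    leaving that set form a minimum s-t cut."""
    # Build residual capacities; an undirected edge gets capacity both ways.
    residual = defaultdict(lambda: defaultdict(float))
    for u in capacity:
        for v, c in capacity[u].items():
            residual[u][v] += c
            residual[v][u] += c
    flow = 0.0
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in residual[u].items():
                if c > 1e-12 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            break  # no augmenting path left; flow is maximal
        # Recover the path and its bottleneck capacity.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
    # Nodes still reachable from source = source side of the min cut.
    side, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v, c in residual[u].items():
            if c > 1e-12 and v not in side:
                side.add(v)
                queue.append(v)
    return flow, side

# Hypothetical pairwise weights (stand-ins for Bethe-derived scores):
weights = {
    'A': {'B': 3.0, 'C': 1.0},
    'B': {'C': 1.0},
    'C': {'D': 3.0},
}
value, side = min_cut(weights, 'A', 'D')
# Cutting the weak A-C and B-C edges (total weight 2.0) separates
# {A, B} from {C, D}.
```

The same routine, applied recursively to each side of the cut, mirrors the high-level recursion the abstract describes for building a thin junction tree.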

Download Information
Dafna Shahaf, Anton Chechetka, and Carlos Guestrin (2009). "Learning Thin Junction Trees via Graph Cuts." In Artificial Intelligence and Statistics (AISTATS).
BibTeX citation

@inproceedings{aistats2009-shahaf-chechetka-guestrin,
  title = {Learning Thin Junction Trees via Graph Cuts},
  author = {Dafna Shahaf and Anton Chechetka and Carlos Guestrin},
  booktitle = {Artificial Intelligence and Statistics (AISTATS)},
  year = {2009},
  month = {April},
  address = {Clearwater Beach, Florida},
  wwwfilebase = {aistats2009-shahaf-chechetka-guestrin},
  wwwtopic = {Structure Learning}
}
