Select Lab Publications


Efficient Principled Learning of Thin Junction Trees (2007)

By: Anton Chechetka and Carlos Guestrin

Abstract: We present the first truly polynomial algorithm for PAC-learning the structure of bounded-treewidth junction trees, an attractive subclass of probabilistic graphical models that permits both compact representation of probability distributions and efficient exact inference. For a constant treewidth, our algorithm has polynomial time and sample complexity. If a junction tree with sufficiently strong intra-clique dependencies exists, we provide strong theoretical guarantees in terms of the KL divergence of the result from the true distribution. We also present a lazy extension of our approach that leads to very significant speedups in practice, and demonstrate the viability of our method empirically on several real-world datasets.

One of our key new theoretical insights is a method for bounding the conditional mutual information of arbitrarily large sets of variables with only polynomially many mutual information computations on fixed-size subsets of variables, if the underlying distribution can be approximated by a bounded-treewidth junction tree.
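To make the fixed-size computations concrete, here is a small illustrative sketch (not the paper's algorithm; all names are ours) of the basic primitive the insight relies on: estimating empirical mutual information between two small sets of variables from samples, and enumerating the polynomially many size-k subsets on which such computations would be performed.

```python
# Illustrative sketch only: empirical mutual information between two
# small variable sets, estimated from samples. The paper bounds
# conditional MI of large sets using only such fixed-size computations.
from collections import Counter
from itertools import combinations
from math import log

def empirical_mi(samples, set_a, set_b):
    """Empirical mutual information I(A; B) in nats.

    samples: list of tuples of variable values.
    set_a, set_b: tuples of variable indices (small, fixed size).
    """
    n = len(samples)
    joint, marg_a, marg_b = Counter(), Counter(), Counter()
    for s in samples:
        a = tuple(s[i] for i in set_a)
        b = tuple(s[i] for i in set_b)
        joint[(a, b)] += 1
        marg_a[a] += 1
        marg_b[b] += 1
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        # p_ab * log(p_ab / (p_a * p_b)), rearranged to avoid divisions
        mi += p_ab * log(p_ab * n * n / (marg_a[a] * marg_b[b]))
    return mi

def fixed_size_subsets(n_vars, k):
    """All size-k variable subsets: C(n, k) = O(n^k) of them,
    i.e. polynomially many for constant k."""
    return list(combinations(range(n_vars), k))
```

For example, with samples where variable 0 always equals variable 1 but is independent of variable 2, `empirical_mi` recovers ln 2 nats for the dependent pair and 0 for the independent one.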

Download Information
Anton Chechetka and Carlos Guestrin (2007). "Efficient Principled Learning of Thin Junction Trees." In Advances in Neural Information Processing Systems (NIPS).
BibTeX citation

@inproceedings{Chechetka+Guestrin:nips07tjtpac,
title = {Efficient Principled Learning of Thin Junction Trees},
author = {Anton Chechetka and Carlos Guestrin},
booktitle = {Advances in Neural Information Processing Systems (NIPS)},
year = 2007,
month = {December},
address = {Vancouver, Canada},
wwwfilebase = {nips2007-chechetka-guestrin},
wwwtopic = {Structure Learning}
}


