Scaling It Up: Stochastic Search Structure Learning in Graphical Models

Hao Wang

2015 · DOI: 10.1214/14-BA916
120 citations

TLDR

A new framework for structure learning is proposed, based on continuous spike-and-slab priors, that uses latent variables to identify graphs and efficiently handles problems with hundreds of variables.

Abstract

Gaussian concentration graph models and covariance graph models are two classes of graphical models that are useful for uncovering latent dependence structures among multivariate variables. In the Bayesian literature, graphs are often determined through the use of priors over the space of positive definite matrices with fixed zeros, but these methods present daunting computational burdens in large problems. Motivated by the superior computational efficiency of continuous shrinkage priors for regression analysis, we propose a new framework for structure learning that is based on continuous spike and slab priors and uses latent variables to identify graphs. We discuss model specification, computation, and inference for both concentration and covariance graph models. The new approach produces reliable estimates of graphs and efficiently handles problems with hundreds of variables.
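To make the core idea concrete, the following is a minimal sketch of a continuous spike-and-slab prior with latent edge indicators, not the paper's exact model specification: each off-diagonal entry of a symmetric matrix is drawn from a tight "spike" normal (near zero, no edge) or a diffuse "slab" normal (edge present), with a latent Bernoulli indicator identifying the graph. The function name, parameter names, and default variances here are illustrative assumptions.

```python
import numpy as np

def sample_spike_slab_entries(p, pi=0.3, v0=0.01, v1=1.0, seed=None):
    """Illustrative sketch (not the paper's exact prior): draw the
    off-diagonal entries of a symmetric p x p matrix from a continuous
    spike-and-slab mixture. A latent indicator z[i, j] ~ Bernoulli(pi)
    identifies the graph: z[i, j] = 1 means edge (i, j) is present and
    the entry is N(0, v1) (slab); z[i, j] = 0 gives N(0, v0) (spike,
    concentrated near zero), so the entry is shrunk toward zero without
    being constrained to exactly zero."""
    rng = np.random.default_rng(seed)
    z = np.zeros((p, p), dtype=int)
    omega = np.zeros((p, p))
    for i in range(p):
        for j in range(i + 1, p):
            zij = rng.random() < pi          # latent edge indicator
            sd = np.sqrt(v1 if zij else v0)  # slab vs. spike scale
            val = rng.normal(0.0, sd)
            z[i, j] = z[j, i] = int(zij)
            omega[i, j] = omega[j, i] = val
    return omega, z

omega, z = sample_spike_slab_entries(5, pi=0.3, seed=0)
```

In a full sampler along these lines, the indicators z would be updated jointly with the matrix entries, and the posterior over z yields the graph estimate; the continuity of both mixture components is what allows the efficient block updates that avoid the fixed-zero positive-definite constraint.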