Bayesian sparse graphical models and their mixtures using lasso selection priors

Published 3 Oct 2013 in stat.ME (arXiv:1310.1127v1)

Abstract: We propose Bayesian methods for Gaussian graphical models that lead to sparse and adaptively shrunk estimators of the precision (inverse covariance) matrix. Our methods are based on lasso-type regularization priors that yield a parsimonious parameterization of the precision matrix, which is essential in many applications involving learning relationships among the variables. In this context, we introduce a novel type of selection prior that induces a sparse structure on the precision matrix by setting most of its elements exactly to zero, while ensuring positive definiteness -- thus performing model selection and estimation simultaneously. We extend these methods to finite and infinite mixtures of Gaussian graphical models for clustered data using Dirichlet process priors. We discuss appropriate posterior simulation schemes for inference in the proposed models, including the evaluation of normalizing constants that are functions of the parameters of interest and arise from restrictions on the correlation matrix. We evaluate the operating characteristics of our methods via several simulations and in applications to real data sets.
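To illustrate the general idea behind lasso-type priors on a precision matrix, here is a minimal sketch (not the authors' algorithm): a random-walk Metropolis sampler for a 3x3 precision matrix under Laplace (double-exponential, i.e. lasso-type) priors on the off-diagonal entries, with positive definiteness enforced by rejecting non-PD proposals. The shrinkage parameter `lam`, the step size, and the chain lengths are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from a known sparse precision matrix.
true_omega = np.array([[2.0, 0.8, 0.0],
                       [0.8, 2.0, 0.0],
                       [0.0, 0.0, 1.5]])
n = 200
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(true_omega), size=n)
S = X.T @ X  # scatter matrix

lam = 1.0  # lasso-type shrinkage parameter (illustrative)

def log_post(omega):
    """Log posterior: Gaussian log-likelihood plus Laplace prior on off-diagonals."""
    sign, logdet = np.linalg.slogdet(omega)
    if sign <= 0:
        return -np.inf  # reject proposals outside the positive-definite cone
    loglik = 0.5 * n * logdet - 0.5 * np.trace(S @ omega)
    off = omega[np.triu_indices(3, k=1)]
    logprior = -lam * np.abs(off).sum()
    return loglik + logprior

omega = np.eye(3)
cur = log_post(omega)
samples = []
coords = list(zip(*np.triu_indices(3)))  # upper-triangle entries, incl. diagonal

for it in range(5000):
    prop = omega.copy()
    i, j = coords[rng.integers(len(coords))]  # update one entry at a time
    prop[i, j] += 0.1 * rng.standard_normal()
    prop[j, i] = prop[i, j]                   # preserve symmetry
    new = log_post(prop)
    if np.log(rng.random()) < new - cur:      # Metropolis accept/reject
        omega, cur = prop, new
    if it >= 2000:                            # discard burn-in
        samples.append(omega.copy())

post_mean = np.mean(samples, axis=0)
```

Note that this sampler only shrinks off-diagonal entries toward zero; the selection priors proposed in the paper go further by placing positive probability on exact zeros, so that model selection and estimation happen jointly.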
