Open Access
Variational inference for Dirichlet process mixtures
David M. Blei, Michael I. Jordan
Bayesian Anal. 1(1): 121-143 (March 2006). DOI: 10.1214/06-BA104

Abstract

Dirichlet process (DP) mixture models are the cornerstone of nonparametric Bayesian statistics, and the development of Markov chain Monte Carlo (MCMC) sampling methods for DP mixtures has enabled the application of nonparametric Bayesian methods to a variety of practical data analysis problems. However, MCMC sampling can be prohibitively slow, and it is important to explore alternatives. One class of alternatives is provided by variational methods, a class of deterministic algorithms that convert inference problems into optimization problems (Opper and Saad 2001; Wainwright and Jordan 2003). Thus far, variational methods have mainly been explored in the parametric setting, in particular within the formalism of the exponential family (Attias 2000; Ghahramani and Beal 2001; Blei et al. 2003). In this paper, we present a variational inference algorithm for DP mixtures. We present experiments that compare the algorithm to Gibbs sampling algorithms for DP mixtures of Gaussians and present an application to a large-scale image analysis problem.
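To make concrete the sense in which variational methods convert inference into optimization, the following self-contained LaTeX sketch states the standard evidence lower bound. The notation (data x, latent variables w, variational family q) is generic and illustrative, not taken from the paper, which develops a specific instance of this kind of bound for DP mixtures.

% Minimal sketch of the variational principle: posterior inference over
% latent variables w given data x is replaced by maximizing a lower bound
% L(q) over a tractable family of distributions q. Notation is illustrative.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
\log p(\mathbf{x})
  &= \log \int p(\mathbf{x}, \mathbf{w})\, d\mathbf{w} \\
  &\ge \mathbb{E}_{q(\mathbf{w})}\!\left[\log p(\mathbf{x}, \mathbf{w})\right]
     - \mathbb{E}_{q(\mathbf{w})}\!\left[\log q(\mathbf{w})\right]
  \;\equiv\; \mathcal{L}(q),
\end{align*}
and the gap is exactly a Kullback--Leibler divergence,
\[
\log p(\mathbf{x}) - \mathcal{L}(q)
  = \mathrm{KL}\!\left(q(\mathbf{w}) \,\big\|\, p(\mathbf{w} \mid \mathbf{x})\right),
\]
so maximizing $\mathcal{L}(q)$ over the family $q$ is equivalent to
minimizing the KL divergence from $q$ to the exact posterior.
\end{document}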

Citation


David M. Blei, Michael I. Jordan. "Variational inference for Dirichlet process mixtures." Bayesian Anal. 1(1): 121-143, March 2006. https://doi.org/10.1214/06-BA104

Information

Published: March 2006
First available in Project Euclid: 22 June 2012

zbMATH: 1331.62259
MathSciNet: MR2227367
Digital Object Identifier: 10.1214/06-BA104

Keywords: Bayesian computation, Dirichlet processes, hierarchical models, image processing, variational inference

Rights: Copyright © 2006 International Society for Bayesian Analysis

Vol. 1 • No. 1 • March 2006