Open Access
Simple approximate MAP inference for Dirichlet processes mixtures
Yordan P. Raykov, Alexis Boukouvalas, Max A. Little
Electron. J. Statist. 10(2): 3548-3578 (2016). DOI: 10.1214/16-EJS1196

Abstract

The Dirichlet process mixture model (DPMM) is a ubiquitous, flexible Bayesian nonparametric statistical model. However, full probabilistic inference in this model is analytically intractable, so that computationally intensive techniques such as Gibbs sampling are required. As a result, DPMM-based methods, which have considerable potential, are restricted to applications in which computational resources and time for inference are plentiful. For example, they would not be practical for digital signal processing on embedded hardware, where computational resources are at a serious premium. Here, we develop a simplified yet statistically rigorous approximate maximum a-posteriori (MAP) inference algorithm for DPMMs. This algorithm is as simple as DP-means clustering, solves the MAP problem as well as Gibbs sampling, while requiring only a fraction of the computational effort. (For freely available code that implements the MAP-DP algorithm for Gaussian mixtures see http://www.maxlittle.net/.) Unlike related small variance asymptotics (SVA), our method is non-degenerate and so inherits the "rich get richer" property of the Dirichlet process. It also retains a non-degenerate closed-form likelihood which enables out-of-sample calculations and the use of standard tools such as cross-validation. We illustrate the benefits of our algorithm on a range of examples and contrast it to variational, SVA and sampling approaches from both a computational complexity perspective as well as in terms of clustering performance. We demonstrate the wide applicability of our approach by presenting an approximate MAP inference method for the infinite hidden Markov model whose performance contrasts favorably with a recently proposed hybrid SVA approach.
Similarly, we show how our algorithm can be applied to a semiparametric mixed-effects regression model where the random effects distribution is modelled using an infinite mixture model, as used in longitudinal progression modelling in population health science. Finally, we propose directions for future research on approximate MAP inference in Bayesian nonparametrics.
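To give a concrete flavour of the kind of algorithm the abstract describes, the sketch below implements a simplified hard-assignment clustering pass in the spirit of MAP-DP: each point is assigned to the cluster minimising a negative-log-posterior cost that combines a Gaussian likelihood term with a log cluster-size term, so larger clusters are favoured ("rich get richer"), while opening a new cluster incurs a -log(alpha) penalty from the Dirichlet process prior. This is an illustrative simplification under assumed spherical Gaussian clusters with known variance, not the authors' exact derivation; the function name, the constant new-cluster cost, and all parameter defaults are assumptions for illustration.

```python
import numpy as np

def map_dp_sketch(X, alpha=1e-4, sigma2=1.0, n_iter=10):
    """Simplified MAP-DP-style hard clustering (illustrative sketch only).

    Assumptions: spherical Gaussian clusters with known variance sigma2;
    the cost of opening a new cluster is reduced to -log(alpha), dropping
    the prior-predictive likelihood term of the full derivation.
    """
    n, _ = X.shape
    z = np.zeros(n, dtype=int)              # start with all points in one cluster
    for _ in range(n_iter):
        for i in range(n):
            labels = np.unique(np.delete(z, i))  # clusters, excluding point i
            costs = []
            for k in labels:
                mask = (z == k)
                mask[i] = False              # leave point i out of its own stats
                members = X[mask]
                mu = members.mean(axis=0)
                nk = len(members)
                # Gaussian negative log-likelihood + "rich get richer" prior term
                costs.append(np.sum((X[i] - mu) ** 2) / (2 * sigma2) - np.log(nk))
            costs.append(-np.log(alpha))     # cost of opening a new cluster
            best = int(np.argmin(costs))
            z[i] = labels[best] if best < len(labels) else labels.max() + 1
        _, z = np.unique(z, return_inverse=True)  # relabel to consecutive ints
    return z
```

On well-separated data the sweep quickly peels off distinct clusters, with the number of clusters chosen by the data via the alpha penalty rather than fixed in advance; smaller alpha makes new clusters more expensive to open.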

Citation

Yordan P. Raykov. Alexis Boukouvalas. Max A. Little. "Simple approximate MAP inference for Dirichlet processes mixtures." Electron. J. Statist. 10 (2) 3548 - 3578, 2016. https://doi.org/10.1214/16-EJS1196

Information

Received: 1 May 2016; Published: 2016
First available in Project Euclid: 16 November 2016

zbMATH: 1357.62227
MathSciNet: MR3572859
Digital Object Identifier: 10.1214/16-EJS1196

Subjects:
Primary: 62F15
Secondary: 62G86

Keywords: Bayesian nonparametrics, clustering, Gaussian mixture model

Rights: Copyright © 2016 The Institute of Mathematical Statistics and the Bernoulli Society
