Open Access
Variational Bayesian inference with Gaussian-mixture approximations
O. Zobay
Electron. J. Statist. 8(1): 355-389 (2014). DOI: 10.1214/14-EJS887


Variational Bayesian inference with a Gaussian posterior approximation provides an alternative to the more commonly employed factorization approach and enlarges the range of tractable distributions. In this paper, we propose an extension to the Gaussian approach which uses Gaussian mixtures as approximations. A general problem for variational inference with mixtures is posed by the calculation of the entropy term in the Kullback-Leibler distance, which becomes analytically intractable. We deal with this problem by using a simple lower bound for the entropy and imposing restrictions on the form of the Gaussian covariance matrix. In this way, efficient numerical calculations become possible. To illustrate the method, we discuss its application to an isotropic generalized normal target density, a non-Gaussian state space model, and the Bayesian lasso. For heavy-tailed distributions, the examples show that the mixture approach indeed leads to improved approximations in the sense of a reduced Kullback-Leibler distance. From a more practical point of view, mixtures can improve estimates of posterior marginal variances. Furthermore, they provide an initial estimate of posterior skewness, which a single Gaussian cannot capture. We also discuss general sufficient conditions under which mixtures are guaranteed to provide improvements over single-component approximations.
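The intractable entropy term mentioned in the abstract admits a simple analytic lower bound via Jensen's inequality: for a mixture q(x) = Σ_j w_j N(x; μ_j, σ_j²), one has H[q] ≥ −Σ_i w_i log Σ_j w_j N(μ_i; μ_j, σ_i² + σ_j²), since the pairwise Gaussian overlap integrals are available in closed form. The sketch below (an illustrative assumption about the kind of bound the paper refers to, not the paper's own implementation) computes this bound for a one-dimensional two-component mixture and checks it against a Monte Carlo estimate of the true entropy; all function names are hypothetical.

```python
import numpy as np

def normal_pdf(x, mu, var):
    """Density of N(mu, var) evaluated at x."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def mixture_entropy_lower_bound(weights, means, variances):
    """Jensen lower bound on the entropy of a 1-D Gaussian mixture:
    H[q] >= -sum_i w_i log sum_j w_j N(mu_i; mu_j, var_i + var_j)."""
    w = np.asarray(weights)
    mu = np.asarray(means)
    v = np.asarray(variances)
    # z[i, j] = N(mu_i; mu_j, var_i + var_j): closed-form pairwise overlaps
    z = normal_pdf(mu[:, None], mu[None, :], v[:, None] + v[None, :])
    return -np.sum(w * np.log(z @ w))

def mixture_entropy_mc(weights, means, variances, n=200_000, seed=0):
    """Monte Carlo estimate of the true mixture entropy, for comparison."""
    rng = np.random.default_rng(seed)
    comp = rng.choice(len(weights), size=n, p=weights)
    x = rng.normal(np.asarray(means)[comp], np.sqrt(np.asarray(variances))[comp])
    q = sum(w * normal_pdf(x, m, v)
            for w, m, v in zip(weights, means, variances))
    return -np.mean(np.log(q))

# Equal mixture of N(-2, 1) and N(2, 1)
w, mu, var = [0.5, 0.5], [-2.0, 2.0], [1.0, 1.0]
lb = mixture_entropy_lower_bound(w, mu, var)
mc = mixture_entropy_mc(w, mu, var)
# The bound never exceeds the true entropy (up to Monte Carlo noise),
# and it is what makes mixture-based variational objectives tractable.
```

Because the bound is exact for a single component and tightens as components separate, maximizing it in place of the true entropy keeps the variational objective computable while still rewarding multimodal approximations.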




Published: 2014
First available in Project Euclid: 18 April 2014

zbMATH: 1294.62053
MathSciNet: MR3195120
Digital Object Identifier: 10.1214/14-EJS887

Primary: 62F15
Secondary: 62E17

Rights: Copyright © 2014 The Institute of Mathematical Statistics and the Bernoulli Society

