Open Access
Consistency of variational Bayes inference for estimation and model selection in mixtures
Badr-Eddine Chérief-Abdellatif, Pierre Alquier
Electron. J. Statist. 12(2): 2995-3035 (2018). DOI: 10.1214/18-EJS1475


Mixture models are widely used in Bayesian statistics and machine learning, in particular in computational biology, natural language processing and many other fields. Variational inference, a technique for approximating intractable posteriors by means of optimization algorithms, is extremely popular in practice when dealing with complex models such as mixtures. The contribution of this paper is twofold. First, we study the concentration of variational approximations of posteriors, which is still an open problem for general mixtures, and we derive consistency and rates of convergence. We also tackle the problem of model selection for the number of components: we study the approach already used in practice, which consists in maximizing a numerical criterion, the evidence lower bound (ELBO). We prove that this strategy indeed leads to strong oracle inequalities. We illustrate our theoretical results by applications to Gaussian and multinomial mixtures.
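The model-selection strategy analyzed in the paper, fitting a variational approximation for each candidate number of components and keeping the one with the largest ELBO, can be sketched as follows. This is only an illustration of the general idea, not the authors' implementation: it uses scikit-learn's `BayesianGaussianMixture` (variational inference for Gaussian mixtures) and its `lower_bound_` attribute as the numerical criterion, and the data are synthetic.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: two well-separated 1-d Gaussian clusters.
X = np.concatenate([rng.normal(-4.0, 1.0, (200, 1)),
                    rng.normal(4.0, 1.0, (200, 1))])

# Fit a variational Gaussian mixture for each candidate number of
# components K, and record the value of the variational lower bound
# (scikit-learn's per-sample ELBO) attained at convergence.
elbos = {}
for k in range(1, 5):
    vb = BayesianGaussianMixture(n_components=k, max_iter=500,
                                 random_state=0).fit(X)
    elbos[k] = vb.lower_bound_

# Select the number of components maximizing the criterion.
best_k = max(elbos, key=elbos.get)
print("ELBO per K:", elbos)
print("selected K:", best_k)
```

Note that scikit-learn's `lower_bound_` is a per-sample variational bound and its default prior is a Dirichlet-process weight prior, so this is a loose practical analogue of the criterion studied in the paper rather than the exact object for which the oracle inequalities are proved.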



Badr-Eddine Chérief-Abdellatif, Pierre Alquier. "Consistency of variational Bayes inference for estimation and model selection in mixtures." Electron. J. Statist. 12(2): 2995-3035, 2018.


Received: 1 May 2018; Published: 2018
First available in Project Euclid: 19 September 2018

zbMATH: 06942964
MathSciNet: MR3855643
Digital Object Identifier: 10.1214/18-EJS1475

Primary: 62F12
Secondary: 62F15, 62F35, 65C60

Keywords: frequentist evaluation of Bayesian methods, mixture models, model selection, variational approximations
