Variational Inference for Generalized Linear Mixed Models Using Partially Noncentered Parametrizations
Linda S. L. Tan, David J. Nott
Statist. Sci. 28(2): 168-188 (May 2013). DOI: 10.1214/13-STS418


The effects of different parametrizations on the convergence of Bayesian computational algorithms for hierarchical models are well explored. Techniques such as centering, noncentering and partial noncentering can be used to accelerate convergence in MCMC and EM algorithms but are still not well studied for variational Bayes (VB) methods. As a fast deterministic approach to posterior approximation, VB is attracting increasing interest due to its suitability for large high-dimensional data. Use of different parametrizations for VB has not only computational but also statistical implications, as different parametrizations are associated with different factorized posterior approximations. We examine the use of partially noncentered parametrizations in VB for generalized linear mixed models (GLMMs). Our paper makes four contributions. First, we show how to implement an algorithm called nonconjugate variational message passing for GLMMs. Second, we show that the partially noncentered parametrization can adapt to the quantity of information in the data and determine a parametrization close to optimal. Third, we show that partial noncentering can accelerate convergence and produce more accurate posterior approximations than centering or noncentering. Finally, we demonstrate how the variational lower bound, produced as part of the computation, can be useful for model selection.
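As a standard illustration of the parametrizations the abstract refers to (a generic sketch from the hierarchical-models literature, not an excerpt from the paper itself), consider a one-way normal random effects model. The partially noncentered parametrization interpolates between the centered and noncentered forms via a weight $w$:

```latex
% Centered parametrization (CP): random effects drawn around the mean
y_{ij} \mid \eta_i \sim N(\eta_i, \sigma^2_e), \qquad \eta_i \sim N(\mu, \sigma^2_\eta).

% Noncentered parametrization (NCP): random effects a priori independent of mu
\eta_i = \mu + b_i, \qquad b_i \sim N(0, \sigma^2_\eta).

% Partially noncentered parametrization with weight w in [0, 1]:
\tilde{\eta}_i = \eta_i - w\mu, \qquad
\tilde{\eta}_i \sim N\bigl((1 - w)\mu, \sigma^2_\eta\bigr),
% which recovers CP at w = 0 and NCP at w = 1. Intermediate w lets the
% parametrization adapt to how much information the data carry about eta_i.
```

In this notation, the paper's contribution is to let the variational algorithm choose $w$ (rather than fixing it at 0 or 1), so that the factorized posterior approximation is close to optimal for the information actually present in the data.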




Published: May 2013
First available in Project Euclid: 21 May 2013

zbMATH: 1331.62167
MathSciNet: MR3112404
Digital Object Identifier: 10.1214/13-STS418

Rights: Copyright © 2013 Institute of Mathematical Statistics

