Open Access
Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors
Xin Wang, Vivekananda Roy
Electron. J. Statist. 12(2): 4412-4439 (2018). DOI: 10.1214/18-EJS1506

Abstract

In this article, we consider Markov chain Monte Carlo (MCMC) algorithms for exploring the intractable posterior density associated with Bayesian probit linear mixed models under improper priors on the regression coefficients and variance components. In particular, we construct a two-block Gibbs sampler using data augmentation (DA) techniques. Furthermore, we prove geometric ergodicity of the Gibbs sampler, which is the foundation for establishing central limit theorems for MCMC-based estimators and subsequent inferences. The conditions for geometric convergence are similar to those guaranteeing posterior propriety. We also provide conditions for the propriety of posterior distributions with a general link function when the design matrices take commonly observed forms. The Haar parameter expanded DA (PX-DA) algorithm is in general an improvement over the DA algorithm and has been shown to be theoretically at least as good. Here we construct a Haar PX-DA algorithm that has essentially the same computational cost as the two-block Gibbs sampler.
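As an illustration of the DA mechanism underlying such a two-block sampler, the following is a minimal Python sketch of the classical Albert and Chib (1993) DA Gibbs sampler for a plain probit regression model with a flat (improper) prior on the regression coefficients; it alternates between a latent-data block and a coefficient block. The function name and defaults are illustrative, X is assumed to have full column rank, and the sampler analyzed in the paper additionally handles random effects and variance components, which are omitted here.

import numpy as np
from scipy.stats import truncnorm

def probit_da_gibbs(y, X, n_iter=2000, seed=0):
    # y: binary responses (0/1); X: n x p design matrix (full column rank);
    # flat (improper) prior on beta. Illustrative sketch, not the paper's sampler.
    rng = np.random.default_rng(seed)
    n, p = X.shape
    B = np.linalg.inv(X.T @ X)        # conditional covariance of beta given z
    L = np.linalg.cholesky(B)
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        mu = X @ beta
        # Block 1: z_i | beta, y_i ~ N(mu_i, 1), truncated to (0, inf) if y_i = 1
        # and to (-inf, 0) if y_i = 0; a, b are bounds in standardized units.
        a = np.where(y == 1, -mu, -np.inf)
        b = np.where(y == 1, np.inf, -mu)
        z = truncnorm.rvs(a, b, loc=mu, scale=1.0, random_state=rng)
        # Block 2: beta | z ~ N(B X'z, B) under the flat prior
        beta = B @ X.T @ z + L @ rng.standard_normal(p)
        draws[t] = beta
    return draws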

Citation


Xin Wang, Vivekananda Roy. "Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors." Electron. J. Statist. 12(2): 4412-4439, 2018. https://doi.org/10.1214/18-EJS1506

Information

Received: 1 November 2017; Published: 2018
First available in Project Euclid: 18 December 2018

zbMATH: 07003247
MathSciNet: MR3892344
Digital Object Identifier: 10.1214/18-EJS1506

Subjects:
Primary: 60J05
Secondary: 62F15

Keywords: Data augmentation, drift condition, geometric ergodicity, GLMM, Haar PX-DA algorithm, Markov chains, posterior propriety
