## Annals of Statistics

### A theoretical comparison of the data augmentation, marginal augmentation and PX-DA algorithms

#### Abstract

The data augmentation (DA) algorithm is a widely used Markov chain Monte Carlo (MCMC) algorithm that is based on a Markov transition density of the form $p(x|x')=\int_{\mathsf{Y}}f_{X|Y}(x|y)f_{Y|X}(y|x')\,dy$, where $f_{X|Y}$ and $f_{Y|X}$ are conditional densities. The PX-DA and marginal augmentation algorithms of Liu and Wu [J. Amer. Statist. Assoc. 94 (1999) 1264–1274] and Meng and van Dyk [Biometrika 86 (1999) 301–320] are alternatives to DA that often converge much faster and are only slightly more computationally demanding. The transition densities of these alternative algorithms can be written in the form $p_{R}(x|x')=\int_{\mathsf{Y}}\int_{\mathsf{Y}}f_{X|Y}(x|y')\,R(y,dy')\,f_{Y|X}(y|x')\,dy$, where $R$ is a Markov transition function on $\mathsf{Y}$. We prove that when $R$ satisfies certain conditions, the MCMC algorithm driven by $p_R$ is at least as good as that driven by $p$, both in the central limit theorem sense and in the operator norm sense. These results are brought to bear on a theoretical comparison of the DA, PX-DA and marginal augmentation algorithms. Our focus is on situations where the group structure exploited by Liu and Wu is available. We show that the PX-DA algorithm based on Haar measure is at least as good as any PX-DA algorithm constructed using a proper prior on the group.
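The transition density $p(x|x')$ above corresponds to a two-step simulation: draw $y\sim f_{Y|X}(\cdot|x')$, then $x\sim f_{X|Y}(\cdot|y)$; the $p_R$ variant inserts an extra move $y\to y'$ from a transition $R$ that preserves the $\mathsf{Y}$-marginal. A minimal sketch under a toy Gaussian model (the model, the AR(1) choice of $R$, and all function names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Toy augmented model (illustrative only):
#   target X ~ N(0, 1); augmentation Y | X = x ~ N(x, 1).
# Standard Gaussian conjugacy then gives
#   X | Y = y ~ N(y/2, 1/2)  and marginally  Y ~ N(0, 2).

rng = np.random.default_rng(0)

def da_step(x, rng):
    """One DA step: y ~ f_{Y|X}(.|x), then x_new ~ f_{X|Y}(.|y)."""
    y = rng.normal(x, 1.0)
    return rng.normal(y / 2.0, np.sqrt(0.5))

def r_move(y, rng, rho=0.0):
    """A transition R(y, dy') leaving the marginal Y ~ N(0, 2) invariant:
    AR(1) update y' = rho*y + sqrt(1 - rho^2) * (draw from N(0, 2))."""
    return rho * y + np.sqrt(1.0 - rho**2) * rng.normal(0.0, np.sqrt(2.0))

def p_r_step(x, rng, rho=0.0):
    """One p_R step: y ~ f_{Y|X}(.|x), y' ~ R(y, .), x_new ~ f_{X|Y}(.|y')."""
    y = rng.normal(x, 1.0)
    y = r_move(y, rng, rho)
    return rng.normal(y / 2.0, np.sqrt(0.5))

# Run both chains; each has N(0, 1) as its stationary distribution.
n = 50_000
x_da = np.empty(n)
x_r = np.empty(n)
x_da[0] = x_r[0] = 0.0
for t in range(1, n):
    x_da[t] = da_step(x_da[t - 1], rng)
    x_r[t] = p_r_step(x_r[t - 1], rng, rho=0.0)

print(x_da.mean(), x_da.var(), x_r.mean(), x_r.var())
```

In this toy example, `rho=0` makes $R$ draw $y'$ directly from the $\mathsf{Y}$-marginal, so successive $x$'s are independent, while `rho` near 1 makes $R$ nearly the identity and recovers plain DA; this illustrates, in the simplest possible setting, the paper's theme that a well-chosen extra move on $\mathsf{Y}$ can only improve the chain.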

#### Article information

Source
Ann. Statist., Volume 36, Number 2 (2008), 532–554.

Dates
First available in Project Euclid: 13 March 2008

https://projecteuclid.org/euclid.aos/1205420510

Digital Object Identifier
doi:10.1214/009053607000000569

Mathematical Reviews number (MathSciNet)
MR2396806

Zentralblatt MATH identifier
1155.60031

Subjects
Primary: 60J27: Continuous-time Markov processes on discrete state spaces
Secondary: 62F15: Bayesian inference

#### Citation

Hobert, James P.; Marchev, Dobrin. A theoretical comparison of the data augmentation, marginal augmentation and PX-DA algorithms. Ann. Statist. 36 (2008), no. 2, 532--554. doi:10.1214/009053607000000569. https://projecteuclid.org/euclid.aos/1205420510

#### References

• Albert, J. H. and Chib, S. (1993). Bayesian analysis of binary and polychotomous response data. J. Amer. Statist. Assoc. 88 669–679.
• Amit, Y. (1991). On rates of convergence of stochastic relaxation for Gaussian and non-Gaussian distributions. J. Multivariate Anal. 38 82–99.
• Eaton, M. L. (1989). Group Invariance Applications in Statistics. Institute of Mathematical Statistics and the American Statistical Association, Hayward, California and Alexandria, Virginia.
• Fremlin, D. H. (2003). Measure Theory: Topological Measure Spaces. Torres Fremlin. Available at http://www.essex.ac.uk/maths/staff/fremlin/mt.htm.
• Geyer, C. J. (1992). Practical Markov chain Monte Carlo (with discussion). Statist. Sci. 7 473–511.
• Hobert, J. P. (2001). Stability relationships among the Gibbs sampler and its subchains. J. Comput. Graph. Statist. 10 185–205.
• Hobert, J. P., Jones, G. L., Presnell, B. and Rosenthal, J. S. (2002). On the applicability of regenerative simulation in Markov chain Monte Carlo. Biometrika 89 731–743.
• Jones, G. L., Haran, M., Caffo, B. S. and Neath, R. (2006). Fixed-width output analysis for Markov chain Monte Carlo. J. Amer. Statist. Assoc. 101 1537–1547.
• Jones, G. L. and Hobert, J. P. (2001). Honest exploration of intractable probability distributions via Markov chain Monte Carlo. Statist. Sci. 16 312–334.
• Liu, J. S. and Sabatti, C. (2000). Generalised Gibbs sampler and multigrid Monte Carlo for Bayesian computation. Biometrika 87 353–369.
• Liu, J. S., Wong, W. H. and Kong, A. (1994). Covariance structure of the Gibbs sampler with applications to comparisons of estimators and augmentation schemes. Biometrika 81 27–40.
• Liu, J. S., Wong, W. H. and Kong, A. (1995). Covariance structure and convergence rate of the Gibbs sampler with various scans. J. Roy. Statist. Soc. Ser. B 57 157–169.
• Liu, J. S. and Wu, Y. N. (1999). Parameter expansion for data augmentation. J. Amer. Statist. Assoc. 94 1264–1274.
• Meng, X.-L. and van Dyk, D. A. (1999). Seeking efficient data augmentation schemes via conditional and marginal augmentation. Biometrika 86 301–320.
• Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer, London.
• Mira, A. and Geyer, C. J. (1999). Ordering Monte Carlo Markov chains. Technical Report 632, School of Statistics, Univ. Minnesota.
• Roberts, G. O. and Rosenthal, J. S. (1997). Geometric ergodicity and hybrid Markov chains. Electron. Comm. Probab. 2 13–25.
• Roberts, G. O. and Rosenthal, J. S. (2004). General state space Markov chains and MCMC algorithms. Probab. Surveys 1 20–71.
• Roberts, G. O. and Rosenthal, J. S. (2008). Variance bounding Markov chains. Ann. Appl. Probab. To appear.
• Roberts, G. O. and Tweedie, R. L. (2001). Geometric L2 and L1 convergence are equivalent for reversible Markov chains. J. Appl. Probab. 38A 37–41.
• Roy, V. and Hobert, J. P. (2007). Convergence rates and asymptotic standard errors for MCMC algorithms for Bayesian probit regression. J. Roy. Statist. Soc. Ser. B 69 607–623.
• Tanner, M. A. and Wong, W. H. (1987). The calculation of posterior distributions by data augmentation (with discussion). J. Amer. Statist. Assoc. 82 528–550.
• van Dyk, D. A. and Meng, X.-L. (2001). The art of data augmentation (with discussion). J. Comput. Graph. Statist. 10 1–50.
• Wijsman, R. A. (1990). Invariant Measures on Groups and Their Use in Statistics. Institute of Mathematical Statistics and the American Statistical Association, Hayward, California and Alexandria, Virginia.