Journal of Applied Probability

Coupling and ergodicity of adaptive Markov chain Monte Carlo algorithms

Gareth O. Roberts and Jeffrey S. Rosenthal

Abstract

We consider basic ergodicity properties of adaptive Markov chain Monte Carlo algorithms under minimal assumptions, using coupling constructions. We prove convergence in distribution and a weak law of large numbers. We also give counterexamples to demonstrate that the assumptions we make are not redundant.
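The abstract concerns adaptive MCMC, where the transition kernel is tuned on the fly using the chain's history. As a rough illustration of the kind of algorithm studied (not code from this paper), the sketch below implements an adaptive random-walk Metropolis sampler with "diminishing adaptation": the proposal scale is steered toward a target acceptance rate, but the tuning step shrinks to zero, which is one of the conditions under which such samplers remain ergodic. The function name, the standard-normal target, and the 0.44 acceptance target are illustrative choices.

```python
import math
import random

def adaptive_rwm(n_iters, seed=0):
    """Toy adaptive random-walk Metropolis targeting a standard normal.

    Demonstrates diminishing adaptation: the log proposal scale is
    nudged toward a 0.44 acceptance rate, with step size n**-0.5 -> 0,
    so the adaptation vanishes asymptotically.
    """
    rng = random.Random(seed)
    log_target = lambda x: -0.5 * x * x  # standard normal, up to a constant
    x, log_sigma = 0.0, 0.0
    samples = []
    for n in range(1, n_iters + 1):
        sigma = math.exp(log_sigma)
        y = x + rng.gauss(0.0, sigma)  # random-walk proposal
        # Metropolis accept/reject on the log scale
        accepted = math.log(rng.random()) < log_target(y) - log_target(x)
        if accepted:
            x = y
        # Robbins-Monro style update with vanishing step size
        log_sigma += n ** -0.5 * ((1.0 if accepted else 0.0) - 0.44)
        samples.append(x)
    return samples, math.exp(log_sigma)
```

Run for enough iterations and the sample mean and variance approach 0 and 1, while the adapted scale settles near the efficient value for a one-dimensional Gaussian target; without the decaying step size, the ergodicity guarantees the abstract refers to can fail, as the paper's counterexamples show.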

Article information

Source
J. Appl. Probab. Volume 44, Number 2 (2007), 458--475.

Dates
First available in Project Euclid: 5 July 2007

Permanent link to this document
http://projecteuclid.org/euclid.jap/1183667414

Digital Object Identifier
doi:10.1239/jap/1183667414

Mathematical Reviews number (MathSciNet)
MR2340211

Zentralblatt MATH identifier
1137.62015

Subjects
Primary: 60J10: Markov chains (discrete-time Markov processes on discrete state spaces)
Secondary: 60J22: Computational methods in Markov chains [See also 65C40]; 65C40: Computational Markov chains

Keywords
Markov chains; computational methods

Citation

Roberts, Gareth O.; Rosenthal, Jeffrey S. Coupling and ergodicity of adaptive Markov chain Monte Carlo algorithms. J. Appl. Probab. 44 (2007), no. 2, 458--475. doi:10.1239/jap/1183667414. http://projecteuclid.org/euclid.jap/1183667414.

References

  • Andrieu, C. and Moulines, E. (2006). On the ergodicity properties of some adaptive Markov chain Monte Carlo algorithms. Ann. Appl. Prob. 16, 1462--1505.
  • Andrieu, C. and Robert, C. P. (2002). Controlled MCMC for optimal sampling. Preprint.
  • Atchadé, Y. F. and Rosenthal, J. S. (2005). On adaptive Markov chain Monte Carlo algorithms. Bernoulli 11, 815--828.
  • Baxendale, P. H. (2005). Renewal theory and computable convergence rates for geometrically ergodic Markov chains. Ann. Appl. Prob. 15, 700--738.
  • Bédard, M. (2006). On the robustness of optimal scaling for Metropolis--Hastings algorithms. Doctoral Thesis, University of Toronto.
  • Brockwell, A. E. and Kadane, J. B. (2005). Identification of regeneration times in MCMC simulation, with application to adaptive schemes. J. Comput. Graph. Statist. 14, 436--458.
  • Fort, G. and Moulines, E. (2000). Computable bounds for subgeometrical and geometrical ergodicity. Preprint. Available at http://citeseer.ist.psu.edu/fort00computable.html.
  • Fort, G. and Moulines, E. (2003). Polynomial ergodicity of Markov transition kernels. Stoch. Process. Appl. 103, 57--99.
  • Fristedt, B. and Gray, L. (1997). A Modern Approach to Probability Theory. Birkhäuser, Boston, MA.
  • Gilks, W. R., Roberts, G. O. and Sahu, S. K. (1998). Adaptive Markov chain Monte Carlo. J. Amer. Statist. Assoc. 93, 1045--1054.
  • Haario, H., Saksman, E. and Tamminen, J. (2001). An adaptive Metropolis algorithm. Bernoulli 7, 223--242.
  • Häggström, O. (2001). A note on disagreement percolation. Random Structures Algorithms 18, 267--278.
  • Jarner, S. F. and Roberts, G. O. (2002). Polynomial convergence rates of Markov chains. Ann. Appl. Prob. 12, 224--247.
  • Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer, London.
  • Meyn, S. P. and Tweedie, R. L. (1994). Computable bounds for convergence rates of Markov chains. Ann. Appl. Prob. 4, 981--1011.
  • Pasarica, C. and Gelman, A. (2003). Adaptively scaling the Metropolis algorithm using the average squared jumped distance. Preprint.
  • Pemantle, R. and Rosenthal, J. S. (1999). Moment conditions for a sequence with negative drift to be uniformly bounded in $L^r$. Stoch. Process. Appl. 82, 143--155.
  • Robbins, H. and Monro, S. (1951). A stochastic approximation method. Ann. Math. Statist. 22, 400--407.
  • Roberts, G. O. and Rosenthal, J. S. (2001). Optimal scaling for various Metropolis--Hastings algorithms. Statist. Sci. 16, 351--367.
  • Roberts, G. O. and Rosenthal, J. S. (2002). One-shot coupling for certain stochastic recursive sequences. Stoch. Process. Appl. 99, 195--208.
  • Roberts, G. O. and Rosenthal, J. S. (2004). General state space Markov chains and MCMC algorithms. Prob. Surveys 1, 20--71.
  • Roberts, G. O. and Tweedie, R. L. (1999). Bounds on regeneration times and convergence rates for Markov chains. Stoch. Process. Appl. 80, 211--229. (Correction: 91 (2001), 337--338.)
  • Roberts, G. O., Gelman, A. and Gilks, W. R. (1997). Weak convergence and optimal scaling of random walk Metropolis algorithms. Ann. Appl. Prob. 7, 110--120.
  • Roberts, G. O., Rosenthal, J. S. and Schwartz, P. O. (1998). Convergence properties of perturbed Markov chains. J. Appl. Prob. 35, 1--11.
  • Rosenthal, J. S. (1995). Minorization conditions and convergence rates for Markov chain Monte Carlo. J. Amer. Statist. Assoc. 90, 558--566.
  • Rosenthal, J. S. (1997). Faithful couplings of Markov chains: now equals forever. Adv. Appl. Math. 18, 372--381.
  • Rosenthal, J. S. (2000). A First Look at Rigorous Probability Theory. World Scientific, Singapore.
  • Rosenthal, J. S. (2002). Quantitative convergence rates of Markov chains: a simple account. Electron. Commun. Prob. 7, 123--128.
  • Rosenthal, J. S. (2004). Adaptive MCMC Java applet. Available at http://probability.ca/jeff/java/adapt.html.
  • Tierney, L. (1994). Markov chains for exploring posterior distributions (with discussion). Ann. Statist. 22, 1701--1762.