Probability Surveys

General state space Markov chains and MCMC algorithms

Gareth O. Roberts and Jeffrey S. Rosenthal

Full-text: Open access


This paper surveys various results about Markov chains on general (non-countable) state spaces. It begins with an introduction to Markov chain Monte Carlo (MCMC) algorithms, which provide the motivation and context for the theory which follows. Then, sufficient conditions for geometric and uniform ergodicity are presented, along with quantitative bounds on the rate of convergence to stationarity. Many of these results are proved using direct coupling constructions based on minorisation and drift conditions. Necessary and sufficient conditions for Central Limit Theorems (CLTs) are also presented, in some cases proved via the Poisson Equation or direct regeneration constructions. Finally, optimal scaling and weak convergence results for Metropolis-Hastings algorithms are discussed. None of the results presented is new, though many of the proofs are. We also describe some Open Problems.
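The Metropolis-Hastings algorithms surveyed here can be illustrated with a minimal sketch (not part of the paper): a one-dimensional random-walk Metropolis sampler for a standard normal target. The function name `rw_metropolis` and all parameter choices are illustrative assumptions; the proposal standard deviation 2.4 loosely echoes the 2.38 optimal-scaling constant of Roberts, Gelman, and Gilks (1997), cited below.

```python
import math
import random

def rw_metropolis(log_target, x0, proposal_sd, n):
    """Random-walk Metropolis (illustrative sketch, names are assumptions).

    Proposes y = x + Normal(0, proposal_sd^2) and accepts with
    probability min(1, pi(y)/pi(x)); the symmetric proposal density
    cancels in the acceptance ratio.
    """
    x = x0
    chain = [x]
    for _ in range(n):
        y = x + random.gauss(0.0, proposal_sd)
        # Compare on the log scale for numerical stability.
        if math.log(random.random()) < log_target(y) - log_target(x):
            x = y
        chain.append(x)
    return chain

# Target pi(x) proportional to exp(-x^2 / 2), i.e. the standard normal.
random.seed(0)
chain = rw_metropolis(lambda x: -0.5 * x * x, x0=0.0, proposal_sd=2.4, n=50000)
burned = chain[5000:]  # discard a burn-in portion
mean = sum(burned) / len(burned)
var = sum((v - mean) ** 2 for v in burned) / len(burned)
```

After burn-in, the empirical mean and variance of the chain should be close to 0 and 1 respectively; how quickly this happens is exactly the convergence-rate question the survey addresses.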

Article information

Probab. Surveys Volume 1 (2004), 20--71.

First available in Project Euclid: 8 November 2004

Roberts, Gareth O.; Rosenthal, Jeffrey S. General state space Markov chains and MCMC algorithms. Probab. Surveys 1 (2004), 20--71. doi:10.1214/154957804100000024.

References

  • D.J. Aldous, (1983), Random walk on finite groups and rapidly mixing Markov chains. Séminaire de Probabilités XVII. Lecture Notes in Math. 986, 243--297. Springer, New York.
  • D.J. Aldous and P. Diaconis (1987), Strong uniform times and finite random walks. Adv. Appl. Math. 8, 69--97.
  • D.J. Aldous and H. Thorisson (1993), Shift-coupling. Stoch. Proc. Appl. 44, 1--14.
  • S. Asmussen (1987), Applied Probability and Queues. John Wiley & Sons, New York.
  • Y.F. Atchadé and J.S. Rosenthal (2003), On Adaptive Markov Chain Monte Carlo Algorithms. Submitted.
  • Y.F. Atchadé and J.S. Rosenthal (2003), Central Limit Theorems for Geometrically Ergodic Markov chains. Work in progress.
  • K.B. Athreya, H. Doss, and J. Sethuraman (1996), On the Convergence of the Markov Chain Simulation Method. Ann. Stat. 24, 69--100.
  • K.B. Athreya and P. Ney (1978), A new approach to the limit theory of recurrent Markov chains. Trans. Amer. Math. Soc. 245, 493--501.
  • P.H. Baxendale (2003), Renewal theory and computable convergence rates for geometrically ergodic Markov chains. Preprint, University of Southern California.
  • M. Bédard (2004), On the robustness of optimal scaling for Metropolis-Hastings algorithms. Ph.D. dissertation, University of Toronto. Work in progress.
  • K. Berenhaut and R.B. Lund (2001), Geometric renewal convergence rates from hazard rates. J. Appl. Prob. 38, 180--194.
  • P. Billingsley (1961), The Lindeberg-Lévy theorem for martingales. Proc. Amer. Math. Soc. 12, 788--792.
  • P. Billingsley (1995), Probability and Measure, 3rd ed. John Wiley & Sons, New York.
  • L. Breyer and G.O. Roberts (2000), From Metropolis to diffusions: Gibbs states and optimal scaling. Stoch. Proc. Appl. 90, 181--206.
  • K.S. Chan and C.J. Geyer (1994), Discussion of Tierney (1994). Ann. Stat. 22, 1747--1758.
  • O.F. Christensen, G.O. Roberts, and J.S. Rosenthal (2003), Scaling Limits for the Transient Phase of Local Metropolis-Hastings Algorithms. Submitted.
  • R. Cogburn (1972), The central limit theorem for Markov processes. In Proc. Sixth Berkeley Symp. Math. Statist. Probab. 2, 485--512.
  • M.K. Cowles and B.P. Carlin (1996), Markov Chain Monte Carlo Convergence Diagnostics: A Comparative Review. J. Amer. Stat. Assoc. 91, 883--904.
  • M.K. Cowles, G.O. Roberts, and J.S. Rosenthal (1999), Possible biases induced by MCMC convergence diagnostics. J. Stat. Comp. Sim. 64, 87--104.
  • M.K. Cowles and J.S. Rosenthal (1998), A simulation approach to convergence rates for Markov chain Monte Carlo algorithms. Statistics and Computing 8, 115--124.
  • P. Diaconis (1988), Group representations in Probability and Statistics. Institute of Mathematical Statistics, Hayward, California.
  • W. Doeblin (1938), Exposé de la théorie des chaînes simples constantes de Markov à un nombre fini d'états. Revue Mathématique de l'Union Interbalkanique 2, 77--105.
  • J.L. Doob (1953), Stochastic Processes. Wiley, New York.
  • R. Douc, E. Moulines, and J.S. Rosenthal (2002), Quantitative bounds on convergence of time-inhomogeneous Markov Chains. Ann. Appl. Prob., to appear.
  • R. Durrett (1991), Probability: theory and examples. Wadsworth, Pacific Grove, California.
  • S.N. Ethier and T.G. Kurtz (1986), Markov processes, characterization and convergence. Wiley, New York.
  • J.A. Fill, M. Machida, D.J. Murdoch, and J.S. Rosenthal (2000), Extension of Fill's perfect rejection sampling algorithm to general chains. Random Struct. Alg. 17, 290--316.
  • G. Fort (2003), Computable bounds for V-geometric ergodicity of Markov transition kernels. Preprint, Université Joseph Fourier, Grenoble, France.
  • G. Fort and E. Moulines (2003), Polynomial ergodicity of Markov transition kernels. Stoch. Proc. Appl. 103, 57--99.
  • A.E. Gelfand and A.F.M. Smith (1990), Sampling based approaches to calculating marginal densities. J. Amer. Stat. Assoc. 85, 398--409.
  • A. Gelman and D.B. Rubin (1992), Inference from iterative simulation using multiple sequences. Stat. Sci. 7, 457--472.
  • S. Geman and D. Geman (1984), Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Trans. on pattern analysis and machine intelligence 6, 721--741.
  • W.R. Gilks, S. Richardson, and D.J. Spiegelhalter, ed. (1996), Markov chain Monte Carlo in practice. Chapman and Hall, London.
  • C. Geyer (1992), Practical Markov chain Monte Carlo. Stat. Sci. 7, 473--483.
  • W.R. Gilks, G.O. Roberts and S.K. Sahu (1998), Adaptive Markov Chain Monte Carlo, J. Am. Stat. Assoc., 93, 1045--1054.
  • O. Häggström (2004), On the central limit theorem for geometrically ergodic Markov chains. Prob. Th. Rel. Fields, to appear.
  • W.K. Hastings (1970), Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57, 97--109.
  • J.P. Hobert, G.L. Jones, B. Presnell, and J.S. Rosenthal (2002), On the Applicability of Regenerative Simulation in Markov Chain Monte Carlo. Biometrika 89, 731--743.
  • I.A. Ibragimov (1963), A central limit theorem for a class of dependent random variables. Theory Prob. Appl. 8, 83--89.
  • I.A. Ibragimov and Y.V. Linnik (1971), Independent and Stationary Sequences of Random Variables. Wolters-Noordhoff, Groningen (English translation).
  • N. Jain and B. Jamison (1967), Contributions to Doeblin's theory of Markov processes. Z. Wahrsch. Verw. Geb. 8, 19--40.
  • S.F. Jarner and G.O. Roberts (2002), Polynomial convergence rates of Markov chains. Ann. Appl. Prob. 12, 224--247.
  • G.L. Jones (2004), On the Central Limit Theorem for Markov Chains (review paper). Work in progress.
  • G.L. Jones and J.P. Hobert (2001), Honest exploration of intractable probability distributions via Markov chain Monte Carlo. Statistical Science 16, 312--334.
  • G.L. Jones and J.P. Hobert (2004), Sufficient burn-in for Gibbs samplers for a hierarchical random effects model. Ann. Stat. 32, 784--817.
  • W. Kendall and J. Møller (2000), Perfect simulation using dominating processes on ordered state spaces, with application to locally stable point processes. Adv. Appl. Prob. 32, 844--865.
  • C. Kipnis and S.R.S. Varadhan (1986), Central limit theorem for additive functionals of reversible Markov processes and applications to simple exclusions. Comm. Math. Phys. 104, 1--19.
  • T. Lindvall (1992), Lectures on the Coupling Method. Wiley & Sons, New York.
  • R. Lund, S.P. Meyn, and R.L. Tweedie (1996), Rates of convergence of stochastically monotone Markov processes. Ann. Appl. Prob. 6, 218--237.
  • A.A. Markov (1906), Extension of the law of large numbers to dependent quantities (in Russian). Izv. Fiz.-Matem. Obsch. Kazan Univ. (2nd Ser) 15, 135--156.
  • P. Matthews (1993), A slowly mixing Markov chain with implications for Gibbs sampling. Stat. Prob. Lett. 17, 231--236.
  • K. L. Mengersen and R. L. Tweedie (1996), Rates of convergence of the Hastings and Metropolis algorithms. Ann. Statist. 24, 101--121.
  • N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. Teller, and E. Teller (1953), Equation of state calculations by fast computing machines. J. Chem. Phys. 21, 1087--1091.
  • S.P. Meyn and R.L. Tweedie (1993), Markov chains and stochastic stability. Springer-Verlag, London.
  • S.P. Meyn and R.L. Tweedie (1994), Computable bounds for convergence rates of Markov chains. Ann. Appl. Prob. 4, 981--1011.
  • J. Møller, A.R. Syversveen, and R. Waagepetersen (1998), Log Gaussian Cox processes. Scand. J. Statist. 25, 451--482.
  • P.A. Mykland, L. Tierney, and B. Yu (1995), Regeneration in Markov chain samplers. J. Amer. Stat. Assoc. 90, 233--241.
  • R.M. Neal (2004), Improving Asymptotic Variance of MCMC Estimators: Non-reversible Chains are Better. Technical Report No. 0406, Dept. of Statistics, University of Toronto.
  • E. Nummelin (1978), Uniform and ratio limit theorems for Markov renewal and semi-regenerative processes on a general state space. Ann. Inst. Henri Poincaré Series B 14, 119--143.
  • E. Nummelin (1984), General irreducible Markov chains and non-negative operators. Cambridge University Press.
  • S. Orey (1971), Lecture notes on limit theorems for Markov chain transition probabilities. Van Nostrand Reinhold, London.
  • J.W. Pitman (1976), On coupling of Markov chains. Z. Wahrsch. verw. Gebiete 35, 315--322.
  • J.G. Propp and D.B. Wilson (1996). Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures and Algorithms 9, 223--252.
  • D. Randall (2003), Mixing, a tutorial on Markov chains. FOCS 2003.
  • G.O. Roberts (1998), Optimal Metropolis algorithms for product measures on the vertices of a hypercube. Stochastics and Stochastic Reports 62, 275--283.
  • G.O. Roberts (1999), A note on acceptance rate criteria for CLTs for Metropolis-Hastings algorithms. J. Appl. Prob. 36, 1210--1217.
  • G.O. Roberts, A. Gelman, and W.R. Gilks (1997), Weak convergence and optimal scaling of random walk Metropolis algorithms. Ann. Appl. Prob. 7, 110--120.
  • G.O. Roberts and J.S. Rosenthal (1997), Shift-coupling and convergence rates of ergodic averages. Stochastic Models 13, 147--165.
  • G.O. Roberts and J.S. Rosenthal (1997), Geometric ergodicity and hybrid Markov chains. Electronic Comm. Prob. 2, Paper no. 2, 13--25.
  • G.O. Roberts and J.S. Rosenthal (1998), Optimal scaling of discrete approximations to Langevin diffusions. J. Roy. Stat. Soc. B 60, 255--268.
  • G.O. Roberts and J.S. Rosenthal (1998), Markov chain Monte Carlo: Some practical implications of theoretical results (with discussion). Canadian J. Stat. 26, 5--31.
  • G.O. Roberts and J.S. Rosenthal (2001), Small and Pseudo-Small Sets for Markov Chains. Stochastic Models 17, 121--145.
  • G.O. Roberts and J.S. Rosenthal (2001), Markov chains and de-initialising processes. Scandinavian Journal of Statistics 28, 489--504.
  • G.O. Roberts and J.S. Rosenthal (2001), Optimal scaling for various Metropolis-Hastings algorithms. Stat. Sci. 16, 351--367.
  • G.O. Roberts and J.S. Rosenthal (2003), Harris Recurrence of Metropolis-Within-Gibbs and Transdimensional MCMC Algorithms. In preparation.
  • G.O. Roberts and R.L. Tweedie (1996), Geometric Convergence and Central Limit Theorems for Multidimensional Hastings and Metropolis Algorithms. Biometrika 83, 95--110.
  • G.O. Roberts and R.L. Tweedie (1999), Bounds on regeneration times and convergence rates for Markov chains. Stoch. Proc. Appl. 80, 211--229. See also the corrigendum, Stoch. Proc. Appl. 91 (2001), 337--338.
  • J.S. Rosenthal (1993), Rates of convergence for data augmentation on finite sample spaces. Ann. Appl. Prob. 3, 819--839.
  • J.S. Rosenthal (1995), Rates of convergence for Gibbs sampler for variance components models. Ann. Stat. 23, 740--761.
  • J.S. Rosenthal (1995), Minorization conditions and convergence rates for Markov chain Monte Carlo. J. Amer. Stat. Assoc. 90, 558--566.
  • J.S. Rosenthal (1995), Convergence rates of Markov chains. SIAM Review 37, 387--405.
  • J.S. Rosenthal (1996), Convergence of Gibbs sampler for a model related to James-Stein estimators. Stat. and Comput. 6, 269--275.
  • J.S. Rosenthal (2000), A first look at rigorous probability theory. World Scientific Publishing, Singapore.
  • J.S. Rosenthal (2001), A review of asymptotic convergence for general state space Markov chains. Far East J. Theor. Stat. 5, 37--50.
  • J.S. Rosenthal (2002), Quantitative convergence rates of Markov chains: A simple account. Elec. Comm. Prob. 7, No. 13, 123--128.
  • J.S. Rosenthal (2003), Asymptotic Variance and Convergence Rates of Nearly-Periodic MCMC Algorithms. J. Amer. Stat. Assoc. 98, 169--177.
  • J.S. Rosenthal (2003), Geometric Convergence Rates for Time-Sampled Markov Chains. J. Theor. Prob. 16, 671--688.
  • A. Sinclair (1992), Improved bounds for mixing rates of Markov chains and multicommodity flow. Combinatorics, Prob., Comput. 1, 351--370.
  • A.F.M. Smith and G.O. Roberts (1993), Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion). J. Roy. Stat. Soc. Ser. B 55, 3--24.
  • C. Stein (1971), A bound for the error in the normal approximation to the distribution of a sum of dependent random variables. In Proc. Sixth Berkeley Symp. Math. Statist. Prob. 3, 583--602. University of California Press.
  • H. Thorisson (2000), Coupling, Stationarity, and Regeneration. Springer, New York.
  • E. Thönnes (2000), A Primer on Perfect Simulation. In Statistical Physics and Spatial Statistics (K.R. Mecke and D. Stoyan, eds.), Springer Lecture Notes in Physics, 349--378.
  • L. Tierney (1994), Markov chains for exploring posterior distributions (with discussion). Ann. Stat. 22, 1701--1762.
  • P. Tuominen and R.L. Tweedie (1994), Subgeometric rates of convergence of $f$-ergodic Markov chains. Adv. Appl. Prob. 26, 775--798.