Journal of Applied Probability

Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits

Gareth O. Roberts and Jeffrey S. Rosenthal


Abstract

We connect known results about diffusion limits of Markov chain Monte Carlo (MCMC) algorithms to the computer science notion of algorithm complexity. Our main result states that any weak limit of a Markov process implies a corresponding complexity bound (in an appropriate metric). We then combine this result with previously-known MCMC diffusion limit results to prove that under appropriate assumptions, the random-walk Metropolis algorithm in $d$ dimensions takes $O(d)$ iterations to converge to stationarity, while the Metropolis-adjusted Langevin algorithm takes $O(d^{1/3})$ iterations to converge to stationarity.
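To make the scaling concrete, the following sketch (not from the paper) implements a random-walk Metropolis chain for an i.i.d. standard normal target in $d$ dimensions, with per-coordinate proposal variance $\ell^2/d$, the regime in which the diffusion limit of Roberts, Gelman and Gilks (1997) applies; the constant $\ell \approx 2.38$ and the target itself are illustrative choices, not prescribed by this article.

```python
import math
import random

def rwm(d, n_iters, ell=2.38, seed=0):
    """Random-walk Metropolis for a standard normal target in d dimensions.

    Proposal increments are N(0, ell^2/d) per coordinate, the scaling under
    which a diffusion limit holds as d grows. Returns the final state and
    the observed acceptance rate.
    """
    rng = random.Random(seed)
    sigma = ell / math.sqrt(d)  # proposal std dev shrinks like d^{-1/2}
    x = [rng.gauss(0.0, 1.0) for _ in range(d)]  # start near stationarity

    def log_pi(v):
        # log density of the product standard normal, up to a constant
        return -0.5 * sum(c * c for c in v)

    accepted = 0
    for _ in range(n_iters):
        y = [c + rng.gauss(0.0, sigma) for c in x]
        # Metropolis accept/reject step for a symmetric proposal
        if math.log(rng.random()) < log_pi(y) - log_pi(x):
            x = y
            accepted += 1
    return x, accepted / n_iters
```

For moderately large $d$ the observed acceptance rate should sit near the asymptotically optimal value 0.234 from the optimal-scaling literature, consistent with the $O(d)$ convergence complexity the abstract describes.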

Article information

Source
J. Appl. Probab., Volume 53, Number 2 (2016), 410-420.

Dates
First available in Project Euclid: 17 June 2016

Permanent link to this document
https://projecteuclid.org/euclid.jap/1466172863

Mathematical Reviews number (MathSciNet)
MR3514287

Zentralblatt MATH identifier
1345.60082

Subjects
Primary: 60J05: Discrete-time Markov processes on general state spaces; 60J25: Continuous-time Markov processes on general state spaces
Secondary: 62F10: Point estimation; 62F15: Bayesian inference

Keywords
MCMC; convergence; complexity; diffusion limit; random-walk Metropolis algorithm; Metropolis-adjusted Langevin algorithm

Citation

Roberts, Gareth O.; Rosenthal, Jeffrey S. Complexity bounds for Markov chain Monte Carlo algorithms via diffusion limits. J. Appl. Probab. 53 (2016), no. 2, 410--420. https://projecteuclid.org/euclid.jap/1466172863



References

  • Aldous, D. and Fill, J. A. (2014). Reversible Markov chains and random walks on graphs. Unfinished monograph. Available at http://www.stat.berkeley.edu/~aldous/RWG/book.html.
  • Bédard, M. (2007). Weak convergence of Metropolis algorithms for non-i.i.d. target distributions. Ann. Appl. Prob. 17, 1222–1244.
  • Bédard, M. (2008). Optimal acceptance rates for Metropolis algorithms: Moving beyond 0.234. Stoch. Process. Appl. 118, 2198–2222.
  • Brooks, S., Gelman, A., Jones, G. L. and Meng, X.-L. (eds) (2011). Handbook of Markov chain Monte Carlo. Chapman & Hall/CRC, Boca Raton, FL.
  • Cobham, A. (1965). The intrinsic computational difficulty of functions. In Proceedings of the 1964 International Congress for Logic, Methodology, and Philosophy of Science, North-Holland, Amsterdam, pp. 24–30.
  • Cook, S. A. (1971). The complexity of theorem-proving procedures. In Proc. Third Annual ACM Symposium on Theory of Computing, ACM, New York, pp. 151–158.
  • Ethier, S. N. and Kurtz, T. G. (1986). Markov Processes: Characterization and Convergence. John Wiley, New York.
  • Gelman, A. and Rubin, D. B. (1992). Inference from iterative simulation using multiple sequences. Statist. Sci. 7, 457–472.
  • Givens, C. R. and Shortt, R. M. (1984). A class of Wasserstein metrics for probability distributions. Michigan Math. J. 31, 231–240.
  • Jones, G. L. and Hobert, J. P. (2001). Honest exploration of intractable probability distributions via Markov chain Monte Carlo. Statist. Sci. 16, 312–334.
  • Jones, G. L. and Hobert, J. P. (2004). Sufficient burn-in for Gibbs samplers for a hierarchical random effects model. Ann. Statist. 32, 784–817.
  • Jourdain, B., Lelièvre, T. and Miasojedow, B. (2014). Optimal scaling for the transient phase of Metropolis–Hastings algorithms: the longtime behavior. Bernoulli 20, 1930–1978.
  • Jourdain, B., Lelièvre, T. and Miasojedow, B. (2015). Optimal scaling for the transient phase of the Metropolis–Hastings algorithm: the mean-field limit. Ann. Appl. Prob. 25, 2263–2300.
  • Kantorovič, L. and Rubinšteĭn, G. Š. (1958). On a space of completely additive functions. Vestnik Leningrad. Univ. 13, 52–59.
  • Mengersen, K. L. and Tweedie, R. L. (1996). Rates of convergence of the Hastings and Metropolis algorithms. Ann. Statist. 24, 101–121.
  • Neal, P. and Roberts, G. (2006). Optimal scaling for partially updating MCMC algorithms. Ann. Appl. Prob. 16, 475–515.
  • Neal, P. and Roberts, G. (2008). Optimal scaling for random walk Metropolis on spherically constrained target densities. Methodol. Comput. Appl. Prob. 10, 277–297.
  • Neal, P. and Roberts, G. (2011). Optimal scaling of random walk Metropolis algorithms with non-Gaussian proposals. Methodol. Comput. Appl. Prob. 13, 583–601.
  • Neal, P., Roberts, G. and Yuen, W. K. (2012). Optimal scaling of random walk Metropolis algorithms with discontinuous target densities. Ann. Appl. Prob. 22, 1880–1927.
  • Roberts, G. O. (1998). Optimal Metropolis algorithms for product measures on the vertices of a hypercube. Stoch. Stoch. Reports 62, 275–283.
  • Roberts, G. O. and Rosenthal, J. S. (1997). Geometric ergodicity and hybrid Markov chains. Electron. Commun. Prob. 2, 13–25.
  • Roberts, G. O. and Rosenthal, J. S. (1998). Optimal scaling of discrete approximations to Langevin diffusions. J. R. Statist. Soc. B 60, 255–268.
  • Roberts, G. O. and Rosenthal, J. S. (2001). Optimal scaling for various Metropolis–Hastings algorithms. Statist. Sci. 16, 351–367.
  • Roberts, G. O. and Rosenthal, J. S. (2004). General state space Markov chains and MCMC algorithms. Prob. Surv. 1, 20–71.
  • Roberts, G. O., Gelman, A. and Gilks, W. R. (1997). Weak convergence and optimal scaling of random walk Metropolis algorithms. Ann. Appl. Prob. 7, 110–120.
  • Rosenthal, J. S. (1995a). Minorization conditions and convergence rates for Markov chain Monte Carlo. J. Amer. Statist. Assoc. 90, 558–566, 1136.
  • Rosenthal, J. S. (1995b). Rates of convergence for Gibbs sampler for variance components models. Ann. Statist. 23, 740–761.
  • Rosenthal, J. S. (1996). Analysis of the Gibbs sampler for a model related to James–Stein estimators. Statist. Comput. 6, 269–275.
  • Rosenthal, J. S. (2000). A First Look at Rigorous Probability Theory. World Scientific, River Edge, NJ.
  • Rosenthal, J. S. (2002). Quantitative convergence rates of Markov chains: a simple account. Electron. Commun. Prob. 7, 123–128.
  • Sherlock, C. and Roberts, G. O. (2009). Optimal scaling of the random walk Metropolis on elliptically symmetric unimodal targets. Bernoulli 15, 774–798.
  • Woodard, D. B., Schmidler, S. C. and Huber, M. L. (2009a). Conditions for rapid mixing of parallel and simulated tempering on multimodal distributions. Ann. Appl. Prob. 19, 617–640.
  • Woodard, D. B., Schmidler, S. C. and Huber, M. L. (2009b). Sufficient conditions for torpid mixing of parallel and simulated tempering. Electron. J. Prob. 14, 780–804.