Bernoulli


Efficient strategy for the Markov chain Monte Carlo in high-dimension with heavy-tailed target probability distribution

Kengo Kamatani


Abstract

The purpose of this paper is to introduce a new Markov chain Monte Carlo method and to demonstrate its effectiveness by simulation and by high-dimensional asymptotic theory. The key feature is that our algorithm uses a reversible proposal kernel designed to have a heavy-tailed invariant probability distribution. A high-dimensional asymptotic theory is developed for a class of heavy-tailed target probability distributions. As the dimension of the state space tends to infinity, we show that our algorithm attains a much higher convergence rate than the preconditioned Crank–Nicolson (pCN) algorithm and the random-walk Metropolis algorithm.
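For context, the sketch below shows the two baseline samplers the abstract compares against: the random-walk Metropolis algorithm (Roberts, Gelman and Gilks [24]) and the preconditioned Crank–Nicolson algorithm (Cotter et al. [3]), run on a heavy-tailed multivariate t target. This is not the paper's proposed algorithm; the dimension, degrees of freedom and step sizes are illustrative assumptions, not values from the paper.

```python
# Minimal NumPy sketch of the two baselines (RWM and pCN) on a multivariate t
# target. All numerical settings are illustrative; this is NOT the MCMC method
# proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)
d, nu = 50, 2.0  # dimension and t degrees of freedom (illustrative)

def log_t(x):
    """Log-density (up to an additive constant) of a d-dimensional Student t target."""
    return -0.5 * (nu + d) * np.log1p(x @ x / nu)

def rwm_step(x, sigma=0.1):
    """Random-walk Metropolis: Gaussian proposal, plain Metropolis ratio."""
    y = x + sigma * rng.standard_normal(d)
    if np.log(rng.uniform()) < log_t(y) - log_t(x):
        return y
    return x

def pcn_step(x, beta=0.1):
    """pCN: the proposal is reversible with respect to N(0, I), so the Gaussian
    reference density enters the acceptance ratio (Cotter et al. [3])."""
    y = np.sqrt(1.0 - beta**2) * x + beta * rng.standard_normal(d)
    log_alpha = (log_t(y) - log_t(x)) + 0.5 * (y @ y - x @ x)
    if np.log(rng.uniform()) < log_alpha:
        return y
    return x

x_rwm = x_pcn = np.ones(d)
for _ in range(10_000):
    x_rwm = rwm_step(x_rwm)
    x_pcn = pcn_step(x_pcn)
```

The pCN acceptance ratio contains the Gaussian reference term 0.5(‖y‖² − ‖x‖²) because the proposal is reversible with respect to N(0, I); this is what makes pCN dimension-robust for targets dominated by a Gaussian reference measure, and the abstract's claim is that both baselines converge more slowly than the proposed method when the target is instead heavy-tailed.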

Article information

Source
Bernoulli, Volume 24, Number 4B (2018), 3711-3750.

Dates
Received: January 2015
Revised: March 2017
First available in Project Euclid: 18 April 2018

Permanent link to this document
https://projecteuclid.org/euclid.bj/1524038768

Digital Object Identifier
doi:10.3150/17-BEJ976

Mathematical Reviews number (MathSciNet)
MR3788187

Zentralblatt MATH identifier
06869890

Keywords
Consistency; Malliavin calculus; Markov chain Monte Carlo; Stein’s method

Citation

Kamatani, Kengo. Efficient strategy for the Markov chain Monte Carlo in high-dimension with heavy-tailed target probability distribution. Bernoulli 24 (2018), no. 4B, 3711--3750. doi:10.3150/17-BEJ976. https://projecteuclid.org/euclid.bj/1524038768



References

  • [1] Beskos, A., Roberts, G. and Stuart, A. (2009). Optimal scalings for local Metropolis–Hastings chains on nonproduct targets in high dimensions. Ann. Appl. Probab. 19 863–898.
  • [2] Chen, L.H.Y., Goldstein, L. and Shao, Q.-M. (2011). Normal Approximation by Stein’s Method. Probability and Its Applications. Berlin: Springer.
  • [3] Cotter, S.L., Roberts, G.O., Stuart, A.M. and White, D. (2013). MCMC methods for functions: Modifying old algorithms to make them faster. Statist. Sci. 28 424–446.
  • [4] Eberle, A. (2014). Error bounds for Metropolis–Hastings algorithms applied to perturbations of Gaussian measures in high dimensions. Ann. Appl. Probab. 24 337–377.
  • [5] Ethier, S.N. and Kurtz, T.G. (1986). Markov Processes: Characterization and Convergence. New York: Wiley.
  • [6] Geyer, C.J. (1992). Practical Markov chain Monte Carlo. Statist. Sci. 7 473–483.
  • [7] Goto, F. (2017). An extension and practical performances of the MpCN algorithm. Master’s thesis, Graduate School of Engineering Science, Osaka University.
  • [8] Hairer, M., Stuart, A.M. and Vollmer, S.J. (2014). Spectral gaps for a Metropolis–Hastings algorithm in infinite dimensions. Ann. Appl. Probab. 24 2455–2490.
  • [9] Jacod, J. and Shiryaev, A.N. (2003). Limit Theorems for Stochastic Processes, 2nd ed. Grundlehren der Mathematischen Wissenschaften [Fundamental Principles of Mathematical Sciences] 288. Berlin: Springer.
  • [10] Kamatani, K. (2014). Rate optimality of random walk Metropolis algorithm in high-dimension with heavy-tailed target distribution. Preprint. Available at arXiv:1406.5392.
  • [11] Kamatani, K. (2014). Local consistency of Markov chain Monte Carlo methods. Ann. Inst. Statist. Math. 66 63–74.
  • [12] Kamatani, K. (2017). Ergodicity of Markov chain Monte Carlo with reversible proposal. J. Appl. Probab. 54 638–654.
  • [13] Kamatani, K., Nogita, A. and Uchida, M. (2016). Hybrid multi-step estimation of the volatility for stochastic regression models. Bull. Inform. Cybernet. 48 19–35.
  • [14] Kamatani, K. and Uchida, M. (2015). Hybrid multi-step estimators for stochastic differential equations based on sampled data. Stat. Inference Stoch. Process. 18 177–204.
  • [15] Karatzas, I. and Shreve, S.E. (1991). Brownian Motion and Stochastic Calculus, 2nd ed. Graduate Texts in Mathematics 113. New York: Springer.
  • [16] Kotz, S. and Nadarajah, S. (2004). Multivariate $t$ Distributions and Their Applications. Cambridge: Cambridge Univ. Press.
  • [17] Neal, R.M. (1999). Regression and classification using Gaussian process priors. In Bayesian Statistics, 6 (Alcoceber, 1998) 475–501. New York: Oxford Univ. Press.
  • [18] Nourdin, I. and Peccati, G. (2009). Stein’s method on Wiener chaos. Probab. Theory Related Fields 145 75–118.
  • [19] Nourdin, I. and Peccati, G. (2012). Normal Approximations with Malliavin Calculus, from Stein’s Method to Universality. Cambridge Tracts in Mathematics 192. Cambridge: Cambridge Univ. Press.
  • [20] Nualart, D. (2006). The Malliavin Calculus and Related Topics, 2nd ed. Berlin: Springer.
  • [21] Pillai, N.S., Stuart, A.M. and Thiery, A.H. (2014). Optimal proposal design for random walk type Metropolis algorithms with Gaussian random field priors. Preprint. Available at arXiv:1108.1494v2.
  • [22] Plummer, M., Best, N., Cowles, K. and Vines, K. (2006). CODA: Convergence diagnosis and output analysis for MCMC. R News 6 7–11.
  • [23] Robert, C.P. and Casella, G. (2004). Monte Carlo Statistical Methods, 2nd ed. New York: Springer.
  • [24] Roberts, G.O., Gelman, A. and Gilks, W.R. (1997). Weak convergence and optimal scaling of random walk Metropolis algorithms. Ann. Appl. Probab. 7 110–120.
  • [25] Roberts, G.O. and Rosenthal, J.S. (1998). Optimal scaling of discrete approximations to Langevin diffusions. J. R. Stat. Soc. Ser. B. Stat. Methodol. 60 255–268.
  • [26] Sato, K. (1999). Lévy Processes and Infinitely Divisible Distributions. Cambridge Studies in Advanced Mathematics 68. Cambridge: Cambridge Univ. Press.
  • [27] Shigekawa, I. (1980). Derivatives of Wiener functionals and absolute continuity of induced measures. J. Math. Kyoto Univ. 20 263–289.
  • [28] Stroock, D.W. and Varadhan, S.R.S. (1979). Multidimensional Diffusion Processes. Grundlehren der Mathematischen Wissenschaften in Einzeldarstellungen mit Besonderer Berücksichtigung der Anwendungsgebiete. Berlin: Springer.
  • [29] Tierney, L. (1994). Markov chains for exploring posterior distributions. Ann. Statist. 22 1701–1762.