The Annals of Applied Probability

On the stability of sequential Monte Carlo methods in high dimensions

Alexandros Beskos, Dan Crisan, and Ajay Jasra



We investigate the stability of a Sequential Monte Carlo (SMC) method applied to the problem of sampling from a target distribution on $\mathbb{R}^{d}$ for large $d$. It is well known [Bengtsson, Bickel and Li, In Probability and Statistics: Essays in Honor of David A. Freedman, D. Nolan and T. Speed, eds. (2008) 316–334 IMS; see also Pushing the Limits of Contemporary Statistics (2008) 318–329 IMS, Mon. Weather Rev. 136 (2009) 4629–4640] that using a single importance sampling step, one produces an approximation for the target that deteriorates as the dimension $d$ increases, unless the number of Monte Carlo samples $N$ increases at an exponential rate in $d$. We show that this degeneracy can be avoided by introducing a sequence of artificial targets, starting from a “simple” density and moving toward the one of interest, and using an SMC method to sample from the sequence; see, for example, Chopin [Biometrika 89 (2002) 539–551]; see also [J. R. Stat. Soc. Ser. B Stat. Methodol. 68 (2006) 411–436, Phys. Rev. Lett. 78 (1997) 2690–2693, Stat. Comput. 11 (2001) 125–139]. Using this class of SMC methods with a fixed number of samples, one can produce an approximation for which the effective sample size (ESS) converges to a random variable $\varepsilon_{N}$ as $d\rightarrow\infty$ with $1<\varepsilon_{N}<N$. The convergence is achieved with a computational cost proportional to $Nd^{2}$. If $\varepsilon_{N}\ll N$, we can raise its value by introducing a number of resampling steps, say $m$ (where $m$ is independent of $d$). In this case, the ESS converges to a random variable $\varepsilon_{N,m}$ as $d\rightarrow\infty$ and $\lim_{m\to\infty}\varepsilon_{N,m}=N$. Also, we show that the Monte Carlo error for estimating a fixed-dimensional marginal expectation is of order $\frac{1}{\sqrt{N}}$ uniformly in $d$.
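The weight degeneracy of a single importance sampling step is easy to observe numerically. The sketch below is a minimal illustration, not taken from the paper: the product standard-normal target, the overdispersed Gaussian proposal with scale 1.2, the sample size $N=1000$ and the random seed are all assumptions made here. It computes the ESS, $(\sum_i w_i)^2 / \sum_i w_i^2$, for growing $d$:

```python
import numpy as np

def ess(logw):
    """Effective sample size (sum w)^2 / sum w^2, computed stably in log space."""
    w = np.exp(logw - logw.max())
    return w.sum() ** 2 / (w @ w)

rng = np.random.default_rng(0)
N, scale = 1000, 1.2                         # sample size and proposal std (assumptions)
results = {}
for d in (1, 10, 100):
    x = rng.normal(0.0, scale, size=(N, d))  # proposal N(0, scale^2 I_d)
    # log weight = log N(x; 0, I_d) - log N(x; 0, scale^2 I_d);
    # additive constants cancel in the ESS, so they are dropped.
    logw = -0.5 * (1.0 - 1.0 / scale**2) * (x**2).sum(axis=1)
    results[d] = ess(logw)
    print(f"d = {d:3d}: ESS = {results[d]:7.1f} of {N}")
```

Because the log-weight is a sum of $d$ i.i.d. terms, its variance grows linearly in $d$ and the ESS collapses rapidly, consistent with the exponential-in-$d$ sample requirement cited above.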
The results imply that, in high dimensions, SMC algorithms can efficiently control the variability of the importance sampling weights and estimate fixed-dimensional marginals at a cost which is less than exponential in $d$, and indicate that resampling leads to a reduction in the Monte Carlo error and an increase in the ESS. All of our analysis is carried out under the assumption that the target density has an i.i.d. product form.
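The sequence-of-targets construction with adaptive resampling can be sketched in a few lines. The toy below is illustrative only and not the paper's algorithm or setting: the Gaussian target $N(0, I_d)$ and initial density $N(0, 9 I_d)$, the geometric (tempering) bridge with a linear temperature ladder, the ESS $< N/2$ resampling rule, and the random-walk Metropolis move at each temperature are all choices made here for the sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
N, d, s0 = 500, 50, 3.0                     # particles, dimension, initial std (assumptions)

def ess(logw):
    w = np.exp(logw - logw.max())
    return w.sum() ** 2 / (w @ w)

# Geometric bridge pi_b(x) ~ pi0(x)^(1-b) * pi(x)^b between pi0 = N(0, s0^2 I_d)
# and the target pi = N(0, I_d); every intermediate density is again Gaussian.
def log_ratio(x):                           # log pi(x) - log pi0(x), constants dropped
    return -0.5 * (1.0 - 1.0 / s0**2) * (x**2).sum(axis=1)

x = rng.normal(0.0, s0, size=(N, d))        # exact draws from pi0
logw = np.zeros(N)
betas = np.linspace(0.0, 1.0, 51)
for b_prev, b in zip(betas[:-1], betas[1:]):
    logw += (b - b_prev) * log_ratio(x)     # incremental importance weight
    if ess(logw) < N / 2:                   # adaptive multinomial resampling
        w = np.exp(logw - logw.max()); w /= w.sum()
        x = x[rng.choice(N, size=N, p=w)]
        logw = np.zeros(N)
    # One random-walk Metropolis move leaving pi_b invariant (per-coordinate precision):
    prec = (1.0 - b) / s0**2 + b
    step = 2.4 / np.sqrt(d * prec)          # classical RWM scaling heuristic
    prop = x + step * rng.normal(size=(N, d))
    log_acc = -0.5 * prec * ((prop**2).sum(1) - (x**2).sum(1))
    accept = np.log(rng.random(N)) < log_acc
    x[accept] = prop[accept]

w = np.exp(logw - logw.max()); w /= w.sum()
second_moment = w @ (x**2).mean(axis=1)     # estimate of E[X_1^2] = 1 under pi
print(f"final ESS: {ess(logw):.0f} / {N}, E[X^2] estimate: {second_moment:.2f}")
```

With the number of particles held fixed, the resampling steps keep the weight variance under control along the bridge, in contrast to the single-step importance sampler.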

Article information

Ann. Appl. Probab. Volume 24, Number 4 (2014), 1396-1445.

First available in Project Euclid: 14 May 2014

Primary: 82C80: Numerical methods (Monte Carlo, series resummation, etc.) 60K35: Interacting random processes; statistical mechanics type models; percolation theory [See also 82B43, 82C43]
Secondary: 60F99: None of the above, but in this section 62F15: Bayesian inference

Keywords: Sequential Monte Carlo; high dimensions; resampling; functional CLT


Beskos, Alexandros; Crisan, Dan; Jasra, Ajay. On the stability of sequential Monte Carlo methods in high dimensions. Ann. Appl. Probab. 24 (2014), no. 4, 1396--1445. doi:10.1214/13-AAP951.

References


  • [1] Andrieu, C., Jasra, A., Doucet, A. and Del Moral, P. (2011). On non-linear Markov chain Monte Carlo. Bernoulli 17 987–1014.
  • [2] Andrieu, C. and Moulines, É. (2006). On the ergodicity properties of some adaptive MCMC algorithms. Ann. Appl. Probab. 16 1462–1505.
  • [3] Atchadé, Y. (2009). A strong law of large numbers for martingale arrays. Technical report, Univ. Michigan.
  • [4] Baehr, C. and Pannekoucke, O. (2010). Some issues and results on the EnKF and particle filters for meteorological models. In Chaotic Systems (C. H. Skiadas and I. Dimotikalis, eds.) 27–34. World Sci. Publ., River Edge, NJ.
  • [5] Bédard, M. (2007). Weak convergence of Metropolis algorithms for non-i.i.d. target distributions. Ann. Appl. Probab. 17 1222–1244.
  • [6] Bengtsson, T., Bickel, P. and Li, B. (2008). Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems. In Probability and Statistics: Essays in Honor of David A. Freedman (D. Nolan and T. Speed, eds.) 316–334. IMS, Beachwood, OH.
  • [7] Berger, E. (1986). Asymptotic behaviour of a class of stochastic approximation procedures. Probab. Theory Related Fields 71 517–552.
  • [8] Beskos, A., Crisan, D. and Jasra, A. (2011). On the stability of sequential Monte Carlo methods in high-dimensions. Technical report, Imperial College London.
  • [9] Beskos, A., Crisan, D., Jasra, A. and Whiteley, N. (2014). Error bounds and normalizing constants for sequential Monte Carlo. Adv. in Appl. Probab. To appear.
  • [10] Beskos, A., Pillai, N., Roberts, G., Sanz-Serna, J.-M. and Stuart, A. (2013). Optimal tuning of the hybrid Monte Carlo algorithm. Bernoulli 19 1501–1534.
  • [11] Beskos, A., Roberts, G. and Stuart, A. (2009). Optimal scalings for local Metropolis–Hastings chains on nonproduct targets in high dimensions. Ann. Appl. Probab. 19 863–898.
  • [12] Beskos, A. and Stuart, A. (2007). MCMC methods for sampling function space. In ICIAM 07: 6th International Congress on Industrial and Applied Mathematics, Zürich.
  • [13] Bickel, P., Li, B. and Bengtsson, T. (2008). Sharp failure rates for the bootstrap particle filter in high dimensions. In Pushing the Limits of Contemporary Statistics (B. Clarke and S. Ghosal, eds.) 318–329. IMS, Beachwood, OH.
  • [14] Billingsley, P. (1999). Convergence of Probability Measures, 2nd ed. Wiley, New York.
  • [15] Breyer, L. A., Piccioni, M. and Scarlatti, S. (2004). Optimal scaling of MaLa for nonlinear regression. Ann. Appl. Probab. 14 1479–1505.
  • [16] Breyer, L. A. and Roberts, G. O. (2000). From Metropolis to diffusions: Gibbs states and optimal scaling. Stochastic Process. Appl. 90 181–206.
  • [17] Cappé, O., Guillin, A., Marin, J. M. and Robert, C. P. (2004). Population Monte Carlo. J. Comput. Graph. Statist. 13 907–929.
  • [18] Chopin, N. (2002). A sequential particle filter method for static models. Biometrika 89 539–551.
  • [19] Chopin, N. (2004). Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference. Ann. Statist. 32 2385–2411.
  • [20] Crisan, D. and Rozovsky, B. (2011). The Oxford Handbook of Nonlinear Filtering. Oxford Univ. Press, Oxford.
  • [21] Del Moral, P. (2004). Feynman–Kac Formulae: Genealogical and Interacting Particle Systems With Applications. Springer, New York.
  • [22] Del Moral, P., Doucet, A. and Jasra, A. (2006). Sequential Monte Carlo samplers. J. R. Stat. Soc. Ser. B Stat. Methodol. 68 411–436.
  • [23] Del Moral, P., Doucet, A. and Jasra, A. (2012). An adaptive sequential Monte Carlo method for approximate Bayesian computation. Stat. Comput. 22 1009–1020.
  • [24] Del Moral, P., Doucet, A. and Jasra, A. (2012). On adaptive resampling procedures for sequential Monte Carlo methods. Bernoulli 18 252–278.
  • [25] Del Moral, P., Patras, F. and Rubenthaler, S. (2009). Coalescent tree based functional representations for some Feynman–Kac particle models. Ann. Appl. Probab. 19 778–825.
  • [26] Douc, R. and Moulines, E. (2008). Limit theorems for weighted samples with applications to sequential Monte Carlo methods. Ann. Statist. 36 2344–2376.
  • [27] Douc, R., Moulines, E. and Rosenthal, J. S. (2004). Quantitative bounds on convergence of time-inhomogeneous Markov chains. Ann. Appl. Probab. 14 1643–1665.
  • [28] Doucet, A., De Freitas, N. and Gordon, N., eds. (2001). Sequential Monte Carlo Methods in Practice. Springer, New York.
  • [29] Hall, P. and Heyde, C. C. (1980). Martingale Limit Theory and Its Application. Academic Press, New York.
  • [30] Heine, K. and Crisan, D. (2008). Uniform approximations of discrete-time filters. Adv. in Appl. Probab. 40 979–1001.
  • [31] Jarzynski, C. (1997). Nonequilibrium equality for free energy differences. Phys. Rev. Lett. 78 2690–2693.
  • [32] Jasra, A., Stephens, D. A., Doucet, A. and Tsagaris, T. (2011). Inference for Lévy-driven stochastic volatility models via adaptive sequential Monte Carlo. Scand. J. Stat. 38 1–22.
  • [33] Jasra, A., Stephens, D. A. and Holmes, C. C. (2007). On population-based simulation for static inference. Stat. Comput. 17 263–279.
  • [34] Kong, A., Liu, J. S. and Wong, W. H. (1994). Sequential imputations and Bayesian missing data problems. J. Amer. Statist. Assoc. 89 278–288.
  • [35] Künsch, H. R. (2005). Recursive Monte Carlo filters: Algorithms and theoretical analysis. Ann. Statist. 33 1983–2021.
  • [36] Lee, A., Yau, C., Giles, M., Doucet, A. and Holmes, C. C. (2010). On the utility of graphics cards to perform massively parallel implementation of advanced Monte Carlo methods. J. Comput. Graph. Statist. 19 769–789.
  • [37] Liu, J. S. (2001). Monte Carlo Strategies in Scientific Computing. Springer, New York.
  • [38] Mattingly, J. C., Pillai, N. S. and Stuart, A. M. (2012). Diffusion limits of the random walk Metropolis algorithm in high dimensions. Ann. Appl. Probab. 22 881–930.
  • [39] McLeish, D. L. (1974). Dependent central limit theorems and invariance principles. Ann. Probab. 2 620–628.
  • [40] Meyn, S. and Tweedie, R. L. (2009). Markov Chains and Stochastic Stability, 2nd ed. Cambridge Univ. Press, Cambridge.
  • [41] Neal, R. M. (2001). Annealed importance sampling. Stat. Comput. 11 125–139.
  • [42] Pillai, N. S., Stuart, A. M. and Thiéry, A. H. (2012). Optimal scaling and diffusion limits for the Langevin algorithm in high dimensions. Ann. Appl. Probab. 22 2320–2356.
  • [43] Roberts, G. O., Gelman, A. and Gilks, W. R. (1997). Weak convergence and optimal scaling of random walk Metropolis algorithms. Ann. Appl. Probab. 7 110–120.
  • [44] Roberts, G. O. and Rosenthal, J. S. (1998). Optimal scaling of discrete approximations to Langevin diffusions. J. R. Stat. Soc. Ser. B Stat. Methodol. 60 255–268.
  • [45] Rudin, W. (1976). Principles of Mathematical Analysis, 3rd ed. McGraw-Hill, New York.
  • [46] Schäfer, C. and Chopin, N. (2013). Sequential Monte Carlo on large binary sampling spaces. Stat. Comput. 23 163–184.
  • [47] Shiryaev, A. N. (1996). Probability, 2nd ed. Springer, New York.
  • [48] Snyder, C., Bengtsson, T., Bickel, P. and Anderson, J. (2009). Obstacles to high-dimensional particle filtering. Mon. Weather Rev. 136 4629–4640.
  • [49] Whiteley, N. (2012). Sequential Monte Carlo samplers: Error bounds and insensitivity to initial conditions. Stoch. Anal. Appl. 30 774–798.
  • [50] Whiteley, N. (2013). Stability properties of some particle filters. Ann. Appl. Probab. 23 2500–2537.
  • [51] Whiteley, N., Kantas, N. and Jasra, A. (2012). Linear variance bounds for particle approximations of time-homogeneous Feynman–Kac formulae. Stochastic Process. Appl. 122 1840–1865.
  • [52] Withers, C. S. (1981). Central limit theorems for dependent variables. I. Z. Wahrsch. Verw. Gebiete 57 509–534.