Open Access
On the stability of sequential Monte Carlo methods in high dimensions
Alexandros Beskos, Dan Crisan, Ajay Jasra
Ann. Appl. Probab. 24(4): 1396-1445 (August 2014). DOI: 10.1214/13-AAP951

Abstract

We investigate the stability of a Sequential Monte Carlo (SMC) method applied to the problem of sampling from a target distribution on $\mathbb{R}^{d}$ for large $d$. It is well known [Bengtsson, Bickel and Li, In Probability and Statistics: Essays in Honor of David A. Freedman, D. Nolan and T. Speed, eds. (2008) 316–334 IMS; see also Pushing the Limits of Contemporary Statistics (2008) 318–329 IMS, Mon. Weather Rev. 136 (2008) 4629–4640] that using a single importance sampling step, one produces an approximation for the target that deteriorates as the dimension $d$ increases, unless the number of Monte Carlo samples $N$ increases at an exponential rate in $d$. We show that this degeneracy can be avoided by introducing a sequence of artificial targets, starting from a “simple” density and moving to the one of interest, using an SMC method to sample from the sequence; see, for example, Chopin [Biometrika 89 (2002) 539–551]; see also [J. R. Stat. Soc. Ser. B Stat. Methodol. 68 (2006) 411–436, Phys. Rev. Lett. 78 (1997) 2690–2693, Stat. Comput. 11 (2001) 125–139]. Using this class of SMC methods with a fixed number of samples, one can produce an approximation for which the effective sample size (ESS) converges to a random variable $\varepsilon_{N}$ as $d\rightarrow\infty$ with $1<\varepsilon_{N}<N$. The convergence is achieved with a computational cost proportional to $Nd^{2}$. If $\varepsilon_{N}\ll N$, we can raise its value by introducing a number of resampling steps, say $m$ (where $m$ is independent of $d$). In this case, the ESS converges to a random variable $\varepsilon_{N,m}$ as $d\rightarrow\infty$ and $\lim_{m\to\infty}\varepsilon_{N,m}=N$. Also, we show that the Monte Carlo error for estimating a fixed-dimensional marginal expectation is of order $\frac{1}{\sqrt{N}}$ uniformly in $d$.
The results imply that, in high dimensions, SMC algorithms can efficiently control the variability of the importance sampling weights and estimate fixed-dimensional marginals at a cost which is less than exponential in $d$, and indicate that resampling leads to a reduction in the Monte Carlo error and an increase in the ESS. All of our analysis is carried out under the assumption that the target density has an i.i.d. product structure.
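The contrast drawn in the abstract can be illustrated numerically. The sketch below is an illustration only, not the paper's algorithm: the i.i.d. standard-normal target, the $N(0,\sigma^{2})$ proposal, the linear tempering schedule, and the ESS $< N/2$ resampling rule are all choices made here for demonstration, and the MCMC mutation steps that the paper's analysis relies on are omitted. It compares the ESS of a single importance-sampling step with that of a tempered SMC pass as $d$ grows, with $N$ held fixed.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000       # number of particles (fixed, independent of d)
SIGMA = 1.2    # proposal std; target is i.i.d. standard normal

def ess(logw):
    """Effective sample size 1 / sum(w_i^2) of the normalised weights."""
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

def log_ratio(x):
    """Per-particle log(target/proposal), summed over the d coordinates."""
    return (-0.5 * x**2 + 0.5 * (x / SIGMA)**2 + np.log(SIGMA)).sum(axis=1)

def single_step_ess(d):
    """One importance-sampling step: the ESS collapses as d grows."""
    x = rng.normal(0.0, SIGMA, size=(N, d))
    return ess(log_ratio(x))

def tempered_smc_ess(d, n_steps=50):
    """SMC through tempered targets pi_beta ∝ q^(1-beta) * pi^beta,
    resampling whenever ESS < N/2 (a common heuristic)."""
    x = rng.normal(0.0, SIGMA, size=(N, d))
    logw = np.zeros(N)
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        logw += (b1 - b0) * log_ratio(x)   # incremental importance weight
        if ess(logw) < N / 2:
            w = np.exp(logw - logw.max())
            w /= w.sum()
            x = x[rng.choice(N, size=N, p=w)]
            logw = np.zeros(N)
            # a full SMC sampler would now apply an MCMC move leaving the
            # current tempered target invariant; omitted for brevity
    return ess(logw)

for d in (2, 50, 200):
    print(d, round(single_step_ess(d)), round(tempered_smc_ess(d)))
```

With $N$ fixed, the single-step ESS decays rapidly in $d$, while the tempered pass keeps it a non-trivial fraction of $N$; in the paper's analysis it is the mutation steps between temperatures, together with resampling, that make this behaviour stable as $d\rightarrow\infty$.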

Citation

Alexandros Beskos. Dan Crisan. Ajay Jasra. "On the stability of sequential Monte Carlo methods in high dimensions." Ann. Appl. Probab. 24 (4) 1396–1445, August 2014. https://doi.org/10.1214/13-AAP951

Information

Published: August 2014
First available in Project Euclid: 14 May 2014

zbMATH: 1304.82070
MathSciNet: MR3211000
Digital Object Identifier: 10.1214/13-AAP951

Subjects:
Primary: 60K35, 82C80
Secondary: 60F99, 62F15

Keywords: functional CLT, high dimensions, resampling, sequential Monte Carlo

Rights: Copyright © 2014 Institute of Mathematical Statistics
