The Annals of Statistics

Recursive Monte Carlo filters: Algorithms and theoretical analysis

Hans R. Künsch

Full-text: Open access


Recursive Monte Carlo filters, also called particle filters, are a powerful tool to perform computations in general state space models. We discuss and compare the accept–reject version with the more common sampling importance resampling version of the algorithm. In particular, we show how auxiliary variable methods and stratification can be used in the accept–reject version, and we compare different resampling techniques. In a second part, we show laws of large numbers and a central limit theorem for these Monte Carlo filters by simple induction arguments that need only weak conditions. We also show that, under stronger conditions, the required sample size is independent of the length of the observed series.
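As a rough illustration of the sampling importance resampling update and the stratified resampling the abstract refers to, here is a minimal sketch in Python. The state space model (an AR(1) state with Gaussian observations), all parameter values, and the function names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def stratified_resample(weights, rng):
    """Stratified resampling: one uniform draw in each stratum [i/N, (i+1)/N)."""
    n = len(weights)
    positions = (rng.random(n) + np.arange(n)) / n   # one point per stratum
    idx = np.searchsorted(np.cumsum(weights), positions)
    return np.minimum(idx, n - 1)                    # guard against round-off at 1.0

def sir_step(particles, y, rng, phi=0.9, sigma_x=0.3, sigma_y=0.5):
    """One sampling importance resampling update for an illustrative model:
    x_t = phi * x_{t-1} + eps,  y_t | x_t ~ N(x_t, sigma_y^2)."""
    # Propagate particles through the (assumed) state transition
    particles = phi * particles + rng.normal(0.0, sigma_x, size=particles.shape)
    # Importance weights proportional to the observation density
    logw = -0.5 * ((y - particles) / sigma_y) ** 2
    w = np.exp(logw - logw.max())                    # stabilize before normalizing
    w /= w.sum()
    # Stratified resampling equalizes weights with lower variance
    # than plain multinomial resampling
    return particles[stratified_resample(w, rng)]
```

Iterating `sir_step` over an observed series yields a particle approximation of the filter distribution at each time point; stratified resampling is one of the resampling variants compared in the paper.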

Article information

Ann. Statist., Volume 33, Number 5 (2005), 1983-2021.

First available in Project Euclid: 25 November 2005


Primary: 62M09: Non-Markovian processes: estimation
Secondary: 60G35: Signal detection and filtering [See also 62M20, 93E10, 93E11, 94Axx]
Secondary: 60J22: Computational methods in Markov chains [See also 65C40]
Secondary: 65C05: Monte Carlo methods

Keywords: State space models; hidden Markov models; filtering and smoothing; particle filters; auxiliary variables; sampling importance resampling; central limit theorem


Künsch, Hans R. Recursive Monte Carlo filters: Algorithms and theoretical analysis. Ann. Statist. 33 (2005), no. 5, 1983–2021. doi:10.1214/009053605000000426.


