Bernoulli, Volume 25, Number 2 (2019), 1504–1535.

Numerically stable online estimation of variance in particle filters

Jimmy Olsson and Randal Douc



This paper discusses variance estimation in sequential Monte Carlo methods, alternatively termed particle filters. The variance estimator that we propose is a natural modification of that suggested by H.P. Chan and T.L. Lai [Ann. Statist. 41 (2013) 2877–2904], which allows the variance to be estimated in a single run of the particle filter by tracing the genealogical history of the particles. However, due to particle lineage degeneracy, that estimator becomes numerically unstable as the number of sequential particle updates increases. Thus, by tracing only a part of the particles' genealogy rather than the full one, our estimator gains long-term numerical stability at the cost of a bias. The scope of the genealogical tracing is regulated by a lag, and under mild, easily checked model assumptions, we prove that the bias tends to zero geometrically fast as the lag increases. As confirmed by our numerical results, this allows the bias to be tightly controlled even for moderate particle sample sizes.
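The estimator described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation but a minimal interpretation of the idea under simplifying assumptions: a bootstrap particle filter for a hypothetical linear-Gaussian AR(1) model, with the Chan–Lai sum over ancestor groups computed using ancestor indices traced only `lag` steps back (replacing the full-genealogy Eve indices), as the abstract proposes. The names `run_filter` and `lagged_variance` and all model parameters are illustrative.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(0)

def lagged_variance(W, fx, anc, N):
    """Chan-Lai-type estimate N * sum_i (sum_{j: anc_j = i} W_j (fx_j - est))^2,
    with the Eve (time-0) indices replaced by the lagged ancestor indices `anc`."""
    est = np.sum(W * fx)
    terms = np.zeros(N)
    np.add.at(terms, anc, W * (fx - est))   # group centred terms by ancestor
    return est, N * np.sum(terms ** 2)

def run_filter(y, N=1000, lag=5, phi=0.9, sx=0.5, sy=1.0):
    """Bootstrap particle filter for X_t = phi X_{t-1} + sx V_t,
    Y_t = X_t + sy W_t (illustrative model). Returns, per time step,
    the filter mean of X_t and the lagged variance estimate."""
    x = rng.normal(0.0, sx / np.sqrt(1.0 - phi ** 2), N)
    maps = deque(maxlen=lag)            # keep only the last `lag` resampling maps
    out = []
    for yt in y:
        logw = -0.5 * ((yt - x) / sy) ** 2
        W = np.exp(logw - logw.max())
        W /= W.sum()
        # compose the stored maps: ancestor of each particle `lag` steps back
        anc = np.arange(N)
        for A in reversed(maps):        # most recent map applied first
            anc = A[anc]
        out.append(lagged_variance(W, x, anc, N))   # test function f(x) = x
        idx = rng.choice(N, size=N, p=W)            # multinomial resampling
        maps.append(idx)
        x = phi * x[idx] + sx * rng.normal(size=N)  # mutation step
    return out
```

Because `maps` has bounded length `lag`, the memory cost per step is O(N · lag) regardless of the time horizon, which is what gives the lagged scheme its long-term stability relative to tracing the full genealogy.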

Article information


Received: January 2017
Revised: October 2017
First available in Project Euclid: 6 March 2019

Digital Object Identifier: doi:10.3150/18-BEJ1028

Keywords: asymptotic variance; Feynman–Kac models; hidden Markov models; particle filters; sequential Monte Carlo methods; state-space models; variance estimation


Olsson, Jimmy; Douc, Randal. Numerically stable online estimation of variance in particle filters. Bernoulli 25 (2019), no. 2, 1504–1535. doi:10.3150/18-BEJ1028.

References


  • [1] Cappé, O., Moulines, E. and Rydén, T. (2005). Inference in Hidden Markov Models. Springer Series in Statistics. New York: Springer.
  • [2] Chan, H.P. and Lai, T.L. (2013). A general theory of particle filters in hidden Markov models and some applications. Ann. Statist. 41 2877–2904.
  • [3] Chopin, N. (2004). Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference. Ann. Statist. 32 2385–2411.
  • [4] Crisan, D. and Heine, K. (2008). Stability of the discrete time filter in terms of the tails of noise distributions. J. Lond. Math. Soc. (2) 78 441–458.
  • [5] Del Moral, P. (2004). Feynman–Kac Formulae: Genealogical and Interacting Particle Systems with Applications. Probability and Its Applications (New York). New York: Springer.
  • [6] Del Moral, P. (2013). Mean Field Simulation for Monte Carlo Integration. Monographs on Statistics and Applied Probability 126. Boca Raton, FL: CRC Press.
  • [7] Del Moral, P. and Guionnet, A. (1999). Central limit theorem for nonlinear filtering and interacting particle systems. Ann. Appl. Probab. 9 275–297.
  • [8] Del Moral, P. and Guionnet, A. (2001). On the stability of interacting processes with applications to filtering and genetic algorithms. Ann. Inst. Henri Poincaré Probab. Stat. 37 155–194.
  • [9] Douc, R., Fort, G., Moulines, E. and Priouret, P. (2009). Forgetting the initial distribution for hidden Markov models. Stochastic Process. Appl. 119 1235–1256.
  • [10] Douc, R. and Moulines, E. (2008). Limit theorems for weighted samples with applications to sequential Monte Carlo methods. Ann. Statist. 36 2344–2376.
  • [11] Douc, R. and Moulines, E. (2012). Asymptotic properties of the maximum likelihood estimation in misspecified hidden Markov models. Ann. Statist. 40 2697–2732.
  • [12] Douc, R., Moulines, E. and Olsson, J. (2014). Long-term stability of sequential Monte Carlo methods under verifiable conditions. Ann. Appl. Probab. 24 1767–1802.
  • [13] Doucet, A., De Freitas, N. and Gordon, N., eds. (2001). Sequential Monte Carlo Methods in Practice. New York: Springer.
  • [14] Gordon, N., Salmond, D. and Smith, A.F.M. (1993). Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proc., F, Radar Signal Process. 140 107–113.
  • [15] Hull, J. and White, A. (1987). The pricing of options on assets with stochastic volatilities. J. Finance 42 281–300.
  • [16] Jacob, P.E., Murray, L.M. and Rubenthaler, S. (2015). Path storage in the particle filter. Stat. Comput. 25 487–496.
  • [17] Kitagawa, G. and Sato, S. (2001). Monte Carlo smoothing and self-organising state-space model. In Sequential Monte Carlo Methods in Practice (A. Doucet, N. De Freitas and N. Gordon, eds.). Stat. Eng. Inf. Sci. 177–195. New York: Springer.
  • [18] Künsch, H.R. (2005). Recursive Monte Carlo filters: Algorithms and theoretical analysis. Ann. Statist. 33 1983–2021.
  • [19] Lee, A. and Whiteley, N. (2016). Variance estimation in particle filters. Preprint. Available at arXiv:1509.00394.
  • [20] Lindsten, F., Schön, T.B. and Olsson, J. (2011). An explicit variance reduction expression for the Rao–Blackwellised particle filter. In Proceedings of the 18th IFAC World Congress 11979–11984.
  • [21] Olsson, J., Cappé, O., Douc, R. and Moulines, E. (2008). Sequential Monte Carlo smoothing with application to parameter estimation in nonlinear state space models. Bernoulli 14 155–179.
  • [22] Olsson, J. and Ströjby, J. (2011). Particle-based likelihood inference in partially observed diffusion processes using generalised Poisson estimators. Electron. J. Stat. 5 1090–1122.
  • [23] Ristic, B., Arulampalam, S. and Gordon, N. (2004). Beyond the Kalman Filter: Particle Filters for Tracking Applications. Norwood: Artech House.