Bernoulli, Volume 20, Number 4 (2014), 1879–1929.

Particle-kernel estimation of the filter density in state-space models

Dan Crisan and Joaquín Míguez

Full-text: Open access


Sequential Monte Carlo (SMC) methods, also known as particle filters, are simulation-based recursive algorithms for approximating the a posteriori probability measures generated by state-space dynamical models. At any given time $t$, an SMC method produces a set of samples over the state space of the system of interest (often termed “particles”) that is used to build a discrete, random approximation of the posterior probability distribution of the state variables, conditional on the sequence of available observations. One potential application of the methodology is the estimation of the densities associated with the sequence of a posteriori distributions. While practitioners have rather freely applied such density approximations in the past, the issue has received less attention from a theoretical perspective. In this paper, we address the problem of constructing kernel-based estimates of the posterior probability density function and its derivatives, and obtain asymptotic convergence results for the estimation errors. In particular, we find convergence rates for the approximation errors that hold uniformly on the state space and guarantee that the error vanishes almost surely as the number of particles in the filter grows. Based on this uniform convergence result, we first show how to build continuous measures that converge almost surely (with known rate) toward the posterior measure, and then address a few applications. The latter include maximum a posteriori estimation of the system state using the approximate derivatives of the posterior density, and the approximation of functionals of the density, for example, Shannon’s entropy.
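The abstract's pipeline can be illustrated with a minimal sketch: a bootstrap particle filter produces a particle cloud approximating the filter distribution, and a Gaussian kernel smoother turns that cloud into a density estimate. This is an illustration only, not the authors' construction; the 1-D linear-Gaussian model, its parameters, and the Silverman rule-of-thumb bandwidth are all assumptions chosen for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D state-space model (illustrative):
#   x_t = a * x_{t-1} + process noise,  y_t = x_t + observation noise
T, N = 50, 2000          # time steps, number of particles
a, sx, sy = 0.9, 1.0, 0.5

# Simulate a synthetic state trajectory and observations
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + sx * rng.standard_normal()
    y[t] = x_true[t] + sy * rng.standard_normal()

# Bootstrap particle filter: propagate, weight by likelihood, resample
particles = rng.standard_normal(N)
for t in range(1, T):
    particles = a * particles + sx * rng.standard_normal(N)   # propagate
    logw = -0.5 * ((y[t] - particles) / sy) ** 2              # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    particles = particles[rng.choice(N, size=N, p=w)]         # multinomial resampling

# Gaussian kernel estimate of the filter density p(x_T | y_{1:T})
h = 1.06 * particles.std() * N ** (-1 / 5)                    # Silverman's rule of thumb

def kde(x):
    """Evaluate the kernel density estimate on a grid of points x."""
    u = (x - particles[:, None]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=0) / (N * h * np.sqrt(2 * np.pi))

grid = np.linspace(particles.min() - 1.0, particles.max() + 1.0, 400)
dens = kde(grid)
# Riemann sum of the estimated density: total mass ≈ 1
print(float(dens.sum() * (grid[1] - grid[0])))
```

The paper's results concern exactly this kind of particle-kernel estimate: how the bandwidth should shrink with the number of particles so that the density error vanishes uniformly and almost surely.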

Article information


First available in Project Euclid: 19 September 2014

Digital Object Identifier: doi:10.3150/13-BEJ545

Keywords: density estimation; Markov systems; particle filtering; sequential Monte Carlo; state-space models; stochastic filtering


Crisan, Dan; Míguez, Joaquín. Particle-kernel estimation of the filter density in state-space models. Bernoulli 20 (2014), no. 4, 1879--1929. doi:10.3150/13-BEJ545.


