Bernoulli, Volume 23, Number 2 (2017), 990–1021.

Nonparametric regression on hidden $\Phi$-mixing variables: Identifiability and consistency of a pseudo-likelihood based estimation procedure

Thierry Dumont and Sylvain Le Corff



This paper outlines a new nonparametric estimation procedure for unobserved $\Phi$-mixing processes. It is assumed that the only information on the stationary hidden states $(X_{k})_{k\ge0}$ is given by the process $(Y_{k})_{k\ge0}$, where $Y_{k}$ is a noisy observation of $f_{\star}(X_{k})$. The paper introduces a maximum pseudo-likelihood procedure to estimate the function $f_{\star}$ and the distribution $\nu_{b,\star}$ of $(X_{0},\ldots,X_{b-1})$ using blocks of observations of length $b$. The identifiability of the model is studied in the particular cases $b=1$ and $b=2$, and the estimators of $f_{\star}$ and $\nu_{b,\star}$ are shown to be consistent as the number of observations grows to infinity.
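The setting in the abstract (noisy observations $Y_k$ of $f_{\star}(X_k)$, pseudo-likelihood built from blocks of length $b$) can be illustrated with a minimal numerical sketch. Everything below is a toy stand-in, not the paper's procedure: the hidden chain is assumed to be a Gaussian AR(1) (a simple mixing process), $f_{\star}(x)=\sin(x)$, the noise is Gaussian with known variance, the candidate state density $\nu$ is fixed to a standard normal, the block length is $b=1$, and the integral defining the pseudo-likelihood is approximated on a fixed grid.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, sigma=0.3, phi=0.5):
    """Hypothetical data-generating process (not from the paper):
    stationary AR(1) hidden states X_k with unit variance, and
    observations Y_k = sin(X_k) + eps_k, eps_k ~ N(0, sigma^2)."""
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + rng.normal(scale=np.sqrt(1 - phi**2))
    y = np.sin(x) + rng.normal(scale=sigma, size=n)
    return x, y

def pseudo_loglik(f, y, sigma=0.3, grid=None):
    """Block pseudo-log-likelihood for b = 1, for a candidate regression
    function f and a fixed candidate state density nu (standard normal
    here, for illustration). Each observation contributes
        log p(Y_k) = log \int g_sigma(Y_k - f(x)) nu(x) dx,
    with the integral approximated by a Riemann sum on a grid."""
    if grid is None:
        grid = np.linspace(-4.0, 4.0, 401)
    nu = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)   # candidate nu
    w = nu * (grid[1] - grid[0])                     # quadrature weights
    fx = f(grid)
    # Gaussian noise density g_sigma(Y_k - f(x_j)), vectorised over k, j.
    g = np.exp(-(y[:, None] - fx[None, :])**2 / (2 * sigma**2))
    g /= np.sqrt(2 * np.pi) * sigma
    # Sum of log p(Y_k) over all length-1 blocks.
    return float(np.log(g @ w).sum())
```

With a large enough sample, the pseudo-log-likelihood evaluated at the true $f_{\star}$ exceeds that of a misspecified candidate such as $f\equiv 0$; maximising this criterion over a class of functions is the toy analogue of the nonparametric maximisation studied in the paper.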

Article information


Received: February 2014
Revised: July 2015
First available in Project Euclid: 4 February 2017


Keywords: identifiability; maximum likelihood; nonparametric estimation; state space model


Dumont, Thierry; Le Corff, Sylvain. Nonparametric regression on hidden $\Phi$-mixing variables: Identifiability and consistency of a pseudo-likelihood based estimation procedure. Bernoulli 23 (2017), no. 2, 990--1021. doi:10.3150/15-BEJ767.


