Electronic Journal of Statistics

Nonparametric estimation of covariance functions by model selection

Jérémie Bigot, Rolando Biscay, Jean-Michel Loubes, and Lilian Muñiz-Alvarez


Abstract

We propose a model selection approach for covariance estimation of a stochastic process. Under very general assumptions, observing i.i.d. replications of the process at fixed observation points, we construct an estimator of the covariance function by expanding the process onto a collection of basis functions. We study the non-asymptotic properties of this estimator and give a tractable way of selecting the best estimator among a set of candidates. The optimality of the procedure is proved via an oracle inequality, which guarantees that the best model is selected.
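To make the approach concrete, here is a minimal Python sketch, under the assumption that the covariance estimator for a given model is obtained by projecting the empirical covariance of the replications onto the span of a basis matrix, and that the model is chosen by a penalised least-squares criterion. The function names (projected_covariance, select_model) and the penalty constant pen_const are illustrative assumptions, not the paper's calibrated procedure.

```python
import numpy as np

def projected_covariance(X, G):
    """Sketch of a basis-expansion covariance estimator.

    X : (n, p) array, rows are i.i.d. replications observed at p fixed points.
    G : (p, m) basis matrix (columns = basis functions evaluated at the points).
    Returns Pi @ S @ Pi, where S is the empirical covariance and
    Pi = G (G'G)^{-1} G' is the orthogonal projector onto span(G).
    """
    n, p = X.shape
    S = X.T @ X / n                            # empirical (non-centred) covariance
    Pi = G @ np.linalg.solve(G.T @ G, G.T)     # projector onto the span of the basis
    return Pi @ S @ Pi

def select_model(X, bases, pen_const=1.0):
    """Choose among candidate bases by penalised least squares:
    Frobenius residual of the fit plus a penalty increasing with the
    model dimension (pen_const is an illustrative constant)."""
    n, p = X.shape
    S = X.T @ X / n
    best, best_crit = None, np.inf
    for G in bases:
        Sigma_hat = projected_covariance(X, G)
        m = G.shape[1]
        crit = np.sum((S - Sigma_hat) ** 2) + pen_const * m ** 2 / n
        if crit < best_crit:
            best, best_crit = Sigma_hat, crit
    return best
```

For example, bases could be Fourier or spline design matrices of increasing dimension evaluated at the observation points; select_model then returns the fitted covariance matrix for the dimension minimising the penalised criterion.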

Article information

Source
Electron. J. Statist., Volume 4 (2010), 822-855.

Dates
First available in Project Euclid: 8 September 2010

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1283952133

Digital Object Identifier
doi:10.1214/09-EJS493

Mathematical Reviews number (MathSciNet)
MR2684389

Zentralblatt MATH identifier
1329.62365

Subjects
Primary: 62G05 (Estimation); 62G20 (Asymptotic properties)

Keywords
Covariance estimation; model selection; oracle inequality

Citation

Bigot, Jérémie; Biscay, Rolando; Loubes, Jean-Michel; Muñiz-Alvarez, Lilian. Nonparametric estimation of covariance functions by model selection. Electron. J. Statist. 4 (2010), 822--855. doi:10.1214/09-EJS493. https://projecteuclid.org/euclid.ejs/1283952133

