The Annals of Statistics

The BLUE in continuous-time regression models with correlated errors

Holger Dette, Andrey Pepelyshev, and Anatoly Zhigljavsky

Abstract

In this paper, the problem of best linear unbiased estimation is investigated for continuous-time regression models. We prove several general statements concerning the explicit form of the best linear unbiased estimator (BLUE), in particular when the error process is a smooth process with one or several derivatives of the response process available for construction of the estimators. We derive the explicit form of the BLUE for many specific models, including the cases of continuous autoregressive errors of order two and integrated error processes (such as integrated Brownian motion). The results are illustrated by several examples.
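
For a concrete sense of the estimator under study, the following sketch computes the BLUE in a discretised version of the model: with design matrix X, error covariance Sigma and observations y, the BLUE is the generalized least-squares estimator (X' Sigma^{-1} X)^{-1} X' Sigma^{-1} y, with covariance matrix (X' Sigma^{-1} X)^{-1}. The grid, regression functions and covariance kernel used below are illustrative assumptions and are not taken from the paper.

    import numpy as np

    def blue(X, Sigma, y):
        """Best linear unbiased (generalized least-squares) estimator
        for y = X @ theta + eps with Cov(eps) = Sigma."""
        Si_X = np.linalg.solve(Sigma, X)            # Sigma^{-1} X
        Si_y = np.linalg.solve(Sigma, y)            # Sigma^{-1} y
        M = X.T @ Si_X                              # information matrix X' Sigma^{-1} X
        theta_hat = np.linalg.solve(M, X.T @ Si_y)  # BLUE of theta
        return theta_hat, np.linalg.inv(M)          # estimate and its covariance

    # Illustrative example: linear trend on an equidistant grid with
    # exponentially correlated errors (kernel exp(-5|t - s|)).
    n = 200
    t = np.linspace(0.0, 1.0, n)
    X = np.column_stack([np.ones(n), t])
    Sigma = np.exp(-5.0 * np.abs(t[:, None] - t[None, :]))

    rng = np.random.default_rng(0)
    y = X @ np.array([1.0, 2.0]) + rng.multivariate_normal(np.zeros(n), Sigma)

    theta_hat, cov = blue(X, Sigma, y)
    print("BLUE estimate:", theta_hat)
    print("covariance of the BLUE:\n", cov)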

Article information

Source
Ann. Statist., Volume 47, Number 4 (2019), 1928-1959.

Dates
Received: October 2017
Revised: May 2018
First available in Project Euclid: 21 May 2019

Permanent link to this document
https://projecteuclid.org/euclid.aos/1558425635

Digital Object Identifier
doi:10.1214/18-AOS1734

Mathematical Reviews number (MathSciNet)
MR3953440

Zentralblatt MATH identifier
07082275

Subjects
Primary: 62M10: Time series, auto-correlation, regression, etc. [See also 91B84]
Secondary: 62M09: Non-Markovian processes: estimation

Keywords
Linear regression; correlated observations; signed measures; optimal design; BLUE; AR processes; continuous autoregressive model

Citation

Dette, Holger; Pepelyshev, Andrey; Zhigljavsky, Anatoly. The BLUE in continuous-time regression models with correlated errors. Ann. Statist. 47 (2019), no. 4, 1928--1959. doi:10.1214/18-AOS1734. https://projecteuclid.org/euclid.aos/1558425635

References

  • Anderes, E. (2010). On the consistent separation of scale and variance for Gaussian random fields. Ann. Statist. 38 870–893.
  • Anderson, T. W. (1970). Efficient estimation of regression coefficients in time series. Technical report, DTIC document.
  • Berlinet, A. and Thomas-Agnan, C. (2011). Reproducing Kernel Hilbert Spaces in Probability and Statistics. Kluwer, Boston.
  • Brockwell, P. J. (2001). Continuous-time ARMA processes. In Stochastic Processes: Theory and Methods. Handbook of Statist. 19 249–276. North-Holland, Amsterdam.
  • Brockwell, P., Davis, R. and Yang, Y. (2007). Continuous-time Gaussian autoregression. Statist. Sinica 17 63–80.
  • Dette, H., Konstantinou, M. and Zhigljavsky, A. (2017). A new approach to optimal designs for correlated observations. Ann. Statist. 45 1579–1608.
  • Dette, H., Pepelyshev, A. and Zhigljavsky, A. (2013). Optimal design for linear models with correlated observations. Ann. Statist. 41 143–176.
  • Dette, H., Pepelyshev, A. and Zhigljavsky, A. (2016). Optimal designs in regression with correlated errors. Ann. Statist. 44 113–152.
  • Dette, H., Pepelyshev, A. and Zhigljavsky, A. (2019). Supplement to “The BLUE in continuous-time regression models with correlated errors.” DOI:10.1214/18-AOS1734SUPP.
  • Fedorov, V. V. and Müller, W. G. (2007). Optimum design for correlated fields via covariance kernel expansions. In MODa 8—Advances in Model-Oriented Design and Analysis. 57–66. Physica-Verlag/Springer, Heidelberg.
  • Freeden, W. (1999). Multiscale Modelling of Spaceborne Geodata. B. G. Teubner, Stuttgart.
  • Gantmacher, F. R. (1959). The Theory of Matrices. Vols. 1, 2. Chelsea Publishing Co., New York. Translated by K. A. Hirsch.
  • Grenander, U. (1950). Stochastic processes and statistical inference. Ark. Mat. 1 195–277.
  • Grenander, U. (1954). On the estimation of regression coefficients in the case of an autocorrelated disturbance. Ann. Math. Stat. 25 252–272.
  • Hájek, J. (1956). Linear estimation of the mean value of a stationary random process with convex correlation function. Czechoslovak Math. J. 6 94–117.
  • Hannan, E. J. (1975). Linear regression in continuous time. J. Aust. Math. Soc. 19 146–159.
  • He, S. W. and Wang, J. G. (1989). On embedding a discrete-parameter ARMA model in a continuous-parameter ARMA model. J. Time Series Anal. 10 315–323.
  • Kanwal, R. P. (1997). Linear Integral Equations, 2nd ed. Birkhäuser, Inc., Boston, MA.
  • Kholevo, A. S. (1969). On estimates of regression coefficients. Theory Probab. Appl. 14 79–104.
  • Loh, W.-L. and Lam, T.-K. (2000). Estimating structured correlation matrices in smooth Gaussian random field models. Ann. Statist. 28 880–904.
  • Morris, M. D., Mitchell, T. J. and Ylvisaker, D. (1993). Bayesian design and analysis of computer experiments: Use of derivatives in surface prediction. Technometrics 35 243–255.
  • Müller, W. G. and Pázman, A. (2003). Measures for designs in experiments with correlated errors. Biometrika 90 423–434.
  • Näther, W. (1985). Effective Observation of Random Fields. Teubner Verlagsgesellschaft, Leipzig.
  • Osborne, M. A., Garnett, R. and Roberts, S. J. (2009). Gaussian processes for global optimization. In 3rd International Conference on Learning and Intelligent Optimization (LION3) 1–15. Citeseer.
  • Parzen, E. (1961). An approach to time series analysis. Ann. Math. Stat. 32 951–989.
  • Pronzato, L. and Müller, W. G. (2012). Design of computer experiments: Space filling and beyond. Stat. Comput. 22 681–701.
  • Ramm, A. G. (1980). Theory and Applications of Some New Classes of Integral Equations. Springer, New York.
  • Rasmussen, C. E. and Williams, C. K. I. (2006). Gaussian Processes for Machine Learning. MIT Press, Cambridge, MA.
  • Ritter, K. (2000). Average-Case Analysis of Numerical Problems. Lecture Notes in Math. 1733. Springer, Berlin.
  • Ritter, K., Wasilkowski, G. W. and Woźniakowski, H. (1995). Multivariate integration and approximation for random fields satisfying Sacks–Ylvisaker conditions. Ann. Appl. Probab. 5 518–540.
  • Rosenblatt, M. (1956). Some regression problems in time series analysis. In Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, 1954–1955, Vol. I 165–186. Univ. California Press, Berkeley.
  • Sacks, J. and Ylvisaker, N. D. (1966). Designs for regression problems with correlated errors. Ann. Math. Stat. 37 66–89.
  • Sacks, J. and Ylvisaker, D. (1968). Designs for regression problems with correlated errors; many parameters. Ann. Math. Stat. 39 49–69.
  • Sacks, J. and Ylvisaker, D. (1970). Designs for regression problems with correlated errors. III. Ann. Math. Stat. 41 2057–2074.
  • Sejdinovic, D., Sriperumbudur, B., Gretton, A. and Fukumizu, K. (2013). Equivalence of distance-based and RKHS-based statistics in hypothesis testing. Ann. Statist. 41 2263–2291.
  • Stein, M. L. (2012). Interpolation of Spatial Data: Some Theory for Kriging. Springer, New York.
  • Székely, G. J. and Rizzo, M. L. (2013). Energy statistics: A class of statistics based on distances. J. Statist. Plann. Inference 143 1249–1272.
  • Yaglom, A. M. (1987). Correlation Theory of Stationary and Related Random Functions. Vol. I: Basic Results. Springer, New York.

Supplemental materials

  • Supplement to “The BLUE in continuous-time regression models with correlated errors”. We demonstrate by example that the covariance matrix of the BLUE for the model (1.1) with observations on the interval can be obtained as the limit of the covariance matrices of the BLUE in the discrete regression model (1.2) with observations at equidistant points and a discrete AR(2) error process; a small numerical sketch of this construction follows.
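
The limiting construction described above can be mimicked numerically. The sketch below computes the covariance matrix (X' Sigma_n^{-1} X)^{-1} of the discrete BLUE on an equidistant grid with stationary AR(2) errors. The regression functions and the AR(2) parameters are illustrative assumptions; the rescaling of these parameters with the grid size that yields the continuous-time limit is the subject of the supplement and is not reproduced here.

    import numpy as np

    def ar2_correlations(phi1, phi2, n):
        """Autocorrelations rho_0, ..., rho_{n-1} of a stationary AR(2)
        process, computed from the Yule-Walker recursion."""
        rho = np.empty(n)
        rho[0] = 1.0
        rho[1] = phi1 / (1.0 - phi2)
        for k in range(2, n):
            rho[k] = phi1 * rho[k - 1] + phi2 * rho[k - 2]
        return rho

    def discrete_blue_cov(n, phi1, phi2):
        """Covariance matrix of the BLUE for a linear trend observed at n
        equidistant points in [0, 1] with stationary AR(2) errors."""
        t = np.linspace(0.0, 1.0, n)
        X = np.column_stack([np.ones(n), t])   # illustrative regression functions (1, t)
        lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
        Sigma = ar2_correlations(phi1, phi2, n)[lags]
        return np.linalg.inv(X.T @ np.linalg.solve(Sigma, X))

    # Covariance of the discrete BLUE for a few grid sizes; recovering the
    # continuous-time limit would additionally require tying phi1, phi2 to
    # the grid spacing via the CAR(2) embedding.
    for n in (50, 200, 800):
        print(n, discrete_blue_cov(n, phi1=0.5, phi2=-0.3).round(5))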