## The Annals of Statistics


### Bounds on the Efficiency of Linear Predictions Using an Incorrect Covariance Function

#### Abstract

Suppose $z(\cdot)$ is a random process defined on a bounded set $R \subset \mathbb{R}^1$ with finite second moments. Consider the behavior of linear predictions based on $z(t_1), \ldots, z(t_n)$, where $t_1, t_2, \ldots$ is a dense sequence of points in $R$. Stein showed that if the second-order structure used to generate the predictions is incorrect but compatible with the correct second-order structure, the predictions obtained are uniformly asymptotically optimal as $n \rightarrow \infty$. In the present paper, a general method is described for obtaining rates of convergence when the covariance function is misspecified but compatible with the correct covariance function. When $z(\cdot)$ is Gaussian, these bounds are related to the entropy distance (the symmetrized Kullback divergence) between the measures for the random field under the actual and presumed covariance functions. Explicit bounds are given when $R = [0, 1]$ and $z(\cdot)$ is stationary with spectral density of the form $f(\lambda) = (a^2 + \lambda^2)^{-p}$, where $p$ is a known positive integer and $a$ is the parameter that is misspecified. More precise results are given in the case $p = 1$. An application of this result implies that equally spaced observations are asymptotically optimal in the sense used by Sacks and Ylvisaker in terms of maximizing the Kullback divergence between the actual and presumed models when $z(\cdot)$ is Gaussian.
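The $p = 1$ case of the abstract can be illustrated numerically: the spectral density $f(\lambda) = (a^2 + \lambda^2)^{-1}$ corresponds to the Ornstein-Uhlenbeck covariance $K(h) = e^{-a|h|}/(2a)$. The sketch below (not from the paper; all names and the choice $a_{\text{true}} = 1$, $a_{\text{used}} = 2$ are illustrative) builds the best linear predictor under a misspecified $a$ on an equally spaced design in $[0, 1]$ and evaluates its mean squared error under the true $a$, so the efficiency loss from misspecification can be compared with the optimal predictor.

```python
import numpy as np

def ou_cov(s, t, a):
    # Covariance with spectral density proportional to (a^2 + lambda^2)^{-1}:
    # K(h) = exp(-a|h|) / (2a)  (the p = 1 case in the abstract).
    return np.exp(-a * np.abs(s - t)) / (2 * a)

def prediction_mse(obs, t0, a_true, a_used):
    """MSE of the linear predictor built with a_used, evaluated under a_true."""
    S, T = np.meshgrid(obs, obs)
    K_used = ou_cov(S, T, a_used)
    k_used = ou_cov(obs, t0, a_used)
    w = np.linalg.solve(K_used, k_used)   # kriging weights under the presumed model
    K_true = ou_cov(S, T, a_true)
    k_true = ou_cov(obs, t0, a_true)
    return ou_cov(t0, t0, a_true) - 2 * (w @ k_true) + w @ K_true @ w

obs = np.linspace(0.0, 1.0, 21)   # equally spaced design on [0, 1]
t0 = 0.475                         # prediction point between observations
mse_opt = prediction_mse(obs, t0, a_true=1.0, a_used=1.0)
mse_wrong = prediction_mse(obs, t0, a_true=1.0, a_used=2.0)
print(mse_wrong / mse_opt)         # efficiency ratio; close to 1 for a dense design
```

For this dense design the ratio is only slightly above 1, consistent with the uniform asymptotic optimality result cited in the abstract: misspecifying $a$ alone costs little for interpolation as the design fills in.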

#### Article information

**Source**

Ann. Statist. Volume 18, Number 3 (1990), 1116-1138.

**Dates**

First available in Project Euclid: 12 April 2007

**Permanent link to this document**

https://projecteuclid.org/euclid.aos/1176347742

**Digital Object Identifier**

doi:10.1214/aos/1176347742

**Mathematical Reviews number (MathSciNet)**

MR1062701

**Zentralblatt MATH identifier**

0749.62061


**Subjects**

Primary: 62M20: Prediction [See also 60G25]; filtering [See also 60G35, 93E10, 93E11]

Secondary: 41A25: Rate of convergence, degree of approximation 60G60: Random fields

**Keywords**

Spatial statistics; approximation in Hilbert spaces; Kullback divergence; design for time series experiments

#### Citation

Stein, Michael L. Bounds on the Efficiency of Linear Predictions Using an Incorrect Covariance Function. Ann. Statist. 18 (1990), no. 3, 1116--1138. doi:10.1214/aos/1176347742. https://projecteuclid.org/euclid.aos/1176347742