Abstract
The basic problem dealt with here is the estimation of linear regression parameters from a set of observations obscured by correlated noise. Two well-known solutions to this problem are minimum variance (or Markov, MV) and least squares (LS) estimation. Although MV is, by definition, an optimal method, LS possesses two distinct advantages which cause it to be used more frequently in practice: (1) computational simplicity and (2) the fact that it does not require knowledge of the correlation matrix of the noise, which in many cases is actually unknown. Therefore, a comparative study of these two methods to determine how much is lost by use of LS instead of MV is of value. In this connection, Grenander and Rosenblatt [1] have derived important asymptotic properties of LS and MV estimates when the noise is a stationary random process. The approach here is somewhat different from theirs, and no assumption regarding stationarity is made. The essence of this analysis is to re-formulate LS and MV in terms of the spectrum of the noise correlation matrix. This procedure offers some new insights into the nature of LS and MV and the differences between them. For example, it shows when they yield the same result and when LS performs worst compared with MV. It also exhibits the roles played by the maximum and minimum eigenvalues of the noise correlation matrix in setting bounds on the covariance matrices of both LS and MV estimates.
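For concreteness, a minimal sketch of the standard setup being compared (the notation below, including the symbols $y$, $X$, $\beta$, and $R$, is introduced here for illustration and is not taken from the paper itself): the observations are modeled as $y = X\beta + \varepsilon$, where the noise $\varepsilon$ has zero mean and correlation matrix $R = E[\varepsilon\varepsilon^{T}]$. The two estimates and their covariance matrices are then

\[
\hat{\beta}_{LS} = (X^{T}X)^{-1}X^{T}y,
\qquad
\operatorname{Cov}\bigl(\hat{\beta}_{LS}\bigr) = (X^{T}X)^{-1}X^{T}RX\,(X^{T}X)^{-1},
\]
\[
\hat{\beta}_{MV} = (X^{T}R^{-1}X)^{-1}X^{T}R^{-1}y,
\qquad
\operatorname{Cov}\bigl(\hat{\beta}_{MV}\bigr) = (X^{T}R^{-1}X)^{-1}.
\]

The MV (Markov) estimate requires $R$, whereas the LS estimate does not; the two obviously coincide, for instance, when $R$ is a multiple of the identity matrix (uncorrelated noise of equal variance). The paper's analysis characterizes more generally when they agree and how far apart their covariance matrices can be, in terms of the eigenvalues of $R$.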
Citation
T. A. Magness, J. B. McGuire. "Comparison of Least Squares and Minimum Variance Estimates of Regression Parameters." Ann. Math. Statist. 33(2): 462-470, June 1962. https://doi.org/10.1214/aoms/1177704573