Open Access
Gauss-Markov Estimation for Multivariate Linear Models: A Coordinate Free Approach
Morris L. Eaton
Ann. Math. Statist. 41(2): 528-538 (April, 1970). DOI: 10.1214/aoms/1177697093

Abstract

The coordinate free (geometric) approach to univariate linear models has added both insight and understanding to the problems of Gauss-Markov (GM) estimation and hypothesis testing. One of the initial papers emphasizing the geometric aspects of univariate linear models is Kruskal's (1961). The coordinate free approach is used in this paper to treat GM estimation in a multivariate analysis context. In contrast to the univariate situation, a central question for multivariate linear models is the existence of GM estimates. Of course, it is the more complicated covariance structure in the multivariate case that creates the concern over the existence of GM estimates. As the emphasis is on GM estimation, first and second moment assumptions (as opposed to distributional assumptions) play the key role. Classical results for the univariate linear model are outlined in Section 1. In addition, a recent theorem due to Kruskal (1968) concerning the equality of GM and Least Squares (LS) estimates is discussed. A minor modification of Kruskal's result gives a very useful necessary and sufficient condition for the existence of GM estimators for arbitrary covariance structures and a fixed regression manifold. In Section 2, the outer product of two vectors and the Kronecker product of linear transformations are discussed and applied to describe the covariance structure of a random matrix. This application includes the case of a random sample from a multivariate population with covariance matrix $\Sigma > 0$ ("$\Sigma > 0$" means that $\Sigma$ is positive definite). The question of GM estimation in the standard multivariate linear model is taken up in Section 3. This model is described as follows: a random matrix $Y: n \times p$, whose rows are uncorrelated and each of whose rows has covariance matrix $\Sigma > 0$, is observed. The mean matrix of $Y$, $\mu$, is assumed to have the form $\mu = ZB$, where $Z: n \times q$ is known and of rank $q$, and $B: q \times p$ is a matrix of regression coefficients. For this model, GM estimators for $\mu$ and $B$ exist and are well known (see Anderson (1958), Chapter 8). The main result in Section 3 establishes a converse to this classical result. Explicitly, let $Y$ have the covariance structure as above and assume $\Omega$ is a fixed regression manifold. It is shown that if a GM estimator for $\mu \in \Omega$ exists, then each element $\mu \in \Omega$ can be written as $\mu = ZB$, where $Z: n \times q$ is fixed and $B: q \times p$ ranges over all $q \times p$ real matrices. The results in Sections 4 and 5 are similar to the main result of Section 3. A complete description of all regression manifolds for which GM estimators exist is given for two different kinds of covariance assumptions concerning $\Sigma$ ($\Sigma$ as above). In Section 4, it is assumed that $\Sigma$ has a block diagonal form with two blocks. Section 5 is concerned with the case when $\Sigma$ has the so-called intra-class correlation form.
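As a concrete illustration of the classical result cited above (GM estimators for $\mu$ and $B$ exist in the standard multivariate linear model and coincide with least squares), the following is a minimal numerical sketch, not taken from the paper. It assumes NumPy and uses illustrative dimensions $n = 20$, $q = 3$, $p = 4$. The model is vectorized as $\mathrm{vec}(Y) = (I_p \otimes Z)\,\mathrm{vec}(B) + \mathrm{vec}(E)$ with $\mathrm{Cov}(\mathrm{vec}(Y)) = \Sigma \otimes I_n$ (the Kronecker covariance structure discussed in Section 2), and the sketch checks numerically that generalized least squares under this covariance reduces to ordinary least squares, so the LS estimate is already Gauss-Markov here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper)
n, q, p = 20, 3, 4

# Design matrix Z (n x q, rank q) and regression coefficients B (q x p)
Z = rng.standard_normal((n, q))
B = rng.standard_normal((q, p))

# Row covariance Sigma > 0 (positive definite)
A = rng.standard_normal((p, p))
Sigma = A @ A.T + p * np.eye(p)

# Generate Y = ZB + E, rows of E uncorrelated, each row with covariance Sigma
E = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
Y = Z @ B + E

# Least squares estimator of B: (Z'Z)^{-1} Z'Y -- does not involve Sigma
B_ls = np.linalg.solve(Z.T @ Z, Z.T @ Y)

# Generalized least squares on the vectorized (column-stacked) model:
#   vec(Y) = (I_p kron Z) vec(B) + vec(E),  Cov(vec Y) = Sigma kron I_n
X = np.kron(np.eye(p), Z)
V = np.kron(Sigma, np.eye(n))
Vinv = np.linalg.inv(V)
vecY = Y.flatten(order="F")          # column-stacking of Y
vecB_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ vecY)
B_gls = vecB_gls.reshape((q, p), order="F")

# The two estimates agree: (Sigma kron I_n) maps the regression manifold
# {vec(ZB)} into itself, since (Sigma kron I_n) vec(ZB) = vec(Z B Sigma),
# so LS is already Gauss-Markov for this covariance structure.
print(np.allclose(B_ls, B_gls))      # True
```

The agreement printed at the end is an instance of the Kruskal (1968) condition mentioned above: LS and GM estimates coincide when the covariance operator maps the regression manifold into itself, which is exactly what the Kronecker covariance $\Sigma \otimes I_n$ does for the manifold $\{ZB\}$.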

Citation


Morris L. Eaton. "Gauss-Markov Estimation for Multivariate Linear Models: A Coordinate Free Approach." Ann. Math. Statist. 41(2): 528-538, April 1970. https://doi.org/10.1214/aoms/1177697093

Information

Published: April, 1970
First available in Project Euclid: 27 April 2007

zbMATH: 0195.20101
MathSciNet: MR264818
Digital Object Identifier: 10.1214/aoms/1177697093

Rights: Copyright © 1970 Institute of Mathematical Statistics
