Open Access
Asymptotically Efficient Estimation of Covariance Matrices with Linear Structure
T. W. Anderson
Ann. Statist. 1(1): 135-141 (January, 1973). DOI: 10.1214/aos/1193342389

Abstract

One or more observations are made on a random vector, whose covariance matrix may be a linear combination of known symmetric matrices and whose mean vector may be a linear combination of known vectors; the coefficients of the linear combinations are unknown parameters to be estimated. Under the assumption of normality, equations are developed for the maximum likelihood estimates. These equations may be solved by an iterative method; in each step a set of linear equations is solved. If consistent estimates of the covariance coefficients $\sigma_0, \sigma_1, \cdots, \sigma_m$ are used to obtain the coefficients of the linear equations, the solution of these equations is asymptotically efficient as the number of observations on the random vector tends to infinity. This result is a consequence of a theorem that the solution of the generalized least squares equations is asymptotically efficient if a consistent estimate of the covariance matrix is used. Applications are made to the components of variance model in the analysis of variance and to the finite moving average model in time series analysis.
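To make the iteration concrete, the following is a minimal numerical sketch in Python (NumPy) of the kind of step the abstract describes, assuming a zero-mean model with covariance matrix $\Sigma = \sum_g \sigma_g G_g$ for known symmetric matrices $G_g$; the function name `iterate_sigma`, the compound-symmetry example, and the zero-mean simplification are illustrative assumptions, not the paper's notation. At each step the current $\Sigma$ fixes the coefficients of a linear system in $\sigma_0, \cdots, \sigma_m$, which is then solved; as the abstract notes, if the coefficients are computed from a consistent initial estimate, the solution of that single linear system is already asymptotically efficient.

```python
import numpy as np

def iterate_sigma(S, G, sigma0, n_iter=20, tol=1e-10):
    # Sketch of the iteration described in the abstract (illustrative names):
    # Sigma = sum_g sigma_g * G[g] with known symmetric G[g]; each step solves
    # the linear equations
    #   sum_h tr(Sigma^{-1} G_g Sigma^{-1} G_h) sigma_h = tr(Sigma^{-1} G_g Sigma^{-1} S),
    # with Sigma evaluated at the previous (or an initial consistent) estimate.
    sigma = np.asarray(sigma0, dtype=float)
    k = len(G)
    for _ in range(n_iter):
        Sigma = sum(s * Gg for s, Gg in zip(sigma, G))
        Sinv = np.linalg.inv(Sigma)
        A = np.array([[np.trace(Sinv @ G[g] @ Sinv @ G[h]) for h in range(k)]
                      for g in range(k)])
        b = np.array([np.trace(Sinv @ G[g] @ Sinv @ S) for g in range(k)])
        new_sigma = np.linalg.solve(A, b)
        if np.max(np.abs(new_sigma - sigma)) < tol:
            return new_sigma
        sigma = new_sigma
    return sigma

# Illustrative example (assumed, not from the paper): compound symmetry,
# Sigma = sigma_0 * I + sigma_1 * J in dimension p = 4, mean known to be zero.
rng = np.random.default_rng(0)
p, N = 4, 2000
G = [np.eye(p), np.ones((p, p))]
Sigma_true = 1.0 * G[0] + 0.5 * G[1]
X = rng.multivariate_normal(np.zeros(p), Sigma_true, size=N)
S = X.T @ X / N                                  # sample second-moment matrix
print(iterate_sigma(S, G, sigma0=[1.0, 0.1]))    # should be close to [1.0, 0.5]
```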

Citation

T. W. Anderson. "Asymptotically Efficient Estimation of Covariance Matrices with Linear Structure." Ann. Statist. 1(1): 135-141, January, 1973. https://doi.org/10.1214/aos/1193342389

Information

Published: January, 1973
First available in Project Euclid: 25 October 2007

zbMATH: 0296.62022
MathSciNet: MR331612
Digital Object Identifier: 10.1214/aos/1193342389

Keywords: asymptotically efficient estimates, covariance matrices, iterative computations, maximum likelihood estimates, moving average process, multivariate normal distribution

Rights: Copyright © 1973 Institute of Mathematical Statistics
