One or more observations are made on a random vector whose covariance matrix may be a linear combination of known symmetric matrices, with unknown coefficients $\sigma_0, \sigma_1, \cdots, \sigma_m$, and whose mean vector may be a linear combination of known vectors; the coefficients of the two linear combinations are the unknown parameters to be estimated. Under the assumption of normality, equations are developed for the maximum likelihood estimates. These equations may be solved by an iterative method in which each step solves a set of linear equations. If consistent estimates of $\sigma_0, \sigma_1, \cdots, \sigma_m$ are used to obtain the coefficients of the linear equations, the solution of these equations is asymptotically efficient as the number of observations on the random vector tends to infinity. This result is a consequence of a theorem that the solution of the generalized least squares equations is asymptotically efficient when a consistent estimate of the covariance matrix is used. Applications are made to the components-of-variance model in the analysis of variance and to the finite moving average model in time series analysis.
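The abstract itself does not display the estimating equations, so the following NumPy sketch is only an illustration of the kind of iteration described: the covariance matrix is assumed to have the form $\Sigma = \sum_g \sigma_g G_g$ for known symmetric matrices $G_g$, each step solves a set of linear equations in $\sigma_0, \ldots, \sigma_m$ (here taken in a trace form consistent with likelihood scoring for such models) together with a generalized least squares solve for the mean coefficients. The function name, the trace-equation form, and all variable names are assumptions for this sketch, not quotations from the paper.

```python
import numpy as np

def fit_linear_covariance(X, Gs, Z, n_iter=20):
    """Sketch of iterative estimation for x ~ N(Z @ beta, sum_g sigma_g * Gs[g]).

    X  : (n, p) array of n observations on the p-dimensional random vector
    Gs : list of known symmetric (p, p) matrices G_0, ..., G_m
    Z  : (p, k) matrix whose columns are the known mean-structure vectors
    Each iteration solves one set of linear equations, as in the abstract.
    """
    n, p = X.shape
    Sigma = np.eye(p)  # assumed starting value; not specified in the abstract
    for _ in range(n_iter):
        Sinv = np.linalg.inv(Sigma)
        # generalized least squares step for the mean coefficients
        beta = np.linalg.solve(Z.T @ Sinv @ Z, Z.T @ Sinv @ X.mean(axis=0))
        R = X - Z @ beta          # residuals from the fitted mean
        S = R.T @ R / n           # residual second-moment matrix
        # assumed linear equations for sigma (scoring/trace form):
        #   sum_h tr(Sinv G_g Sinv G_h) sigma_h = tr(Sinv G_g Sinv S)
        A = np.array([[np.trace(Sinv @ Gg @ Sinv @ Gh) for Gh in Gs]
                      for Gg in Gs])
        b = np.array([np.trace(Sinv @ Gg @ Sinv @ S) for Gg in Gs])
        sigma = np.linalg.solve(A, b)
        Sigma = sum(s * G for s, G in zip(sigma, Gs))
    return sigma, beta
```

For a components-of-variance flavor one can take, say, $G_0 = I$ and $G_1$ the matrix of ones; with simulated normal data the iteration then recovers the generating coefficients, consistent with the asymptotic-efficiency claim as the number of observations grows.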
"Asymptotically Efficient Estimation of Covariance Matrices with Linear Structure." Ann. Statist. 1 (1) 135 - 141, January, 1973. https://doi.org/10.1214/aos/1193342389