December, 1969 Expression of Variance-Component Estimators as Linear Combinations of Independent Noncentral Chi-Square Variates
David A. Harville
Ann. Math. Statist. 40(6): 2189-2194 (December, 1969). DOI: 10.1214/aoms/1177697297

## Abstract

It is well known that any quadratic form in random variables whose joint distribution is nondegenerate multivariate normal is distributed as a linear combination of independent noncentral chi-square variables. Thus, when normality holds, quadratic estimators of the components of variance associated with Eisenhart's Model II are so distributed, even when the design is unbalanced. The problem considered here is that of determining the appropriate linear combinations for given estimators and designs. Expressing quadratic variance-component estimators as linear combinations of independent noncentral chi-squares is useful, for several reasons, in studying the distributions of the estimators: (i) it permits application of results like those of Press on the distributions of such linear combinations; (ii) it leads to Monte Carlo techniques for approximating the distributions of these estimators which are more efficient than those heretofore used; and (iii) it may give some insight into the ways in which various types of imbalance affect the distributions of the estimators. We now define the transformation to be used in expressing quadratic forms as linear combinations of independent noncentral chi-squares. Let $\beta$ represent an $n \times n$ real symmetric matrix. Take $\mathbf{y}$ to be an $n \times 1$ random vector having the multivariate normal distribution with mean $\mathbf{u}$ and symmetric, nonsingular variance-covariance matrix $\mathbf{V}$.
Let $\mathbf{C}$ be an $n \times n$ orthogonal matrix whose columns are eigenvectors of $\mathbf{V}$; let $\mathbf{S}$ be the $n \times n$ diagonal matrix whose $i$th diagonal element is the square root of the eigenvalue corresponding to the eigenvector in the $i$th column of $\mathbf{C}$; and let $\mathbf{W}$ be an $n \times n$ orthogonal matrix whose columns are eigenvectors of the necessarily symmetric matrix $\mathbf{P} = \mathbf{SC}'\beta\mathbf{CS}$. Clearly, the $n \times 1$ random vector $\mathbf{z}$ defined by the linear transformation $\mathbf{z} = \mathbf{W}'\mathbf{S}^{-1}\mathbf{C}'\mathbf{y}$ has the multivariate normal distribution with \begin{equation*} \tag{1} E\{\mathbf{z}\} = \mathbf{W}'\mathbf{S}^{-1}\mathbf{C}'\mathbf{u}\end{equation*} and variance-covariance matrix $\mathbf{I}$ (the identity matrix). Then the distribution of the quadratic form $\mathbf{y}'\beta\mathbf{y}$ is the same as that of $\mathbf{z}'\mathbf{Dz}$, where $\mathbf{D} = \mathbf{W}'\mathbf{PW}$ is an $n \times n$ diagonal matrix whose $i$th diagonal element is the eigenvalue of $\mathbf{P}$ corresponding to the eigenvector represented by the $i$th column of $\mathbf{W}$. It is clear from the above that, for any specified $\mathbf{V}$ and $\beta$, the expression of $\mathbf{y}'\beta\mathbf{y}$ as a linear combination of independent noncentral chi-squares can be accomplished by determining $\mathbf{C}, \mathbf{S}$, and $\mathbf{W}$. The linear combination so obtained can be readily converted into a linear combination of independent noncentral chi-squares having distinct coefficients. It follows from the "only if" part of Theorem 1 of Baldessari that this latter expression is unique; i.e., for the specified $\mathbf{V}, \beta$ pair, there exists no other linear combination of independent noncentral chi-squares, with distinct coefficients, having the same distribution as $\mathbf{y}'\beta\mathbf{y}$.
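As a concrete illustration of the construction above (not part of the original paper), the following NumPy sketch builds $\mathbf{C}$, $\mathbf{S}$, and $\mathbf{W}$ for randomly generated $\mathbf{V}$ and $\beta$, and checks that $\mathbf{y}'\beta\mathbf{y} = \mathbf{z}'\mathbf{Dz}$ holds identically, with the coefficients of the chi-square combination given by the eigenvalues of $\mathbf{P}$. All inputs here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Illustrative inputs: a real symmetric matrix B (the paper's beta) and a
# symmetric positive-definite variance-covariance matrix V.
A = rng.standard_normal((n, n))
B = (A + A.T) / 2
M = rng.standard_normal((n, n))
V = M @ M.T + n * np.eye(n)        # shifted to guarantee nonsingularity

# C: orthogonal, columns are eigenvectors of V.
# S: diagonal, ith entry = square root of the corresponding eigenvalue.
lam, C = np.linalg.eigh(V)
S = np.diag(np.sqrt(lam))

# P = S C' B C S is symmetric; W holds its eigenvectors, and the
# eigenvalues d are the coefficients of the chi-square combination.
P = S @ C.T @ B @ C @ S
d, W = np.linalg.eigh(P)

# z = W' S^{-1} C' y has identity covariance, so y'By is distributed as
# sum_i d_i * z_i^2, each z_i^2 a noncentral chi-square on 1 df with
# noncentrality parameter (W' S^{-1} C' u)_i^2.
u = rng.standard_normal(n)
y = rng.multivariate_normal(u, V)
z = W.T @ np.linalg.inv(S) @ C.T @ y

# The identity y'By = z'Dz holds pointwise, not just in distribution,
# because W D W' = P and C, W are orthogonal.
print(np.isclose(y @ B @ y, z @ np.diag(d) @ z))
```

Note that `np.linalg.eigh` already returns orthonormal eigenvectors for a symmetric matrix, which is exactly what the orthogonality of $\mathbf{C}$ and $\mathbf{W}$ requires; the nonsingularity of $\mathbf{V}$ guarantees that $\mathbf{S}^{-1}$ exists.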
In Sections 2-4, we study the determination of the $\mathbf{C, S}$, and $\mathbf{W}$ matrices for estimators of the variance components associated with the one-way random classification. While the techniques used here for the one-way classification can also be applied to estimators of the components associated with higher classifications, the complexity of the problem is much greater for the more-complicated models. In Section 2, results that are applicable to any quadratic form in the one-way data are given. Some special results are also presented on those quadratic estimators having a certain invariance property to be called `$\mu$-invariance.' It is shown that an estimator exhibiting this property is distributed as a linear combination of independent central chi-squares, and consequently it is not necessary to find a $\mathbf{W}$ matrix for such an estimator. In Section 3, attention is restricted to a certain subclass of quadratic estimators which includes all those in common usage, and, in Section 4, we further restrict ourselves to some cases where the subclass numbers display a very simple type of imbalance.

## Citation

David A. Harville. "Expression of Variance-Component Estimators as Linear Combinations of Independent Noncentral Chi-Square Variates." Ann. Math. Statist. 40(6): 2189-2194, December, 1969. https://doi.org/10.1214/aoms/1177697297

## Information

Published: December, 1969
First available in Project Euclid: 27 April 2007

zbMATH: 0205.46201
Digital Object Identifier: 10.1214/aoms/1177697297  