## Abstract

It is well known, in testing an hypothesis concerning the means of several independent normal distributions with common but unknown variance $\sigma^2$, that the likelihood ratio $\lambda$, raised to an appropriate positive power, is equal to the ratio of two quadratic forms. That is, there exists a positive constant $c$ so that $\lambda^c = \mathbf{X'AX/X'BX}$, where $\mathbf{A}$ and $\mathbf{B}$ are real symmetric matrices and $\mathbf{X}$ is a column matrix whose elements have independent normal distributions. Since $\lambda \leqq 1$, we see that $\mathbf{X'(B - A)X}$ is a non-negative quadratic form. In addition, we usually find that $\mathbf{X'AX}/\sigma^2$ and $\mathbf{X'BX}/\sigma^2$ have chi-square distributions; that is, $\mathbf{A}^2 = \mathbf{A}$ and $\mathbf{B}^2 = \mathbf{B}$. Consequently, in accordance with a theorem of Hogg and Craig [5], $\mathbf{A(B - A)} = \mathbf{0}$ and $(\mathbf{B - A})^2 = \mathbf{B - A}$. Hence $\mathbf{X'AX}$ and $\mathbf{X'(B - A)X}$ are stochastically independent and $\mathbf{X'(B - A)X}/\sigma^2$ is chi-square. It follows that $\lambda^c$ has a beta distribution provided each chi-square is central; this is usually the case under the null hypothesis.

The analogous situation in multivariate statistical analysis introduces a rather intriguing theorem which almost seems obvious upon first inspection. We describe this situation after we present some notation and certain preliminary results. Let the $(n \times p)$ matrix $\mathbf{X} = (x_{ij})$ have the p.d.f.
$$\frac{\exp\{-\tfrac{1}{2}\operatorname{tr}[\mathbf{K}^{-1}(\mathbf{X} - \boldsymbol{\mu})'\mathbf{V}^{-1}(\mathbf{X} - \boldsymbol{\mu})]\}}{(2\pi)^{np/2}\,|\mathbf{K}|^{n/2}\,|\mathbf{V}|^{p/2}}, \qquad -\infty < x_{ij} < \infty,$$
where the $(n \times n)$ matrix $\mathbf{V}$ and the $(p \times p)$ matrix $\mathbf{K}$ are real symmetric positive definite matrices and $\boldsymbol{\mu}$ is a real $(n \times p)$ matrix.
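The univariate Hogg–Craig argument above can be checked numerically. The sketch below is a minimal illustration (not from the paper), assuming a hypothetical one-way ANOVA layout with $n = 6$ observations in two groups of three: $\mathbf{A}$ is the within-groups projection and $\mathbf{B}$ the total projection about the grand mean, both idempotent, with $\mathbf{B - A}$ non-negative definite.

```python
import numpy as np

# Hypothetical one-way layout: n = 6 observations in two groups of 3.
n = 6
J = np.ones((n, n)) / n          # projection onto the constant vector
P = np.zeros((n, n))
P[:3, :3] = 1 / 3                # projection onto the group-1 mean
P[3:, 3:] = 1 / 3                # projection onto the group-2 mean

A = np.eye(n) - P                # within-groups form:  X'AX
B = np.eye(n) - J                # total form:          X'BX

# A and B are idempotent, so X'AX/sigma^2 and X'BX/sigma^2 are chi-square.
assert np.allclose(A @ A, A) and np.allclose(B @ B, B)

# Hogg-Craig conclusion: A(B - A) = 0 and (B - A)^2 = B - A, so X'AX and
# X'(B - A)X are stochastically independent and the latter is chi-square.
C = B - A
assert np.allclose(A @ C, np.zeros((n, n)))
assert np.allclose(C @ C, C)
```

Since the two chi-square forms are independent, $\lambda^c = \mathbf{X'AX}/(\mathbf{X'AX} + \mathbf{X'(B-A)X})$ is a ratio of the beta type, as the abstract states.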
If $\mathbf{V} = \mathbf{I}$, the $(p \times 1)$ column matrices $\mathbf{Y}_1, \cdots, \mathbf{Y}_n$ of $\mathbf{X}'$ are independent and have, respectively, the $p$-variate normal distributions $N(\boldsymbol{\mu}_k, \mathbf{K})$, $k = 1, 2, \cdots, n$, where $\boldsymbol{\mu}_1, \cdots, \boldsymbol{\mu}_n$ are the $(p \times 1)$ column matrices of $\boldsymbol{\mu}'$. In a recent article [7], Roy and Gnanadesikan proved the following results, which are generalizations of two theorems on quadratic forms [3], [4]. Let $\mathbf{A}, \mathbf{A}_1$, and $\mathbf{A}_2$ be real symmetric $(n \times n)$ matrices. Then $\mathbf{X'AX}$ has the Wishart distribution $W(\mathbf{K}, r, \boldsymbol{\mu}'\mathbf{A}\boldsymbol{\mu})$ if and only if $\mathbf{AVA} = \mathbf{A}$ (or $\mathbf{A}^2 = \mathbf{A}$ provided $\mathbf{V} = \mathbf{I}$). Here $r$ is the rank of $\mathbf{A}$ and $\boldsymbol{\mu}'\mathbf{A}\boldsymbol{\mu}$ is the matrix of the non-centrality parameters. The forms $\mathbf{X'A}_1\mathbf{X}$ and $\mathbf{X'A}_2\mathbf{X}$ are stochastically independent if and only if $\mathbf{A}_1\mathbf{VA}_2 = \mathbf{0}$ (or $\mathbf{A}_1\mathbf{A}_2 = \mathbf{0}$ provided $\mathbf{V} = \mathbf{I}$). These results permit us to state, without proof, the chi-square decomposition theorem of Hogg and Craig [5] in terms of Wishart variables.

THEOREM 1. Let $\mathbf{A}, \mathbf{A}_1, \cdots, \mathbf{A}_{k-1}, \mathbf{A}_k$ be real symmetric $(n \times n)$ matrices so that $\mathbf{A} = \mathbf{A}_1 + \cdots + \mathbf{A}_{k-1} + \mathbf{A}_k$. Let $\mathbf{X'AX}, \mathbf{X'A}_1\mathbf{X}, \cdots, \mathbf{X'A}_{k-1}\mathbf{X}$ have Wishart distributions and let $\mathbf{A}_k$ be positive semidefinite. Then $\mathbf{X'A}_1\mathbf{X}, \cdots, \mathbf{X'A}_{k-1}\mathbf{X}, \mathbf{X'A}_k\mathbf{X}$ are mutually stochastically independent and $\mathbf{X'A}_k\mathbf{X}$ has a Wishart distribution.
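The Roy–Gnanadesikan conditions are purely matrix-algebraic and easy to verify directly. The following sketch is illustrative only (the matrices are my own choices, not the paper's): a randomly generated positive definite $\mathbf{V}$ with $\mathbf{A} = \mathbf{V}^{-1}$ satisfies $\mathbf{AVA} = \mathbf{A}$, while for $\mathbf{V} = \mathbf{I}$ the conditions reduce to idempotency and $\mathbf{A}_1\mathbf{A}_2 = \mathbf{0}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical (n x n) positive definite V (covariance among rows of X).
n = 4
M = rng.standard_normal((n, n))
V = M @ M.T + n * np.eye(n)

# Wishart criterion: X'AX is W(K, r, mu'A mu) iff AVA = A.
# The simplest non-trivial choice is A = V^{-1}, which has rank r = n.
A = np.linalg.inv(V)
assert np.allclose(A @ V @ A, A)

# With V = I the criterion reduces to A^2 = A, and the independence
# condition A1 V A2 = 0 reduces to A1 A2 = 0; here is an orthogonal
# idempotent decomposition I = A1 + A2.
J = np.ones((n, n)) / n
A1, A2 = J, np.eye(n) - J
assert np.allclose(A1 @ A1, A1) and np.allclose(A2 @ A2, A2)
assert np.allclose(A1 @ A2, np.zeros((n, n)))
```

This decomposition $\mathbf{I} = \mathbf{A}_1 + \mathbf{A}_2$ is the $k = 2$ special case referred to in Theorem 1.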
Moreover, if $\mathbf{A} = \mathbf{V}^{-1}$ and if $k = 2$, the above conclusion is still valid even though the hypothesis that $\mathbf{A}_2$ is positive semidefinite is omitted. Now, in most tests of an hypothesis concerning the means of a multivariate normal distribution with unknown matrix $\mathbf{K}$, the likelihood ratio, raised to an appropriate positive power, is equal to the ratio of two determinants, say $U = |\mathbf{X'AX}| / |\mathbf{X'BX}|$, where $\mathbf{A}$ and $\mathbf{B}$ are real symmetric matrices with ranks greater than or equal to $p$. Usually $\mathbf{V} = \mathbf{I}$ and hence, for simplicity, we assume that this is the case. In addition, we frequently know, or it can easily be shown, that both $\mathbf{X'AX}$ and $\mathbf{X'BX}$ have Wishart distributions. Accordingly, if the fact that the likelihood ratio is less than or equal to one (or $|\mathbf{X'AX}| \leqq |\mathbf{X'BX}|$ for all $\mathbf{X}$) implies that $\mathbf{B - A}$ is positive semidefinite, then Theorem 1 requires that $\mathbf{X'AX}$ and $\mathbf{X'(B - A)X}$ be stochastically independent and that $\mathbf{X'(B - A)X}$ have a Wishart distribution. Thus, if this is true, $U$ has a well-known distribution, for it is distributed like $|\mathbf{W}_1|/|\mathbf{W}_1 + \mathbf{W}_2|$, where $\mathbf{W}_1$ and $\mathbf{W}_2$ are independent Wishart variables. In the next section, we show that this is, in fact, the situation. Finally, in the last section of this paper, we consider certain other theorems on independence that involve Wishart variables.
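The determinant-ratio statistic can be simulated to see the structure at work. This is a Monte Carlo sketch under assumptions of my own (a one-way layout with $n = 6$, two groups of three, $p = 2$, $\mathbf{V} = \mathbf{K} = \mathbf{I}$, null hypothesis true): by construction $\mathbf{X'BX} = \mathbf{X'AX} + \mathbf{X'(B-A)X}$, so $U$ is exactly of the form $|\mathbf{W}_1|/|\mathbf{W}_1 + \mathbf{W}_2|$, and the independence of $\mathbf{W}_1$ and $\mathbf{W}_2$ shows up empirically as near-zero correlation of their traces.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-way layout: n = 6 rows in two groups of 3, p = 2.
n, p = 6, 2
J = np.ones((n, n)) / n
P = np.zeros((n, n)); P[:3, :3] = 1 / 3; P[3:, 3:] = 1 / 3
A, B = np.eye(n) - P, np.eye(n) - J      # within and total projections

t1, t2, u = [], [], []
for _ in range(5000):
    X = rng.standard_normal((n, p))      # V = I, K = I, null hypothesis
    W1 = X.T @ A @ X                     # X'AX
    W2 = X.T @ (B - A) @ X               # X'(B - A)X
    # W1 + W2 = X'BX identically, so U = |W1| / |W1 + W2|.
    assert np.allclose(W1 + W2, X.T @ B @ X)
    t1.append(np.trace(W1)); t2.append(np.trace(W2))
    u.append(np.linalg.det(W1) / np.linalg.det(W1 + W2))

# Independence of W1 and W2 implies the traces are uncorrelated.
corr = np.corrcoef(t1, t2)[0, 1]
assert abs(corr) < 0.1
# Since W2 is positive semidefinite, |W1| <= |W1 + W2|, so U lies in (0, 1].
assert all(0.0 < x <= 1.0 for x in u)
```

Under the null hypothesis both Wishart matrices are central, and $U$ then follows the familiar distribution of $|\mathbf{W}_1|/|\mathbf{W}_1 + \mathbf{W}_2|$ mentioned above.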

## Citation

Robert V. Hogg. "On the Independence of Certain Wishart Variables." Ann. Math. Statist. 34 (3) 935 - 939, September, 1963. https://doi.org/10.1214/aoms/1177704015
