## The Annals of Mathematical Statistics

### A Note on the Sphericity Test

Leon J. Gleser

#### Abstract

Let $x$ be a random $p \times 1$ column vector having a multivariate normal distribution with unknown mean vector $\mu$ and unknown covariance matrix $\Sigma$. We wish to test the hypothesis of "sphericity," namely $H:\Sigma = \sigma^2I_p$, where $\sigma^2 > 0$ is an unknown positive constant. The alternatives to $H$ considered are $H_A : \Sigma$ positive definite, but $\Sigma \neq \sigma^2I$. Given $N$ observation vectors $x^{(1)}, x^{(2)}, \cdots, x^{(N)}$, independently distributed, each with the distribution of $x$, we can reduce consideration to the sufficient statistic $(\bar x, S)$, where $\bar x = N^{-1} \sum^N_{i = 1} x^{(i)},\quad S = \sum^N_{i = 1} (x^{(i)} - \bar x)(x^{(i)} - \bar x)'.$ Then $\bar x$ has a multivariate normal distribution with mean vector $\mu$ and covariance matrix $\Sigma/N$, and $S$ has the Wishart distribution, i.e., has density \begin{equation*}\tag{1.1}p(S) = C_{p,n} |S|^{(n - p - 1)/2} |\Sigma|^{-n/2} \exp \lbrack -\frac{1}{2} \mathrm{tr} \Sigma^{-1}S\rbrack,\quad S > 0\end{equation*} where $C^{-1}_{p,n} = \pi^{p(p - 1)/4}2^{np/2} \prod^p_{i = 1} \Gamma((n - i + 1)/2),\quad p \leqq n,$ and $n = N - 1$. Henceforth we shall denote the fact that a random matrix $Z$ has the density (1.1) by writing $\mathfrak{L}(Z) = \mathfrak{W}(\Sigma, p, n)$; thus, $\mathfrak{L}(S) = \mathfrak{W}(\Sigma, p, n)$.

Mauchly [4] found the likelihood ratio test for $H$ vs. $H_A$. The rejection region of this test can be written in the form: \begin{equation*}\tag{1.2}T(S) \equiv (\mathrm{tr} S)^p/|S| > K,\end{equation*} where $T(S)/p^p$ is the $-2/N$th power of the likelihood ratio statistic $\lambda$. The moments of the likelihood ratio statistic $\lambda$ under $H$ were obtained by Mauchly [4]. Anderson [1] uses these moments to give the exact distribution of $\lambda$ under $H$ and to obtain an asymptotic expansion of this null distribution.
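As a quick numerical illustration (not from the paper), the statistic $T(S)$ of (1.2) can be computed directly from the sufficient statistic $S$. The sketch below simulates data under $H$ with $\Sigma = I_p$; the particular choices of $p$, $N$, and the random seed are illustrative assumptions. Note that $T(S) \geqq p^p$ always holds, since by the arithmetic-geometric mean inequality applied to the eigenvalues of $S$, $(\mathrm{tr}\, S/p)^p \geqq |S|$.

```python
import numpy as np

# A minimal sketch of Mauchly's sphericity statistic T(S) = (tr S)^p / |S|
# from (1.2). The dimension p, sample size N, and identity covariance
# (so that H holds) are illustrative assumptions, not taken from the paper.
rng = np.random.default_rng(0)
p, N = 3, 50
X = rng.standard_normal((N, p))            # N iid observations of x under H
xbar = X.mean(axis=0)
S = (X - xbar).T @ (X - xbar)              # sums-of-products matrix, n = N - 1 df
T = np.trace(S) ** p / np.linalg.det(S)    # T(S) as in (1.2)
W = p ** p / T                             # since T/p^p = lambda^{-2/N}, W = lambda^{2/N}
print(T, W)
```

The test (1.2) rejects $H$ for large $T(S)$, equivalently for small $W$; by the AM-GM inequality above, $W$ lies in $(0, 1]$, with $W = 1$ exactly when $S$ is a scalar multiple of $I_p$.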
The distribution of $\lambda$ under $H_A$ has been obtained for the case $p = 2$ by Girshick [3], but the distribution of $\lambda$ under $H_A$ for $p > 2$ appears to be highly intractable. In this note, we show that the distribution of $T(S)$ is related to the distribution of Bartlett's statistic for testing homogeneity of variances (cf. Anderson [1]). From this relation, we derive that Mauchly's test (1.2) is unbiased. A derivation of the asymptotic distribution of $T(S)$ under $H_A$ completes the note.

It should be mentioned here that a direct relationship between the likelihood ratio statistic $\lambda$ and the Bartlett statistic for testing the homogeneity of variances for the elements of $x$ is given by Anderson [1]. He shows that $H:\Sigma = \sigma^2I$ is a combination of two hypotheses $H_1:\Sigma$ is diagonal, and $H_2:\Sigma = \sigma^2I$ given that $\Sigma$ is diagonal. Hypothesis $H_2$ is the hypothesis of the homogeneity of the variances of the elements of the vector $x$ given that these random elements are stochastically independent. The likelihood ratio statistic $\lambda_2$ for testing this hypothesis is a monotone function of Bartlett's statistic. Further, the likelihood ratio statistic $\lambda$ for $H$ is the product $\lambda = \lambda_1\lambda_2$ of $\lambda_2$ and the likelihood ratio statistic $\lambda_1$ for testing $H_1$ (Anderson [1], pp. 260-262). Unfortunately, both the distribution of $\lambda_1$ and the distribution of $\lambda_2$ depend upon the unknown $\Sigma$, and, unless $\Sigma$ is diagonal, $\lambda_1$ and $\lambda_2$ are dependent. As a result, this relationship between $\lambda$ and Bartlett's statistic is difficult to exploit in finding the distribution of $\lambda$. In this note, we use the invariance of $\lambda$ under orthogonal transformations to enable us to change to new variables having a diagonal covariance matrix.
Homogeneity of variances for these new variables is shown to be equivalent to $H:\Sigma = \sigma^2I$ under the old variables. Using Anderson's representation for $H$, but now expressed in terms of the new variables, we have $\lambda = \lambda_1'\lambda_2'$, where $\lambda_2'$ tests homogeneity of variances for the new variables and $\lambda_1'$ tests the diagonality of the new covariance matrix. Since the new covariance matrix is diagonal, $\lambda_1'$ and $\lambda_2'$ are independent and $\lambda_1'$ has a distribution independent of the parameters. Such a representation is (hopefully) convenient for determining the properties of the likelihood ratio test based on $\lambda$. This representation, however, only connects the distribution of $\lambda$ and the distribution of Bartlett's statistic, for the "new" variables used in our representation are not observable, but rather are functions of the unknown covariance matrix $\Sigma$.
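The connection between $T(S)$ and Bartlett's statistic can be checked numerically in the diagonal case. When $S$ is diagonal with entries $d_1, \cdots, d_p$, each on $n$ degrees of freedom, the uncorrected Bartlett statistic for equal variances, $M = n\lbrack p \log \bar{s}^2 - \sum_i \log s_i^2\rbrack$ with $s_i^2 = d_i/n$, reduces algebraically to $n \log(T(S)/p^p)$. The sketch below verifies this identity on simulated diagonal entries; the particular distribution and the choices of $p$ and $n$ are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Numerical check (an illustration, not a formula quoted from the paper):
# for diagonal S with entries d_i on n degrees of freedom each, the
# uncorrected Bartlett statistic M for homogeneity of the p variances
# equals n * log(T(S) / p^p), where T(S) = (tr S)^p / |S|.
rng = np.random.default_rng(1)
p, n = 4, 30
d = rng.chisquare(n, size=p)      # diagonal entries of S; chi-square is illustrative
s2 = d / n                        # per-coordinate variance estimates

# Bartlett's statistic (equal df n for each of the p variances, no correction factor)
M = n * (p * np.log(s2.mean()) - np.log(s2).sum())

# Sphericity statistic for the diagonal S: (sum d_i)^p / (prod d_i)
T = d.sum() ** p / d.prod()
print(M, n * np.log(T / p ** p))
```

The algebra is direct: $\bar{s}^2 = \sum_i d_i/(pn)$, so $M = n\lbrack p \log \sum_i d_i - p \log p - \sum_i \log d_i\rbrack = n \log\lbrack(\sum_i d_i)^p / (p^p \prod_i d_i)\rbrack$. This is the sense in which, after the orthogonal change of variables, $\lambda_2'$ (a monotone function of Bartlett's statistic) carries the sphericity information in $T(S)$.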

#### Article information

Source
Ann. Math. Statist., Volume 37, Number 2 (1966), 464-467.

Dates
First available in Project Euclid: 27 April 2007

https://projecteuclid.org/euclid.aoms/1177699529

Digital Object Identifier
doi:10.1214/aoms/1177699529

Mathematical Reviews number (MathSciNet)
MR187329

Zentralblatt MATH identifier
0138.13901
