## The Annals of Mathematical Statistics

### On the Likelihood Ratio Test of a Normal Multivariate Testing Problem II

N. Giri

#### Abstract

Let the random vector $X = (X_1 \cdots X_p)'$ have a multivariate normal distribution with unknown mean $\xi$ and unknown nonsingular covariance matrix $\Sigma$. Write $\bar\Gamma = \Sigma^{-1}\xi = (\Gamma_1, \Gamma_2, \Gamma_3)'$, where $\Gamma_1, \Gamma_2$ and $\Gamma_3$ are subvectors of $\bar\Gamma$ containing the first $q$, the next $p' - q$, and the last $p - p'$ components of $\bar\Gamma$, respectively. We consider here the problem of testing the hypothesis $H_0 : \Gamma_3 = \Gamma_2 = 0$ against the alternative $H_1 : \Gamma_3 = 0, \Gamma_2 \neq 0$ when $p > p' > q$ and $\xi, \Sigma$ are both unknown. The origin of the problem and its likelihood ratio test were discussed in Giri (1964a, 1964b). It was also shown there that for $p = p' > q$ the likelihood ratio test of $H_0$ against $H_1$ is uniformly most powerful invariant similar. In this paper we prove that the likelihood ratio test of $H_0$ against $H_1$ is uniformly most powerful invariant similar in the general case, i.e. $p > p' > q$. A corollary, that the likelihood ratio test is uniformly most powerful similar among the group of tests whose power depends only on $\Delta(H_1)$ (defined in Section 1), follows from this. The problem of testing $H_0$ against $H_1$ remains invariant under the group $G$ of $p \times p$ nonsingular matrices $g = \begin{pmatrix}g_{11} & 0 & 0 \\ g_{12} & g_{22} & 0 \\ g_{13} & g_{23} & g_{33}\end{pmatrix}$ operating as $(X, \xi, \Sigma) \rightarrow (gX, g\xi, g\Sigma g')$, where $g_{11}, g_{22}$ and $g_{33}$ are the $q \times q$, $(p' - q) \times (p' - q)$ and $(p - p') \times (p - p')$ diagonal submatrices of $g$, respectively. We may restrict our attention to the space of the minimal sufficient statistic $(\bar X, S)$ for $(\xi, \Sigma)$.
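The invariance of the testing problem can be verified directly from the definitions above. Under $g \in G$, the parameter $\bar\Gamma = \Sigma^{-1}\xi$ transforms as

$$
\bar\Gamma \;\rightarrow\; (g\Sigma g')^{-1}(g\xi) \;=\; (g')^{-1}\Sigma^{-1}g^{-1}g\xi \;=\; (g')^{-1}\bar\Gamma.
$$

Since $g$ is lower block triangular, $(g')^{-1}$ is upper block triangular, so the new $\Gamma_3$ depends only on the old $\Gamma_3$, and the new $\Gamma_2$ only on the old $\Gamma_2$ and $\Gamma_3$. Hence $\Gamma_3 = \Gamma_2 = 0$ and $\Gamma_3 = 0, \Gamma_2 \neq 0$ are each preserved, i.e. $H_0$ and $H_1$ are invariant under $G$.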
A maximal invariant in $(\bar X, S)$ under $G$ is $R = (R_1, R_2, R_3)'$, and a corresponding maximal invariant in $(\xi, \Sigma)$ under $G$ is $\Delta = (\delta_1, \delta_2, \delta_3)'$, where $R_i \geqq 0, \delta_i \geqq 0$ are defined in Section 1. In terms of the maximal invariant, the above problem reduces to that of testing $H_0 : \delta_3 = \delta_2 = 0, \delta_1 > 0$ against $H_1 : \delta_3 = 0, \delta_2 > 0, \delta_1 > 0$ when $\xi, \Sigma$ are both unknown. It was shown in Giri (1964a) that, on the basis of $N$ random observations, the likelihood ratio test of $H_0$ against $H_1$ is given by: reject $H_0$ if $Z = (1 - R_1 - R_2)/(1 - R_1) \leqq Z_0$, where the constant $Z_0$ is chosen so that the test has size $\alpha$; under $H_0$, $Z$ has a beta distribution with parameters $(N - p')/2, (p' - q)/2$. In Section 1 we find the maximal invariants $R$ and $\Delta$, along with the distribution of $R$. Actually, we first find the maximal invariant in $(\bar X, S)$ under a more general group $G_k$ (to be defined in Section 1) and its distribution; the maximal invariant $R$ under $G$ and its distribution follow from this as a special case. In Section 2 we prove the theorem that the likelihood ratio test is uniformly most powerful invariant similar.
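Since $Z$ has a beta distribution with parameters $(N - p')/2$ and $(p' - q)/2$ under $H_0$, the cutoff $Z_0$ is simply the $\alpha$-quantile of that beta law. The following is a minimal illustrative sketch (not from the paper; the values of $N$, $p'$, $q$ and $\alpha$ are assumed for illustration) that approximates $Z_0$ by Monte Carlo using only the Python standard library:

```python
import random

def critical_value(N, p_prime, q, alpha, n_sim=200_000, seed=1):
    """Approximate Z_0 satisfying P(Z <= Z_0) = alpha under H_0,
    where Z ~ Beta((N - p')/2, (p' - q)/2)."""
    a, b = (N - p_prime) / 2, (p_prime - q) / 2
    rng = random.Random(seed)
    draws = sorted(rng.betavariate(a, b) for _ in range(n_sim))
    return draws[int(alpha * n_sim)]  # empirical alpha-quantile

def lr_test(r1, r2, z0):
    """Likelihood ratio test: reject H_0 iff Z = (1 - R1 - R2)/(1 - R1) <= Z_0."""
    z = (1 - r1 - r2) / (1 - r1)
    return z <= z0

# Illustrative sizes: N = 20 observations, p' = 5, q = 2, level alpha = 0.05.
z0 = critical_value(N=20, p_prime=5, q=2, alpha=0.05)
```

In practice $Z_0$ would be taken from beta-distribution tables or an exact quantile routine; the simulation above only illustrates how the size-$\alpha$ condition determines the constant.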

#### Article information

Source
Ann. Math. Statist., Volume 36, Number 3 (1965), 1061-1065.

Dates
First available in Project Euclid: 27 April 2007

https://projecteuclid.org/euclid.aoms/1177700083

Digital Object Identifier
doi:10.1214/aoms/1177700083

Mathematical Reviews number (MathSciNet)
MR192590

Zentralblatt MATH identifier
0203.21202
