On the Likelihood Ratio Test of a Normal Multivariate Testing Problem
N. Giri
Ann. Math. Statist. 35(1): 181-189 (March, 1964). DOI: 10.1214/aoms/1177703740

Abstract

Let the random vector $X = (X_1 \cdots X_p)'$ have a multivariate normal distribution with unknown mean $\xi = (\xi_1 \cdots \xi_p)'$ and unknown nonsingular covariance matrix $\Sigma$. Write $\Sigma^{-1}\xi = \Gamma = (\Gamma_1 \cdots \Gamma_p)'$. The problem considered here is that of testing the hypothesis $H_0 : \Gamma_{q + 1} = \cdots = \Gamma_p = 0$ against the alternative $H_1 : \Gamma_{p' + 1} = \cdots = \Gamma_p = 0$ when $p \geqq p' > q$ and $\xi, \Sigma$ are both unknown. This problem arises in discriminating between two multivariate normal populations with the same unknown covariance matrix, when one is interested in testing whether the variables $X_{q + 1} \cdots X_{p'}$ contribute significantly to the discrimination. For a comprehensive treatment of this subject, the reader is referred to Rao (1952), Chapter 7. In this paper we will find the likelihood ratio test of $H_0$ against $H_1$ and show that this test is uniformly most powerful similar invariant.

The problem of testing $H_0$ against $H_1$ remains invariant under the groups $G_1$ and $G_2$, where $G_1$ is the group of $p' \times p'$ nonsingular matrices $g = \begin{pmatrix}g_{11} & 0\\g_{21} & g_{22}\end{pmatrix}$ (with $g_{11}$ a $q \times q$ matrix) which transform the coordinates $X_1 \cdots X_{p'}$ of $X$, and $G_2$ is the group of translations of the coordinates $X_{p' + 1} \cdots X_p$ of $X$. We may restrict our attention to the space of the sufficient statistic $(\bar X, S)$ for $(\xi, \Sigma)$. A maximal invariant under $G_1$ and $G_2$ in the space of $(\bar X, S)$ is $R = (R_1, R_2)'$, and a corresponding maximal invariant in the parametric space of $(\xi, \Sigma)$ is $\delta = (\delta_1, \delta_2)'$, where $R_i \geqq 0, \delta_i \geqq 0$ are defined in Section 2. In Section 1, we will find the likelihood ratio test of $H_0$ against $H_1$ in the usual way. The likelihood ratio test is invariant under all transformations which keep the problem invariant, and hence is a function of $R$ alone. In Section 2, we will find the joint density of $R_1$ and $R_2$ under the hypothesis and under the alternatives, and then follow Neyman's approach of invariant similar regions to show that the likelihood ratio test in this case is uniformly most powerful similar invariant. In terms of maximal invariants, the above problem reduces to that of testing $H_0 : \delta_2 = 0, \delta_1 > 0$ against the alternative $H_1 : \delta_2 > 0, \delta_1 > 0$.

According to a Fisherian philosophy of statistical inference applied to invariant procedures, it is reasonable to think of $R_1$ as giving information about the discriminating ability of the set of variables $(X_1 \cdots X_q)$, but no information about the parameters governing any additional discriminating ability of the variables $X_{q + 1} \cdots X_{p'}$. Thus Fisher might call $R_1$ ancillary for the problem at hand and condition on it. We are not concerned here with the philosophical issues of statistical inference; instead, we will find (in Section 3) the distribution of the likelihood ratio conditional on $R_1$, which forms the basis of inference in a Fisherian approach. It will be shown that in this conditional situation, the likelihood ratio test is uniformly most powerful invariant.
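As a minimal numerical sketch of this setup (the dimensions $p = 5$, $q = 2$, $p' = 4$, the sample size, and all parameter values below are hypothetical, chosen only for illustration; $S^{-1}\bar X$ is simply the natural sample analogue of $\Gamma = \Sigma^{-1}\xi$, not a statistic constructed in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: p = 5 variables, hypothesis indices q = 2, p' = 4.
p, q, p_prime, n = 5, 2, 4, 200

# Simulate n observations from N(xi, Sigma) with only the first q
# components of Gamma = Sigma^{-1} xi nonzero, i.e. H_0 holds.
Sigma = np.eye(p) + 0.3 * np.ones((p, p))
Gamma = np.concatenate([np.array([1.0, -0.5]), np.zeros(p - q)])
xi = Sigma @ Gamma                      # chosen so that Sigma^{-1} xi = Gamma
X = rng.multivariate_normal(xi, Sigma, size=n)

# Sufficient statistic (X_bar, S) for (xi, Sigma).
X_bar = X.mean(axis=0)
S = np.cov(X, rowvar=False)

# Natural estimate of Gamma: the sample analogue S^{-1} X_bar.
Gamma_hat = np.linalg.solve(S, X_bar)
print("estimated Gamma:", np.round(Gamma_hat, 3))
# Under H_0 the entries with index > q (here the last three) should be near 0;
# H_0 vs. H_1 asks whether the entries q+1, ..., p' are genuinely zero.
```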
A more general statement of the same problem is to find the likelihood ratio test of the hypothesis $H'_0 : \Gamma \in \mathscr{Z}'$ that $\Gamma$ belongs to $\mathscr{Z}'$ against the alternative $H'_1 : \Gamma \in \mathscr{Y}'$ that $\Gamma$ belongs to $\mathscr{Y}'$, when $\xi, \Sigma$ are both unknown and $\mathscr{Z}' \subset \mathscr{Y}'$ are linear subspaces, of dimensions $q$ and $p'$ respectively, of the adjoint space $\mathscr{X}'$ of the space of $X$'s. This problem is easily reduced to the one above by a proper choice of coordinate system, depending on the particular forms of $\mathscr{Z}'$ and $\mathscr{Y}'$. One could work with this general formulation instead, but the author did not find it convenient for computational purposes. As a corollary, if $q = 0$ then $H_0$ reduces to the usual null hypothesis of multivariate analysis of variance, and it is easy to see that the likelihood ratio test for $q = 0$ reduces to the usual Hotelling's $T^2$ test, which is uniformly most powerful invariant (Lehmann (1959)). Fisher (1938) dealt with a particular case of the general formulation in which $\mathscr{Z}'$ is a one-dimensional linear subspace of $\mathscr{X}'$, and suggested a test based on a discriminant function. The problem of testing $H_0$ against $H_1$ was dealt with by Rao (1949), who suggested a test depending on the ratio of Mahalanobis' $D^2$ statistics based on the first $q$ and $p'$ components of $X$ (which is related to Fisher's discriminant function in a simple manner). It will be seen that both of these tests coincide with the likelihood ratio test.
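To make Rao's statistic concrete, here is an illustrative sketch in the two-sample discrimination setting (sample sizes, dimensions, and parameter values are hypothetical, and the printed ratio is only the raw ratio of the two $D^2$ values, not the exactly normalized likelihood ratio statistic whose null distribution the paper derives):

```python
import numpy as np

def mahalanobis_d2(x1, x2, k):
    """Mahalanobis D^2 between two samples, using only the first k variables
    and the pooled sample covariance matrix."""
    d = x1[:, :k].mean(axis=0) - x2[:, :k].mean(axis=0)
    n1, n2 = len(x1), len(x2)
    pooled = (
        (n1 - 1) * np.cov(x1[:, :k], rowvar=False)
        + (n2 - 1) * np.cov(x2[:, :k], rowvar=False)
    ) / (n1 + n2 - 2)
    return float(d @ np.linalg.solve(pooled, d))

rng = np.random.default_rng(1)
p, q, p_prime = 5, 2, 4            # hypothetical dimensions
n1, n2 = 80, 100                   # hypothetical sample sizes

# Two normal populations with the same covariance, whose means differ only
# in the first q coordinates, so H_0 holds: the variables X_{q+1}, ..., X_{p'}
# add no discriminating power.
Sigma = np.eye(p)
mu1 = np.zeros(p)
mu2 = np.concatenate([np.array([1.0, 0.7]), np.zeros(p - q)])
x1 = rng.multivariate_normal(mu1, Sigma, size=n1)
x2 = rng.multivariate_normal(mu2, Sigma, size=n2)

D2_q = mahalanobis_d2(x1, x2, q)          # based on the first q components
D2_p = mahalanobis_d2(x1, x2, p_prime)    # based on the first p' components

print(f"D^2_q = {D2_q:.3f},  D^2_p' = {D2_p:.3f},  ratio = {D2_p / D2_q:.3f}")
```

Under $H_0$, enlarging the variable set from the first $q$ to the first $p'$ components should increase $D^2$ only by a sampling-noise amount, so values of the ratio well above 1 point to genuine additional discriminating power.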

Citation

N. Giri. "On the Likelihood Ratio Test of a Normal Multivariate Testing Problem." Ann. Math. Statist. 35 (1): 181-189, March, 1964. https://doi.org/10.1214/aoms/1177703740

Information

Published: March, 1964
First available in Project Euclid: 27 April 2007

zbMATH: 0129.11108
MathSciNet: MR182094
Digital Object Identifier: 10.1214/aoms/1177703740

Rights: Copyright © 1964 Institute of Mathematical Statistics
