Open Access
Step-Down Procedure in Multivariate Analysis
J. Roy
Ann. Math. Statist. 29(4): 1177-1187 (December, 1958). DOI: 10.1214/aoms/1177706449

Abstract

Test criteria for (i) multivariate analysis of variance, (ii) comparison of variance-covariance matrices, and (iii) multiple independence of groups of variates when the parent population is multivariate normal are usually derived either from the likelihood-ratio principle [6] or from the "union-intersection" principle [2]. An alternative procedure, called the "step-down" procedure, has recently been used by Roy and Bargmann [5] in devising a test for problem (iii). In this paper the step-down procedure is applied to problems (i) and (ii) to derive new tests of significance and simultaneous confidence-bounds on a number of "deviation-parameters."

The essential point of the step-down procedure in multivariate analysis is that the variates are supposed to be arranged in descending order of importance. The hypothesis concerning the multivariate distribution is then decomposed into a number of hypotheses: the first concerns the marginal univariate distribution of the first variate, the second concerns the conditional univariate distribution of the second variate given the first, the third concerns the conditional univariate distribution of the third variate given the first two, and so on. For each of these component hypotheses concerning univariate distributions, well-known test procedures with good properties are usually available, and these are made use of in testing the compound hypothesis on the multivariate distribution. The compound hypothesis is accepted if and only if each of the univariate hypotheses is accepted. It turns out that the component univariate tests are independent if the compound hypothesis is true. It is therefore possible to determine the level of significance of the compound test in terms of the levels of significance of the component univariate tests, and to derive simultaneous confidence-bounds on certain meaningful parametric functions along the lines of [3] and [4].

The step-down procedure is obviously not invariant under a permutation of the variates and should be used only when the variates can be arranged on a priori grounds. Some advantages of the step-down procedure are that (i) it uses widely known statistics like the variance-ratio, (ii) the test is carried out in successive stages, so that if significance is established at a certain stage one can stop there and no further computation is needed, and (iii) it leads to simultaneous confidence-bounds on certain meaningful parametric functions.

1.1 Notations. The operator $\varepsilon$ applied to a matrix of random variables generates the matrix of expected values of the corresponding random variables. The form of a matrix is denoted by a subscript; thus $A_{n \times m}$ indicates that the matrix $A$ has $n$ rows and $m$ columns. The maximum latent root of a square matrix $B$ is denoted by $\lambda_{\max}(B)$. Given a vector $a = (a_1, a_2, \cdots, a_t)'$ and a subset $T$ of the natural numbers $1, 2, \cdots, t$, say $T = (j_1, j_2, \cdots, j_u)$ where $j_1 < j_2 < \cdots < j_u$, the notation $T\lbrack a\rbrack$ will be used to denote the positive quantity $$T\lbrack a\rbrack = + \{a^2_{j_1} + a^2_{j_2} + \cdots + a^2_{j_u}\}^{1/2}.$$ $T\lbrack a\rbrack$ will be called the $T$-norm of $a$. Similarly, given a matrix $B_{t \times t}$, we shall write $B_{(T)}$ for the $u \times u$ submatrix formed by taking the $j_1$th, $j_2$th, $\cdots$, $j_u$th rows and columns of $B$. We shall call $B_{(T)}$ the $T$-submatrix of $B$.
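The abstract's claim that the level of significance of the compound test can be determined from the component levels follows directly from two facts stated there: the compound hypothesis is accepted exactly when every component hypothesis is accepted, and the component tests are independent when the compound hypothesis is true. Writing $p$ for the number of variates and $\alpha_i$ for the level of the $i$th component test (notation introduced here for illustration, not taken from the paper), the compound test has level $$\alpha = 1 - \prod_{i=1}^{p} (1 - \alpha_i).$$ For instance, four component tests at level $0.01$ each give a compound level of $1 - (0.99)^4 \approx 0.039$.

The sequential character of the procedure can be sketched as follows. This is only an illustration under stated assumptions: the variance-ratios, their degrees of freedom, and the component levels are taken as already computed from the successive conditional univariate analyses, and the function name and the use of scipy are choices made here, not part of the paper.

    # Illustrative step-down loop: test the component hypotheses in the
    # a priori order of the variates and stop at the first significant stage.
    from scipy.stats import f

    def step_down_test(f_stats, dfs, alphas):
        # f_stats[i]: variance-ratio for the i-th component hypothesis
        # dfs[i]:     (numerator df, denominator df) for that ratio
        # alphas[i]:  level of significance chosen for the i-th component test
        for i, (stat, (df1, df2), alpha) in enumerate(zip(f_stats, dfs, alphas)):
            if stat > f.isf(alpha, df1, df2):  # critical value of the variance-ratio
                return "reject", i             # compound hypothesis rejected; stop here
        return "accept", None                  # every component hypothesis accepted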

Citation

J. Roy. "Step-Down Procedure in Multivariate Analysis." Ann. Math. Statist. 29 (4) 1177 - 1187, December, 1958. https://doi.org/10.1214/aoms/1177706449

Information

Published: December, 1958
First available in Project Euclid: 27 April 2007

zbMATH: 0087.33907
MathSciNet: MR100938
Digital Object Identifier: 10.1214/aoms/1177706449

Rights: Copyright © 1958 Institute of Mathematical Statistics
