## Abstract

Distribution problems in multivariate analysis are often related to the joint distribution of the characteristic roots of a matrix derived from sample observations. This well-known Fisher-Girshick-Hsu-Mood-Roy distribution (under certain null hypotheses) of $s$ non-null characteristic roots can be expressed in the form \begin{equation*}\tag{1.1}f(\theta_1, \cdots, \theta_s) = C(s, m, n) \prod^s_{i = 1} \theta^m_i (1 - \theta_i)^n \prod_{i > j} (\theta_i - \theta_j), \quad 0 < \theta_1 \leqq \cdots \leqq \theta_s < 1,\end{equation*} where \begin{equation*}\tag{1.2}C(s, m, n) = \pi^{\frac{1}{2}s^2}\Gamma_s(m + n + s + 1)/ \{\Gamma_s(\tfrac{1}{2}(2m + s + 1))\Gamma_s(\tfrac{1}{2}(2n + s + 1))\Gamma_s(\tfrac{1}{2}s)\}, \end{equation*} $\Gamma_s(\cdot)$ is the multivariate gamma function defined in [2], and $m$ and $n$ are defined differently for the various situations described in [4], [6]. Pillai [3], [5] has given the density function of the larger of two roots, i.e., the case $s = 2$, as a hypergeometric function. In this paper, the result is extended to the general case, giving the density function of the largest of $s$ roots as a generalized hypergeometric function [1], [2]. The density function of the largest root of a sample covariance matrix derived by Sugiyama [7] can be obtained from the one derived here by applying the transformation $\frac{1}{2}\lambda_s = n\theta_s$ and letting $n$ tend to infinity.
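The joint density (1.1)–(1.2) is straightforward to evaluate numerically once the multivariate gamma function is available. As a minimal sketch (not from the paper), the following Python snippet implements $\Gamma_s(a) = \pi^{s(s-1)/4}\prod_{i=1}^{s}\Gamma(a - \frac{i-1}{2})$ in log form and evaluates $f(\theta_1, \cdots, \theta_s)$ at an ordered point; the function names are illustrative only.

```python
import math

def log_multigamma(a, s):
    # log of the multivariate gamma function Gamma_s(a):
    # Gamma_s(a) = pi^{s(s-1)/4} * prod_{i=1..s} Gamma(a - (i-1)/2)
    return (s * (s - 1) / 4.0) * math.log(math.pi) + sum(
        math.lgamma(a - (i - 1) / 2.0) for i in range(1, s + 1)
    )

def log_C(s, m, n):
    # log of the normalizing constant C(s, m, n) in (1.2)
    return (0.5 * s * s) * math.log(math.pi) \
        + log_multigamma(m + n + s + 1, s) \
        - log_multigamma((2 * m + s + 1) / 2.0, s) \
        - log_multigamma((2 * n + s + 1) / 2.0, s) \
        - log_multigamma(s / 2.0, s)

def joint_density(theta, m, n):
    # evaluate (1.1) at an ordered point 0 < theta_1 <= ... <= theta_s < 1
    s = len(theta)
    assert all(0.0 < t < 1.0 for t in theta)
    log_f = log_C(s, m, n)
    log_f += sum(m * math.log(t) + n * math.log(1.0 - t) for t in theta)
    # Vandermonde-type factor: product over i > j of (theta_i - theta_j)
    for i in range(s):
        for j in range(i):
            log_f += math.log(theta[i] - theta[j])
    return math.exp(log_f)
```

A quick sanity check: for $s = 1$, (1.2) reduces to $\Gamma(m+n+2)/\{\Gamma(m+1)\Gamma(n+1)\}$ (the $\pi^{1/2}$ cancels against $\Gamma_1(\tfrac12) = \sqrt{\pi}$), so the density is that of a Beta$(m+1, n+1)$ variable.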

## Citation

K. C. Sreedharan Pillai. "On the Distribution of the Largest Root of a Matrix in Multivariate Analysis." Ann. Math. Statist. 38 (2), 616-617, April 1967. https://doi.org/10.1214/aoms/1177698984
