Open Access
Multivariate-Normal Classification with Covariances Known
Bob E. Ellison
Ann. Math. Statist. 36(6): 1787-1793 (December, 1965). DOI: 10.1214/aoms/1177699807

Abstract

The admissibility of the minimum distance rule (= maximum likelihood rule) and of a restricted maximum likelihood rule is proved in [5] for a zero-one loss function in a classification problem in which information about the means of the $k$ alternative multivariate normal distributions is based on samples. That classification problem is a special case of the problem of deciding in which of $k$ given linear manifolds the mean of a normally distributed vector lies when the covariance matrix is known. The admissibility of the minimum distance rule (= maximum likelihood rule) in the more general problem is proved in [3] and [4]. The proof is similar to that given in [5] for the special case; the choice of the prior distribution used in the proof is dictated by Lemmas 2 and 4 of [5].

The purpose of this paper is to present the admissibility proof for the more general problem. The more general problem includes classification problems in which information about the means of the $k$ alternative multivariate normal distributions is based on samples and the means are linearly restricted. The admissibility of classification rules in such problems has received attention recently (see, e.g., the abstracts by Das Gupta [2] and Srivastava [7]).

A still more general problem than that treated in this paper is the problem of deciding in which of $k$ given linear manifolds the mean of a normally distributed vector lies when the covariance matrix is a possibly different known matrix for each of the $k$ alternatives. A parametric family of admissible classification rules for that problem can be obtained by simply replacing $\Sigma$ by $\Sigma_j$, the known covariance matrix for the $j$th alternative, in the statistic $t_j(x \mid h)$ given by Equation (4) of Section 5, $j = 1, \cdots, k$. However, such a family of admissible classification rules is of little interest per se, since other such families are easily generated as Bayes procedures relative to parametric families of prior distributions. Of considerably more interest is the question of whether or not certain "natural" rules are admissible. The maximum likelihood rule, which is not identical with the minimum distance rule in this problem, is a "natural" rule. The maximum likelihood rule is not contained in the family of admissible rules obtained by the replacement of $\Sigma$ in (4), and whether or not it is in general admissible in this problem is not known to me. If the covariance matrix is unknown, but an independent estimate of it is available, it is "natural" to use the estimate in place of the true covariance matrix in the minimum distance rule (= maximum likelihood rule). Whether or not that "natural" rule is in general admissible is likewise not known.

The problem considered in this paper is stated in Section 2. In Section 3 the minimum distance and maximum likelihood rules for the problem are defined; the rules are seen to be equivalent. In Section 4 the problem is reparametrized, and Bayes procedures relative to prior distributions of the new parameters are obtained in general. The minimum distance rule is obtained as a Bayes procedure in Section 5, and its admissibility is deduced. Examples of applications are given in Section 6.
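The statistic $t_j(x \mid h)$ of Equation (4) is defined in the body of the paper rather than in the abstract, so the following is only a minimal illustrative sketch, in Python, of the minimum distance rule itself as described above: it assumes $k$ known mean vectors (the classical case) or $k$ known linear manifolds written as $\{a_j + B_j t\}$, together with a known covariance matrix $\Sigma$. The function names and the $(a_j, B_j)$ parametrization of the manifolds are assumptions made for illustration, not notation taken from the paper.

```python
import numpy as np

def minimum_distance_rule(x, means, cov):
    """Minimum distance (= maximum likelihood) rule for k multivariate
    normal populations sharing the known covariance matrix `cov`:
    classify x to the population whose mean is nearest in the
    Mahalanobis metric induced by cov."""
    cov_inv = np.linalg.inv(cov)
    dists = [(x - m) @ cov_inv @ (x - m) for m in means]
    return int(np.argmin(dists))

def minimum_distance_rule_manifolds(x, manifolds, cov):
    """Sketch of the rule for the more general problem: decide in which
    of k linear manifolds {a_j + B_j t} the mean lies, by projecting x
    onto each manifold in the Mahalanobis metric and choosing the
    manifold with the smallest residual distance."""
    cov_inv = np.linalg.inv(cov)
    dists = []
    for a, B in manifolds:
        # Generalized-least-squares projection of x onto {a + B t}.
        t_hat = np.linalg.solve(B.T @ cov_inv @ B, B.T @ cov_inv @ (x - a))
        r = x - a - B @ t_hat
        dists.append(r @ cov_inv @ r)
    return int(np.argmin(dists))

# Example: two bivariate normal populations with a shared known covariance.
cov = np.array([[2.0, 0.5], [0.5, 1.0]])
means = [np.array([0.0, 0.0]), np.array([3.0, 1.0])]
x = np.array([2.0, 0.5])
print(minimum_distance_rule(x, means, cov))  # index of the chosen population
```

When the populations share one known covariance matrix, minimizing the Mahalanobis distance is equivalent to maximizing the normal likelihood, which is the equivalence the abstract notes; with a different $\Sigma_j$ for each alternative the two rules diverge, which is why the abstract treats the maximum likelihood rule separately in that setting.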

Citation

Bob E. Ellison. "Multivariate-Normal Classification with Covariances Known." Ann. Math. Statist. 36(6): 1787-1793 (December, 1965). https://doi.org/10.1214/aoms/1177699807

Information

Published: December, 1965
First available in Project Euclid: 27 April 2007

zbMATH: 0144.40804
MathSciNet: MR185751
Digital Object Identifier: 10.1214/aoms/1177699807

Rights: Copyright © 1965 Institute of Mathematical Statistics
