Open Access
The Use of Least Favorable Distributions in Testing Composite Hypotheses
H. E. Reinhardt
Ann. Math. Statist. 32(4): 1034-1041 (December, 1961). DOI: 10.1214/aoms/1177704843


The usual method of finding a most powerful size $\alpha$ test of a composite hypothesis against a simple alternative is to guess a Least Favorable Distribution (LFD)--introduced at various levels of generality by Neyman and Pearson [6], Wald [7], Lehmann [4], and Lehmann and Stein [5]--and to test the mixture of the hypothesis distributions over this LFD against the alternative by means of the Neyman-Pearson Fundamental Lemma. In guessing LFDs, statisticians have looked for a mixture which is "like" the alternative. In this paper the notion of a Uniformly Least Favorable Mixture (ULFM) is introduced. In Section 2 we show that a ULFM is a point in the convex set of mixtures of the hypothesis which is closest (in the sense of the $\mathfrak{L}^1$ norm) to the alternative; this necessary condition is not sufficient. More generally, any LFM corresponds to a point which is closest to the alternative in some expansion or contraction of this set of mixtures. A sufficient condition for the existence of a ULFM is, essentially, that the nuisance parameter can take on the same values under the alternative as under the hypothesis. In Section 3 we consider the case where no ULFM exists and show, inter alia, that any distribution is least favorable for a closed set of $\alpha$'s. (A pathological example shows that this closed set need not be the union of a finite number of closed intervals.)
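The guess-and-verify recipe the abstract describes can be illustrated numerically. The following sketch (illustrative numbers only, not taken from the paper) uses a composite null consisting of two distributions on a four-point sample space, forms the mixture with weight $\lambda$, computes the power of the most powerful size-$\alpha$ Neyman-Pearson test of each mixture against a fixed simple alternative, and locates both the least favorable mixing weight (the one minimizing that power) and the mixture that is $\mathfrak{L}^1$-closest to the alternative:

```python
import numpy as np

def np_power(null, alt, alpha):
    """Power of the most powerful size-alpha test of a simple null
    against a simple alternative on a finite sample space, via the
    Neyman-Pearson lemma with randomization at the boundary point."""
    # likelihood ratio alt/null; points with zero null mass are rejected first
    ratio = np.where(null > 0, alt / np.where(null > 0, null, 1.0), np.inf)
    order = np.argsort(-ratio)          # reject where alt/null is largest
    size, power = 0.0, 0.0
    for i in order:
        if size + null[i] <= alpha + 1e-12:
            size += null[i]             # reject this point outright
            power += alt[i]
        else:
            # randomize on the boundary point to use up the remaining level
            gamma = (alpha - size) / null[i] if null[i] > 0 else 0.0
            power += gamma * alt[i]
            break
    return power

# toy composite null {p0, p1} on {0,1,2,3}; simple alternative q
p0 = np.array([0.4, 0.3, 0.2, 0.1])
p1 = np.array([0.1, 0.2, 0.3, 0.4])
q  = np.array([0.1, 0.1, 0.3, 0.5])
alpha = 0.05

lams = np.linspace(0.0, 1.0, 201)
powers = [np_power(lam * p0 + (1 - lam) * p1, q, alpha) for lam in lams]
l1 = [np.abs(lam * p0 + (1 - lam) * p1 - q).sum() for lam in lams]

lam_lf = lams[int(np.argmin(powers))]   # least favorable mixing weight
lam_l1 = lams[int(np.argmin(l1))]       # L1-closest mixture to q
print(lam_lf, lam_l1)
```

In this example both minimizers coincide at $\lambda = 0$ (the mixture is $p_1$ alone), consistent with the paper's characterization of a ULFM as the $\mathfrak{L}^1$-closest point of the mixture set; one can also check directly that the resulting test has size at most $\alpha$ under $p_0$ as well, so the guess is validated.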



Published: December, 1961
First available in Project Euclid: 27 April 2007

zbMATH: 0212.21702
MathSciNet: MR143283
Digital Object Identifier: 10.1214/aoms/1177704843

Rights: Copyright © 1961 Institute of Mathematical Statistics
