Open Access
Note on Uniformly Best Unbiased Estimates
R. C. Davis
Ann. Math. Statist. 22(3): 440-445 (September, 1951). DOI: 10.1214/aoms/1177729591

Abstract

Bhattacharyya [1] has considered recently the following problem in statistical estimation. Let $X_1,X_2,\cdots,X_n$ be $n$ stochastic variables distributed according to the probability law $f(x_1,x_2,\cdots,x_n; \theta) dx_1dx_2\cdots dx_n$, where $\theta$ is the unknown parameter. Consider the class of all functions $T(x_1,x_2,\cdots,x_n)$ of the stochastic variables such that the expectation of each function in this class is equal to a preassigned function $\tau(\theta)$. Usually $\tau(\theta)$ admits of more than one unbiased estimate, and the problem posed by various authors is to obtain a lower bound of the variances of all such estimates, this lower bound to be independent of the estimates themselves but depending on $\tau(\theta)$ and the distribution function of the $n$ stochastic variables. Under certain regularity conditions Bhattacharyya obtained a lower bound of the above type which is never lower than the one obtained earlier and independently by Cramér [2] and Rao [3], although the conditions assumed by Bhattacharyya are more restrictive than those assumed by the latter authors. Recently E. W. Barankin in a remarkable paper [4] has developed a procedure which yields the class of lower bounds of unbiased estimates having minimum $s$th absolute central moment $(s > 1)$ at a preassigned parameter value $\theta_0$. In this note we are concerned with the attainment of a lower bound obtained first by Bhattacharyya. Bhattacharyya discusses the case in which his lower bound is attained and derives some interesting properties of the distribution of such a statistic (which might be called a generalized efficient statistic). The purpose of this note is to prove that in the case in which the variables $X_1,X_2,\cdots,X_n$ are independently and identically distributed with a common distribution function $F(x; \theta)$ depending upon a single unknown parameter, one obtains the following result: under the regularity conditions assumed by Bhattacharyya, in which the parameter $\theta$ may assume values in an interval of the real axis, and with an additional slight restriction on the cumulative distribution function $F(x; \theta)$, no generalized efficient statistic exists which is constructed by use of both the first and second derivatives of the likelihood function with respect to the parameter. It follows that if an efficient estimate (in the sense originally defined by Fisher [5]) for the single unknown parameter does not exist, then no distribution $F(x; \theta)$ exists possessing a uniformly minimum variance unbiased estimate of $\tau(\theta)$ which is constructed by using a linear combination of the first and second partial logarithmic derivatives of the likelihood function. This result for the case involving a single unknown parameter is of particular interest in view of the fact that Seth [6] has given an example in which the above construction is possible if the distribution involves two unknown parameters.
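
For context, the lower bounds referred to above may be sketched in their standard form; the notation below is assumed for illustration and is not quoted from the paper. For an unbiased estimate $T$ of $\tau(\theta)$ with likelihood $L(\theta) = f(x_1,\cdots,x_n;\theta)$, the Cramér-Rao inequality states

$$\operatorname{Var}_\theta(T) \ge \frac{[\tau'(\theta)]^2}{E_\theta\left[\left(\frac{\partial \log L}{\partial \theta}\right)^2\right]},$$

while the Bhattacharyya bound of order $k$ is built from the quantities $\psi_i = \frac{1}{L}\frac{\partial^i L}{\partial \theta^i}$, $i = 1,\cdots,k$, and reads

$$\operatorname{Var}_\theta(T) \ge \sum_{i,j=1}^{k} \tau^{(i)}(\theta)\,\tau^{(j)}(\theta)\,[J^{-1}]_{ij}, \qquad J_{ij} = E_\theta[\psi_i \psi_j].$$

For $k = 1$ this reduces to the Cramér-Rao bound. The generalized efficient statistic of the note corresponds to attainment of the $k = 2$ bound, which requires $T - \tau(\theta)$ to equal, with probability one, a linear combination of $\psi_1$ and $\psi_2$ with coefficients depending only on $\theta$.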

Citation


R. C. Davis. "Note on Uniformly Best Unbiased Estimates." Ann. Math. Statist. 22 (3): 440-445, September, 1951. https://doi.org/10.1214/aoms/1177729591

Information

Published: September, 1951
First available in Project Euclid: 28 April 2007

zbMATH: 0054.06106
MathSciNet: MR43414
Digital Object Identifier: 10.1214/aoms/1177729591

Rights: Copyright © 1951 Institute of Mathematical Statistics
