Open Access
An Extension of the Cramer-Rao Inequality
John J. Gart
Ann. Math. Statist. 30(2): 367-380 (June, 1959). DOI: 10.1214/aoms/1177706257

Abstract

Cramer ([6], p. 474 ff.), Darmois [8], Frechet [10], and Rao [14] independently derived a lower bound for the mean square error of an estimate $t$ of a parameter which appears in a frequency function of a specified form. This expression, variously termed the Cramer-Rao inequality or the information limit, is \begin{equation*}\tag{1.1}E(t - \alpha)^2 \geqq \lbrack E(t) - \alpha\rbrack^2 + \frac{\big\lbrack\frac{\partial E(t)}{\partial\alpha}\big\rbrack^2}{E\big(\frac{\partial\ln\phi}{\partial\alpha}\big)^2}\end{equation*} where $\phi$ is the likelihood of the sample. The expression $E(\partial \ln \phi/\partial\alpha)^2$ is called the information on $\alpha$ and is sometimes denoted by $I(\alpha)$; under rather general conditions it can be shown to equal $E(-\partial^2 \ln \phi/\partial\alpha^2)$. The equality in (1.1) is reached if and only if \begin{equation*}\tag{1.2}\phi = \phi_1e^{tV(\alpha) + W(\alpha)}\end{equation*} where $t$ and $\phi_1$ are functions of the observations alone and $V(\alpha)$ and $W(\alpha)$ are functions of $\alpha$ alone. By the results of Pitman [13] and Koopman [12], the form of (1.2) implies that $t$ must be a sufficient statistic. The fact that this form of the likelihood yields a minimum variance estimate was first pointed out by Aitken and Silverstone [1]. If we have $n$ observations which are independently and identically distributed, the frequency function of the underlying population must be of the so-called Pitman-Koopman form, \begin{equation*}\tag{1.3}f(x; \alpha) = C(\alpha)h(x)e^{P(\alpha)g(x)},\end{equation*} where $C(\alpha)$ is a function of $\alpha$ alone, and $t$ must be a function of $\sum^n_{i = 1} g(x_i)$ for the equality in (1.2) to hold.

Several extensions of the basic inequality have been derived. Bhattacharyya [4] and Chapman and Robbins [5] have derived results which yield more stringent inequalities in certain instances. Wolfowitz [21] has extended the result to sequential sampling situations. Cramer [7], Darmois [8], and Barankin [2] have considered joint bounds on sets of estimates of parameters, and Hammersley [11] has derived a lower bound on the mean square error of an estimate for the situation in which the parameter to be estimated can assume only discrete values. Barankin [3] has also considered lower bounds on the general absolute central moments of the estimate. All these results assume that the parameters involved are constants.

Here we shall consider the case where the parameters are random variables, so that the lower bound on the mean square error of an estimate takes into account the variability due to both the observations and the parameters involved. Necessary and sufficient conditions for equality in the extended inequality are derived. Most unfavorable distributions, i.e., distributions which maximize the lower bound, are defined, and several examples are given. Extensions analogous to those of Bhattacharyya [4] and Wolfowitz [21] are also considered. Finally, bounds on the variance of linear estimates of the mean of the parameter are derived.
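To fix ideas, a familiar instance of the equality condition (with $\sigma^2$ a known variance and $\bar{x}$ the sample mean, neither appearing above): let $x_1, \cdots, x_n$ be independent observations from a normal population with mean $\alpha$ and variance $\sigma^2$. Since \begin{equation*}f(x; \alpha) = \frac{1}{\sqrt{2\pi}\sigma}e^{-\alpha^2/2\sigma^2}e^{-x^2/2\sigma^2}e^{(\alpha/\sigma^2)x},\end{equation*} the density is of the form (1.3) with $C(\alpha) = (2\pi\sigma^2)^{-1/2}e^{-\alpha^2/2\sigma^2}$, $h(x) = e^{-x^2/2\sigma^2}$, $P(\alpha) = \alpha/\sigma^2$, and $g(x) = x$. Taking $t = \bar{x} = n^{-1}\sum^n_{i = 1}g(x_i)$, the likelihood has the form (1.2) with $V(\alpha) = n\alpha/\sigma^2$ and $W(\alpha) = n\ln C(\alpha)$; moreover $E(t) = \alpha$ and $I(\alpha) = n/\sigma^2$, so the right side of (1.1) reduces to $\sigma^2/n = E(t - \alpha)^2$ and the bound is attained.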

Citation

John J. Gart. "An Extension of the Cramer-Rao Inequality." Ann. Math. Statist. 30(2): 367-380, June 1959. https://doi.org/10.1214/aoms/1177706257

Information

Published: June, 1959
First available in Project Euclid: 27 April 2007

zbMATH: 0093.15804
MathSciNet: MR106524
Digital Object Identifier: 10.1214/aoms/1177706257

Rights: Copyright © 1959 Institute of Mathematical Statistics
