Abstract
Let $F(P)$ be a real-valued function defined on a subset $\mathscr{D}$ of the set $\mathscr{D}^\ast$ of all probability distributions on the real line. A function $f$ of $n$ real variables is an unbiased estimate of $F$ if for every system $X_1, \cdots, X_n$ of independent random variables with the common distribution $P$, the expectation of $f(X_1, \cdots, X_n)$ exists and equals $F(P)$, for all $P$ in $\mathscr{D}$. A necessary and sufficient condition for the existence of an unbiased estimate is given (Theorem 1), and the way in which this condition applies to the moments of a distribution is described (Theorem 2). Under the assumptions that this condition is satisfied and that $\mathscr{D}$ contains all purely discontinuous distributions, it is shown that there is a unique symmetric unbiased estimate (Theorem 3); the most general (non-symmetric) unbiased estimates are described (Theorem 4); and it is proved that among them the symmetric one is best in the sense of having the least variance (Theorem 5). Thus the classical estimates of the mean and the variance are justified from a new point of view, and, from the theory, computable estimates of all higher moments are easily derived. It is interesting to note that for $n$ greater than 3 neither the sample $n$th moment about the sample mean nor any constant multiple thereof is an unbiased estimate of the $n$th moment about the mean. Attention is called to a paradoxical situation arising in estimating such nonlinear functions as the square of the first moment.
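For concreteness, here is a minimal worked example, not displayed in the original abstract but consistent with Theorems 3 and 5; the notation $\mu_1$ and $\mu_2$ for the first moment and the variance is introduced here for illustration. Taking $F(P) = \mu_2$, the unique symmetric unbiased estimate averages the kernel $\tfrac{1}{2}(x_1 - x_2)^2$ over all pairs of observations, and this average reduces to the classical estimate with divisor $n - 1$:
$$\binom{n}{2}^{-1} \sum_{1 \le i < j \le n} \frac{(X_i - X_j)^2}{2} \;=\; \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2, \qquad \bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i.$$
The paradox mentioned for nonlinear functions can be seen the same way: the symmetric unbiased estimate of $\mu_1^2$, the square of the first moment, is
$$\frac{1}{n(n-1)} \sum_{i \ne j} X_i X_j \;=\; \bar{X}^2 - \frac{1}{n(n-1)} \sum_{i=1}^{n} (X_i - \bar{X})^2,$$
which is unbiased because $E[X_i X_j] = \mu_1^2$ for $i \ne j$, yet it can take negative values (for instance, $n = 2$ with $X_1 = -X_2 \ne 0$) even though $\mu_1^2 \ge 0$.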
Citation
Paul R. Halmos. "The Theory of Unbiased Estimation." Ann. Math. Statist. 17(1): 34–43, March 1946. https://doi.org/10.1214/aoms/1177731020