Abstract
$X$ and $\Theta$ are random variables; given $\Theta = \theta$, the conditional distribution of $X$ is binomial with parameters $N$ and $\theta$; the first $N$ moments of $\Theta$ are known. An estimate of $\Theta$ is made based on the observed value of $X$, the risk being defined in terms of squared error loss. It is shown that, as conjectured by H. Robbins, the ratio of the Bayes risk to the minimax risk tends to unity uniformly over all possible distributions of $\Theta$ as $N \rightarrow \infty$.
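For concreteness, the display below sketches one way to formalize the setup just described. The notation $G$, $\Gamma(G)$, $R$, $r$, and $\rho_N$ is introduced here for illustration and is not taken from the paper, and the precise form of the minimax quantity is one plausible reading of the abstract rather than the paper's own definition.

$$X \mid \Theta = \theta \;\sim\; \mathrm{Binomial}(N, \theta), \qquad \Theta \sim G, \qquad E\,\Theta^{k} \ \text{known for } k = 1, \dots, N.$$

For an estimator $\delta(X)$ under squared error loss, the risk is $R(\delta, G) = E\bigl[(\delta(X) - \Theta)^{2}\bigr]$, and the Bayes risk $r(G) = \inf_{\delta} R(\delta, G)$ is attained by the posterior mean $\delta_{G}(x) = E[\Theta \mid X = x]$. Writing $\Gamma(G)$ for the set of distributions on $[0, 1]$ whose first $N$ moments agree with those of $G$, the corresponding minimax risk is

$$\rho_{N}(G) \;=\; \inf_{\delta}\, \sup_{G' \in \Gamma(G)} R(\delta, G'),$$

and the result stated above then reads: $r(G)/\rho_{N}(G) \rightarrow 1$ uniformly in $G$ as $N \rightarrow \infty$.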
Citation
V. M. Joshi. "On the Minimax Estimation of a Random Probability with Known First $N$ Moments." Ann. Statist. 3 (3), 680-687, May 1975. https://doi.org/10.1214/aos/1176343130