Open Access
April, 1968
Minimax Estimation of a Random Probability Whose First $N$ Moments are Known
Morris Skibinsky
Ann. Math. Statist. 39(2): 492-501 (April, 1968). DOI: 10.1214/aoms/1177698412

Abstract

Let $N$ be a positive integer. In Section 2 an expository account, in terms of moment space dependence, is given of the Bayes estimate of a random probability $\Theta$, relative to squared difference loss, from an observable $X$ which, given $\Theta$, is conditionally binomial $(N, \Theta)$. The risk and the Bayes envelope functional are also considered in these terms. In Section 3 an explicit formulation is given for the minimax estimate of $\Theta$ when its first $N$ moments are known. Theorem 2 characterizes the condition that a Bayes estimate have constant risk over the class of all "priors" which yield these moments. In Section 4, a transformation is introduced which puts the interior of the space of the first $N$ moments for distributions on $\lbrack 0, 1\rbrack$ in one-to-one correspondence with the interior of the $N$-dimensional unit cube. This transformation is used to show that the supremum of the difference between minimax and Bayes risks over the class of all prior distributions is bounded above by $2^{-N}$. Examples for $N = 1, 2$, and 3 in terms of the above transformation are considered in Section 5.
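For orientation, the moment space dependence mentioned above can be made explicit with a standard computation. Under squared difference loss the Bayes estimate is the posterior mean, and with a conditionally binomial $(N, \Theta)$ observation it reduces to a ratio of linear combinations of prior moments. The display below is a sketch in notation introduced here for illustration; the prior $\pi$, its moments $c_k = E\Theta^k$, and the estimate $\hat\theta$ are not the paper's symbols.

$$\hat\theta(x) \;=\; E\lbrack\Theta \mid X = x\rbrack \;=\; \frac{\int_0^1 \theta^{x+1}(1-\theta)^{N-x}\, d\pi(\theta)}{\int_0^1 \theta^{x}(1-\theta)^{N-x}\, d\pi(\theta)} \;=\; \frac{\sum_{j=0}^{N-x} \binom{N-x}{j}(-1)^j\, c_{x+1+j}}{\sum_{j=0}^{N-x} \binom{N-x}{j}(-1)^j\, c_{x+j}}, \qquad x = 0, 1, \ldots, N.$$

Each denominator involves prior moments of order at most $N$, while every numerator also involves $c_{N+1}$; so fixing the first $N$ moments determines the Bayes estimate only up to the admissible values of one further moment. This is offered only as informal context for the minimax problem over all priors yielding the given moments.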

Citation

Morris Skibinsky. "Minimax Estimation of a Random Probability Whose First $N$ Moments are Known." Ann. Math. Statist. 39(2): 492-501, April 1968. https://doi.org/10.1214/aoms/1177698412

Information

Published: April, 1968
First available in Project Euclid: 27 April 2007

zbMATH: 0167.47103
MathSciNet: MR221650
Digital Object Identifier: 10.1214/aoms/1177698412

Rights: Copyright © 1968 Institute of Mathematical Statistics
