Abstract
In this paper the problem of minimax point estimation of a function $g(\theta)$ of a parameter $\theta$ is considered, when the loss function is of the form $W(u(x), g(\theta)) = |u(x) - g(\theta)|^p$ $(p > 1)$ and $u(x)$ is an estimate with bounded risk. When Conditions A and B stated later hold, it is shown that a unique minimax estimate $u_0(x)$ exists and that if $\{u_n(x)\}$ is any uniformly bounded minimax sequence, then the risk functions of the $u_n(x)$ converge uniformly to the risk function of $u_0(x)$; consequently no almost subminimax estimate can exist, that is, an estimate which, though not minimax, has a lower risk than the minimax estimate over a wide range of values of the parameter $\theta$. Under some additional conditions, it is shown that an approximation to the minimax estimate $u_0(x)$ in the space $\mathscr{F}^{(p)}_\infty$ of functions with bounded risk may be obtained from the minimax estimate $\bar{u}_N(x)$ in the finite-dimensional linear space spanned by $N$ basis vectors $v_1, \cdots, v_N$ of $\mathscr{F}^{(p)}_\infty$, in the sense that the maximum risk of $\bar{u}_N(x)$ converges to that of $u_0(x)$. This may help in finding an approximation to a minimax estimate in non-standard problems, where it is difficult to guess a minimax estimate from invariance or other considerations, and especially when the problem is a perturbation of a standard problem.
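As a rough numerical illustration of the finite-dimensional approximation described above, the following sketch (not from the paper; the binomial model, polynomial basis $v_1, v_2, v_3$, parameter grid, and optimizer are illustrative assumptions) minimizes the maximum $L^p$ risk over estimators of the form $\sum_k c_k v_k(x)$.

```python
# Toy sketch (assumptions, not the paper's construction): approximate a minimax
# estimate of theta for X ~ Binomial(n, theta) under |u(X) - theta|^p loss by
# restricting to estimators spanned by a few basis functions and minimizing the
# maximum risk over a grid of theta values.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

n, p_loss = 10, 2.0                       # sample size and loss exponent p
x = np.arange(n + 1)                      # support of X
basis = np.vstack([np.ones_like(x), x / n, (x / n) ** 2])  # v_1, v_2, v_3
theta_grid = np.linspace(0.01, 0.99, 99)  # grid standing in for the parameter space

def max_risk(c):
    u = c @ basis                         # estimator values u(x) on the support
    risks = []
    for th in theta_grid:
        w = binom.pmf(x, n, th)           # P_theta(X = x)
        risks.append(np.sum(w * np.abs(u - th) ** p_loss))  # E_theta |u(X) - theta|^p
    return max(risks)                     # maximum risk over the grid

# Start from the natural estimate X/n, i.e. coefficients (0, 1, 0).
res = minimize(max_risk, x0=np.array([0.0, 1.0, 0.0]), method="Nelder-Mead")
print("approximate minimax coefficients:", res.x)
print("approximate maximum risk:", res.fun)
```

For $p = 2$ the outcome can be compared with the classical minimax estimate $(X + \sqrt{n}/2)/(n + \sqrt{n})$ of a binomial proportion, which is linear in $x$ and so already lies in the span of the chosen basis.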
Citation
M. N. Ghosh. "Uniform Approximation of Minimax Point Estimates." Ann. Math. Statist. 35(3): 1031–1047, September 1964. https://doi.org/10.1214/aoms/1177703262