Abstract
The estimation of a common mean vector $\theta$ from two independent normal observations $X \sim N_p(\theta, \sigma^2_x I)$ and $Y \sim N_p(\theta, \sigma^2_y I)$ is reconsidered. Although the estimator $\eta X + (1 - \eta)Y$ is known to be inadmissible when $\eta \in (0, 1)$, we show that the opposite holds at the endpoints: the estimator is admissible when $\eta$ is 0 or 1. The general picture is that an estimator $X^\ast$ can be improved by shrinkage when there exists a statistic $B$ which, in a certain sense, estimates a lower bound on the risk of $X^\ast$. On the other hand, an estimator is admissible under very general conditions if there is no reasonable way to detect that its risk is small.
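For orientation, a brief worked computation (not part of the original abstract, and assuming squared error loss, which the abstract does not state): since $X$ and $Y$ are independent with covariance matrices $\sigma_x^2 I$ and $\sigma_y^2 I$, the risk of the convex combination is constant in $\theta$,
$$
R\bigl(\theta,\ \eta X + (1-\eta)Y\bigr) \;=\; E\bigl\|\eta X + (1-\eta)Y - \theta\bigr\|^2 \;=\; p\bigl[\eta^2 \sigma_x^2 + (1-\eta)^2 \sigma_y^2\bigr].
$$
If both variances were known, this would be minimized at $\eta = \sigma_y^2/(\sigma_x^2 + \sigma_y^2)$; the endpoints $\eta = 0$ and $\eta = 1$ correspond to using $Y$ or $X$ alone, with risks $p\sigma_y^2$ and $p\sigma_x^2$ respectively.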
Citation
Charles Anderson, Nabendu Pal. "A Note on Admissibility When Precision is Unbounded." Ann. Statist. 23(2): 593-597, April 1995. https://doi.org/10.1214/aos/1176324537