Open Access
A Note on Admissibility When Precision is Unbounded
Charles Anderson, Nabendu Pal
Ann. Statist. 23(2): 593-597 (April, 1995). DOI: 10.1214/aos/1176324537


The estimation of a common mean vector $\theta$ given two independent normal observations $X \sim N_p(\theta, \sigma^2_x I)$ and $Y \sim N_p(\theta, \sigma^2_y I)$ is reconsidered. It is known that the estimator $\eta X + (1 - \eta)Y$ is inadmissible when $\eta \in (0, 1)$; we show that when $\eta$ is 0 or 1 the opposite is true, that is, the estimator is admissible. The general situation is that an estimator $X^\ast$ can be improved by shrinkage when there exists a statistic $B$ which, in a certain sense, estimates a lower bound on the risk of $X^\ast$. On the other hand, an estimator is admissible under very general conditions if there is no reasonable way to detect that its risk is small.
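The inadmissibility claim for $\eta \in (0, 1)$ can be illustrated numerically. The sketch below (not the paper's construction) applies classical James–Stein shrinkage to the combined estimator $\eta X + (1-\eta)Y$, whose coordinates are normal with known variance $\eta^2\sigma_x^2 + (1-\eta)^2\sigma_y^2$; for $p \ge 3$ and $\theta$ near the origin, the shrunken version has visibly smaller simulated risk. The values of $p$, $\sigma_x$, $\sigma_y$, $\eta$, and $\theta$ are arbitrary choices for the demonstration.

```python
import numpy as np

# Arbitrary illustrative settings (not from the paper).
rng = np.random.default_rng(0)
p = 10
theta = np.zeros(p)
sigma_x, sigma_y, eta = 1.0, 1.0, 0.5
n_sims = 20000

X = rng.normal(theta, sigma_x, size=(n_sims, p))
Y = rng.normal(theta, sigma_y, size=(n_sims, p))

# Convex combination of the two observations.
combo = eta * X + (1 - eta) * Y
# Each coordinate of combo is N(theta_i, s2) with known variance s2.
s2 = eta**2 * sigma_x**2 + (1 - eta)**2 * sigma_y**2

# James-Stein shrinkage of the combined estimator toward the origin.
norms2 = np.sum(combo**2, axis=1)
js = (1 - (p - 2) * s2 / norms2)[:, None] * combo

# Monte Carlo estimates of the quadratic risk of each estimator.
risk_combo = np.mean(np.sum((combo - theta) ** 2, axis=1))
risk_js = np.mean(np.sum((js - theta) ** 2, axis=1))
print(f"combined: {risk_combo:.3f}  shrunken: {risk_js:.3f}")
```

At $\theta = 0$ the risk of the combination is approximately $p\,s^2 = 5$, while the shrunken estimator's risk is far smaller, consistent with the abstract's point that a statistic detecting small risk enables improvement.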




Published: April, 1995
First available in Project Euclid: 11 April 2007

zbMATH: 0824.62007
MathSciNet: MR1332583
Digital Object Identifier: 10.1214/aos/1176324537

Primary: 62C15
Secondary: 62H12

Keywords: inadmissibility, shrinkage estimation, Stein's normal identity

Rights: Copyright © 1995 Institute of Mathematical Statistics
