Abstract
Let $B$ be the $\sigma$-algebra of all Borel subsets of the real line $\mathscr{X}$ and let $\mathscr{A}$ be a $\sigma$-algebra of subsets of a set $\mathscr{Y}$. Let $\nu$ be a probability measure on $\mathscr{A}$. Let $P$ be a $B \times \mathscr{A}$-measurable function on $\mathscr{X} \times \mathscr{Y}$ such that $P(\cdot, y)$ is a distribution function for each $y \in \mathscr{Y}$. As usual, $B \times \mathscr{A}$ is the $\sigma$-algebra generated by the sets $C \times A$ for $C \in B$ and $A \in \mathscr{A}$. We observe $(X, Y)$, where $\nu$ is the marginal distribution of $Y$ and, for some unknown $\theta$, the conditional distribution function of $X - \theta$ given $Y$ is $P(\cdot, Y)$. It is desired to estimate $\theta$ with the loss function
\begin{equation*}\tag{1.1}
L(\theta, d) =
\begin{cases}
a(\theta - d) & \text{if } d \leqq \theta,\\
b(d - \theta) & \text{if } d \geqq \theta.
\end{cases}
\end{equation*}
For any statistical problem, let $\rho(\theta, \delta)$ be the risk when $\delta$ is the decision procedure used and $\theta$ is the value of the parameter. In all that follows, $\mu$ will be Lebesgue measure on the real line. In a problem involving a real parameter, we say that $\delta$ is almost admissible provided that, given any other procedure $\delta'$, if $\rho(\theta, \delta') \leqq \rho(\theta, \delta)$ for all $\theta$, then $\rho(\theta, \delta') = \rho(\theta, \delta)$ a.e. ($\mu$). The purpose of the present paper is to prove the

THEOREM. Suppose that, in addition to the above assumptions, (i) for each $y \in \mathscr{Y}$, the unique $(1 - \alpha)$th quantile of $P(\cdot, y)$ is $0$, where $\alpha = a/(a + b)$, and
\begin{equation*}\tag{ii}
\int d\nu(y) \int x^2\, d_xP(x, y) < \infty.
\end{equation*}
Then, under the loss function given in (1.1), $X$ is an almost admissible estimate of $\theta$.

The proof of the theorem will be given in Section 2. Section 3 contains the proof of the COROLLARY.
If, in addition to the assumptions of the theorem, $P(\cdot, y)$ is either absolutely continuous for all $y \in \mathscr{Y}$ or has its points of increase in a fixed lattice for all $y \in \mathscr{Y}$, then $X$ is an admissible estimate of $\theta$.

If $\theta$ has a "uniform" a priori distribution on $\mathscr{X}$, then $X$ is the $\alpha$th quantile of the a posteriori distribution of $\theta$ given $X$. Hence, $X$ is the best invariant estimate of $\theta$. Farrell [3] has shown that when the uniqueness assumption in the theorem is violated, $X$ cannot be almost admissible. The Theorem and Corollary are analogous to Theorems 1 and 2 of Stein [5] for the case of the loss function $(\theta - d)^2$. In Stein's theorems, (i) is replaced by $E(X \mid Y) = 0$ and (ii) is replaced by $\int d\nu(y)\lbrack \int x^2\, d_xP(x, y)\rbrack^{\frac{1}{2}} < \infty$. Thus Stein's result, like ours, requires one moment more than is intuitively necessary. In Stein's case, unlike ours, the extra moment is not required if $\mathscr{Y}$ contains only one element. Similar results for the case of hypothesis testing were obtained by Lehmann and Stein [4] under the assumption of a first moment, again one more moment than is intuitively needed. The proofs in Sections 2 and 3 are similar to those used by Stein. The method is originally due to Blyth [2]. Since the passage from our corollary to the case of a sample of size $n$ is similar to Stein's, it is omitted here. Section 4 contains examples showing that the conclusion of the theorem cannot be strengthened to strict admissibility. These examples are analogous to Blackwell's [1] example for the loss function $(\theta - d)^2$.
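Condition (i) and the remark that $X$ is the $\alpha$th quantile of the a posteriori distribution both rest on the classical fact that, under the asymmetric loss (1.1), the expected loss is minimized at the $\alpha$th quantile of the distribution of $\theta$, where $\alpha = a/(a+b)$. The following is a minimal numerical sketch of that fact only (it does not implement the paper's admissibility results); the standard-normal distribution, the sample size, and the grid search are illustrative assumptions:

```python
import numpy as np

# Asymmetric loss from (1.1): underestimation costs a per unit,
# overestimation costs b per unit. Its expected value is minimized
# at the alpha-quantile, alpha = a / (a + b).
a, b = 3.0, 1.0
alpha = a / (a + b)  # = 0.75

rng = np.random.default_rng(0)
theta = rng.normal(size=200_000)  # stand-in distribution (an assumption)

def expected_loss(d):
    """Empirical mean of L(theta, d) over the sample."""
    under = theta - d
    return np.mean(np.where(under >= 0, a * under, b * (-under)))

# Scan candidate estimates and pick the empirical minimizer.
grid = np.linspace(-2.0, 2.0, 801)
d_star = grid[np.argmin([expected_loss(d) for d in grid])]

# The minimizer should nearly agree with the alpha-quantile.
print(d_star, np.quantile(theta, alpha))
```

Because $a = 3b$ here, undershooting is three times as costly as overshooting, so the optimal estimate sits at the 0.75-quantile rather than the median.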
Citation
Martin Fox and Herman Rubin. "Admissibility of Quantile Estimates of a Single Location Parameter." Ann. Math. Statist. 35(3): 1019–1030, September 1964. https://doi.org/10.1214/aoms/1177700518