Open Access
Estimation of Two Ordered Translation Parameters
Saul Blumenthal, Arthur Cohen
Ann. Math. Statist. 39(2): 517-530 (April, 1968). DOI: 10.1214/aoms/1177698414

Abstract

Let the random variables $X_{i1}, X_{i2}, \cdots, X_{in}$, $i = 1, 2$, be real valued and independent with density functions $f(x - \theta_i)$ ($\theta_i$ real), $i = 1, 2$, with respect to Lebesgue measure. With no loss of generality we take $\int^\infty_{-\infty} xf(x)\, dx = 0$. The problem is to estimate the ordered pair $(\theta_1, \theta_2)$, under the condition $\theta_2 \geqq \theta_1$, when the loss function is the sum of the squared errors in estimating the individual components. Questions of minimaxity and admissibility of the analogue of the Pitman estimator are considered. This problem, a two-dimensional estimation problem subject to constraints, has received attention in the past. Most of the literature deals with obtaining maximum likelihood estimates for specified densities (see, for example, Brunk [3]). Katz [6] considers some aspects of the problem for the binomial and normal densities. The analogue of the Pitman estimator studied here is the vector estimator given by the a posteriori expected value of $(\theta_1, \theta_2)$ given $X_{ij}$, $i = 1, 2$, $j = 1, 2, \cdots, n$, when the generalized prior distribution is the uniform distribution on the half space $\theta_2 \geqq \theta_1$. If we call this estimator $\delta = (\delta_1, \delta_2)$, then \begin{equation*}\tag{1.1}\begin{aligned}\delta_i(X_{11}, X_{12}, \cdots, X_{2n}) &= \iint_{\theta_2 \geqq \theta_1} \theta_i \prod^n_{j = 1} f(X_{1j} - \theta_1) \prod^n_{j = 1} f(X_{2j} - \theta_2)\, d\theta_1\, d\theta_2 \\ &\quad \cdot \Big\lbrack\iint_{\theta_2 \geqq \theta_1} \prod^n_{j = 1} f(X_{1j} - \theta_1) \prod^n_{j = 1} f(X_{2j} - \theta_2)\, d\theta_1\, d\theta_2\Big\rbrack^{-1}, \quad i = 1, 2.\end{aligned}\end{equation*} In order to state the main results, it is convenient to introduce some notation. 
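As a concrete illustration (not taken from the paper), the estimator (1.1) can be evaluated by direct numerical integration when $f$ is the standard normal density. In that case the a posteriori distribution of $(\theta_1, \theta_2)$ is a product of $N(\bar X_i, 1/n)$ laws restricted to the half-plane, so $\theta_2 - \theta_1$ is a normal variable truncated to $\lbrack 0, \infty)$ and $\delta$ has a closed form against which the grid computation can be checked. A minimal Python sketch, with illustrative samples:

```python
import math

# Standard normal density and cdf (stdlib only).
def phi(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def Phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def pitman_ordered(x1, x2, lo=-8.0, hi=8.0, m=400):
    """Evaluate (1.1) for standard normal f by midpoint integration
    over the half-plane theta2 >= theta1 (flat generalized prior)."""
    h = (hi - lo) / m
    grid = [lo + (k + 0.5) * h for k in range(m)]
    L1 = [math.prod(phi(x - t) for x in x1) for t in grid]
    L2 = [math.prod(phi(x - t) for x in x2) for t in grid]
    num1 = num2 = den = 0.0
    for j1, t1 in enumerate(grid):
        for j2 in range(j1, m):
            # half weight on the diagonal cell, which straddles theta2 = theta1
            w = L1[j1] * L2[j2] * (0.5 if j2 == j1 else 1.0)
            den += w
            num1 += t1 * w
            num2 += grid[j2] * w
    return num1 / den, num2 / den

x1 = [0.3, -0.1, 0.5]   # illustrative samples, n = 3 each
x2 = [0.2, 0.0, 0.4]
d1, d2 = pitman_ordered(x1, x2)

# Closed form for the normal case: a posteriori, theta1 + theta2 is
# unrestricted while theta2 - theta1 ~ N(D, 2/n) truncated to [0, inf),
# D = mean(x2) - mean(x1); the truncated-normal mean gives delta_i.
n = len(x1)
m1, m2 = sum(x1) / n, sum(x2) / n
s = math.sqrt(2.0 / n)
D = m2 - m1
Ed = D + s * phi(D / s) / Phi(D / s)     # E[theta2 - theta1 | data]
c1 = (m1 + m2 - Ed) / 2.0
c2 = (m1 + m2 + Ed) / 2.0
print(d1, d2, c1, c2)
```

Note that $\delta_2 \geqq \delta_1$ automatically, since the a posteriori law is supported on $\theta_2 \geqq \theta_1$.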
Let \begin{equation*}\tag{1.2}X_i = \int \theta_i \prod^n_{j = 1} f(X_{ij} - \theta_i)\, d\theta_i \Big/ \int \prod^n_{j = 1} f(X_{ij} - \theta_i)\, d\theta_i, \quad i = 1, 2.\end{equation*} Let $Y_i = (Y_{i1}, Y_{i2}, \cdots, Y_{i,n-1})$, where $Y_{ij} = X_{i,j+1} - X_{i1}$. Let $p(x, y)$ be the conditional density of $X_i$ given $Y_i$ when $\theta_i = 0$. We obtain the following results: (a) If $EE\lbrack(X^2_1 + X^2_2) \mid Y_1, Y_2\rbrack < \infty$ and if $p(x, y) = p(-x, y)$, then the Pitman estimator $(\delta_1, \delta_2)$ given in (1.1) is minimax. The normal and uniform densities are examples for which this condition is satisfied. (b) Let $P(x, y)$ denote the cumulative distribution function corresponding to $p(x, y)$; that is, for fixed $y$, $P(x, y) = \int^x_{-\infty} p(u, y)\, du$. If $EE\lbrack(X^2_1 + X^2_2) \mid Y_1, Y_2\rbrack < \infty$ and if $p(x, y)$ is such that for each $y$, $p(x, y)/(1 - P(x, y))$ increases in $x$ (i.e., increasing hazard rate) and $p(x, y)/P(x, y)$ decreases in $x$, then the Pitman estimator is minimax. The family of gamma densities, $f(t) = t^\alpha \exp(-t)/\Gamma(\alpha + 1)$ for $\alpha > 0$, are examples for which the above condition is satisfied. The proofs of the minimax results (a) and (b) are based essentially on the method of Farrell [4]. (c) An example is given which indicates that in general the Pitman estimator is not minimax. The example is justified by a computation performed by numerical integration, which shows that the risk of the Pitman estimator exceeds the risk of an estimator known to be minimax. The results (a), (b), and (c) indicate that whereas in a related one-dimensional problem, namely estimating a translation parameter $\theta$ subject to $\theta \geqq 0$ (see Farrell [4], Section 7), the Pitman estimator is always minimax (save for moment and continuity conditions), the same is not true for this two-dimensional problem. 
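The two monotonicity conditions in (b) can be checked numerically for a gamma density. For $n = 1$ the statistic $Y_i$ is empty and $X_i = X_{i1}$, so $p(x, y)$ reduces to a location shift of $f$ itself (the shift that centers $f$ at mean zero), and shifting affects neither monotonicity; it therefore suffices to examine $f$. A small Python check, with an illustrative shape $\alpha = 1.5$:

```python
import math

alpha = 1.5   # illustrative shape parameter, alpha > 0

def f(t):
    """Gamma density f(t) = t^alpha exp(-t) / Gamma(alpha + 1), t > 0."""
    return t ** alpha * math.exp(-t) / math.gamma(alpha + 1.0) if t > 0 else 0.0

# Cumulative distribution function by midpoint integration on a fine grid.
h = 1e-3
ts = [h * (k + 0.5) for k in range(20000)]   # covers (0, 20)
F, acc = [], 0.0
for t in ts:
    acc += f(t) * h
    F.append(acc)

# Condition (b): f/(1 - F) increases (increasing hazard rate) and
# f/F decreases; check both on an interior range away from the tails.
idx = list(range(1000, 15000))
hazard = [f(ts[k]) / (1.0 - F[k]) for k in idx]
reverse = [f(ts[k]) / F[k] for k in idx]
ihr = all(a <= b for a, b in zip(hazard, hazard[1:]))
dfr = all(a >= b for a, b in zip(reverse, reverse[1:]))
print(ihr, dfr)
```

The increasing hazard rate reflects the gamma shape $\alpha + 1 > 1$; the decreasing reverse hazard $f/F$ follows from log-concavity of the density.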
(d) Let \begin{equation*}\tag{1.3}\rho(y) = \max \Big\{\sup_{-\infty < x < -1} \Big\lbrack \int^\infty_{-\infty} v\, dv \int^\infty_{-\infty} p(u - v, y_1)\, p(u + v, y_2)\, du \Big/ x \int^x_{-\infty} \int^\infty_{-\infty} p(u - v, y_1)\, p(u + v, y_2)\, du\, dv \Big\rbrack, 2\Big\}.\end{equation*} If \begin{equation*}\tag{1.4}E\rho^2(y) E\lbrack(X^4_1 + X^4_2)(1 + |\log (X^2_1 + X^2_2)|^\beta) \mid Y_1, Y_2\rbrack < \infty \quad \text{for some } \beta > 0,\end{equation*} then the Pitman estimator given in (1.1) is admissible. The normal density is an example for which (1.4) holds. Whereas Katz [6] stated the admissibility result for the normal case, the proof there is not adequate. The proof given here is based on results of Stein [9] and James and Stein [5]. In the next section we give notation. The minimax results are given in Section 3, and admissibility in Section 4.


Information

Published: April, 1968
First available in Project Euclid: 27 April 2007

MathSciNet: MR223007
Digital Object Identifier: 10.1214/aoms/1177698414

Rights: Copyright © 1968 Institute of Mathematical Statistics
