## Abstract

We shall use the notation of [1], Sections 2 and 3. Briefly, multiple decision problems (m.d.p.'s) of Type 1 have the following formulation. On the basis of the outcome of a random variable (rv) $X$, which takes on values in the space $\mathscr{H}$, one element of the finite decision space $\mathscr{D} = \{d_0, d_1, \cdots, d_n\}$ is to be selected. The rv $X$ has a density $p_\theta$ w.r.t. some $\sigma$-finite measure $\mu$ for some $\theta \in \Omega$. The parameter space $\Omega$ is partitioned into $(m + 1)$ disjoint subsets, $\Omega = \Omega_0 \cup \cdots \cup \Omega_m$, and $\Omega_1 \cup \cdots \cup \Omega_m$ is denoted by $\Omega'$. The loss function $L: \Omega' \times \mathscr{D} \rightarrow [0, \infty)$ with $L(\theta, d) = w_{ij} \geqq 0$ for $d = d_j$ and $\theta \in \Omega_i$ $(i = 1, \cdots, m;\ j = 0, \cdots, n)$ is constant over each $\Omega_i$ and is not defined for $\theta \in \Omega_0$.

Type 1 m.d.p.'s satisfy the following assumptions: (i) $\Omega \subset R^s$; (ii) $[\Omega_1] \cap \cdots \cap [\Omega_m] = \Omega_0' \neq \varnothing$, where $[\ ]$ denotes closure w.r.t. the usual topology; (iii) for every test function $\varphi$, $E_\theta[\varphi(X)]$ is continuous in $\theta$. An m.d.p. of Type 1 will be called regular if it satisfies the additional assumption (iv): for every test function $\varphi$, $E_{\theta_0}[\varphi(X)] = 0$ for all $\theta_0 \in \Omega_0'$ implies that $E_\theta[\varphi(X)] = 0$ for all $\theta \in \Omega'$. A procedure $\delta = \delta(\varphi_0, \cdots, \varphi_n)$ is unbiased if and only if for $i = 1, \cdots, m$ the following inequalities hold: \begin{equation*}\tag{1} \sum^n_{j=0} w_{hj} E_\theta[\varphi_j(X)] \geqq \sum^n_{j=0} w_{ij} E_\theta[\varphi_j(X)], \quad \theta \in \Omega_i;\ h = 1, \cdots, m.\end{equation*}

Let $W$ ($M$) denote the class of all unbiased (minimax risk) procedures for an m.d.p. of Type 1. Procedures with the same risk function will be identified. Let $w_{\cdot j}$ denote the point $(w_{1j}, \cdots, w_{mj})$ of $R^m$ and let $S$ denote the convex hull of the points $w_{\cdot j}$ $(j = 0, \cdots, n)$. Finally, let $E$ denote the set of points $e \in S$ with $e_1 = \cdots = e_m$. In [1], Theorem 4.1, it was proved that for problems of Type 1 a sufficient condition for $W \subset M$ is that there exists a point $e \in E$ that is both a minimax and a maximin point of $S$.

This note establishes the following result. THEOREM. For regular problems of Type 1, necessary and sufficient for $W \subset M$ is that one of the following conditions holds: (i) $E = \varnothing$; (ii) $E$ consists of exactly one point $e$ and this point is a minimax point of $S$.

The following non-trivial example shows that for a regular m.d.p. of Type 1 the sufficient condition of Theorem 4.1 of [1] is not necessary. Take $m = 3$, $n = 2$, $w_{\cdot 0} = (3, -3, 0)$, $w_{\cdot 1} = (-3, 3, 0)$, $w_{\cdot 2} = (6, -4, 6)$. Then $e = (0, 0, 0) = \frac{1}{2} w_{\cdot 0} + \frac{1}{2} w_{\cdot 1}$ is the unique point of $S$ with all coordinates equal. Moreover, $e$ is a minimax point, so condition (ii) of the Theorem is satisfied; but $e$ is not a maximin point, since the point $3 w_{\cdot 1}/5 + 2 w_{\cdot 2}/5 = (3/5, 1/5, 12/5)$ has all coordinates strictly larger than zero.
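The arithmetic in the counterexample can be verified directly. Here is a minimal sketch in Python (the names `w0`, `w1`, `w2`, and `combo` are ours, not from [1]) that checks both convex combinations with exact rational arithmetic:

```python
# Exact check of the counterexample: w0, w1, w2 stand for the points
# w_{.0}, w_{.1}, w_{.2} of the text (m = 3, so each lies in R^3).
from fractions import Fraction as F

w0 = (F(3), F(-3), F(0))
w1 = (F(-3), F(3), F(0))
w2 = (F(6), F(-4), F(6))

def combo(coeffs, points):
    """Coordinate-wise convex combination sum_k coeffs[k] * points[k]."""
    return tuple(sum(c * p[i] for c, p in zip(coeffs, points))
                 for i in range(len(points[0])))

# e = (1/2) w0 + (1/2) w1 has all three coordinates equal to 0.
e = combo((F(1, 2), F(1, 2)), (w0, w1))
assert e == (0, 0, 0)

# (3/5) w1 + (2/5) w2 = (3/5, 1/5, 12/5): every coordinate is strictly
# positive, so e = (0, 0, 0) cannot be a maximin point of S.
p = combo((F(3, 5), F(2, 5)), (w1, w2))
assert p == (F(3, 5), F(1, 5), F(12, 5))
assert all(x > 0 for x in p)
```

Using `Fraction` avoids floating-point round-off, so the two combinations are reproduced exactly as stated in the example.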

## Citation

Wilhelmine Stefansky. "A Necessary and Sufficient Condition that for Regular Multiple Decision Problems of Type I Every Unbiased Procedure Has Minimax Risk." Ann. Math. Statist. 41(2): 736–738, April 1970. https://doi.org/10.1214/aoms/1177697126
