The paper studies Wald's minimax risk (M.R.) criterion and Lehmann's unbiasedness condition for a very general class of Type I problems (Section 2) which contains nearly all nonsequential multiple decision problems (m.d.p.'s) from parametric statistics (at least when no reasonable a priori knowledge is available, apart from knowledge restricting the parameter space in, for example, one-sided problems or in trend situations, and provided that the problems can be formulated by means of a loss function which is constant for the various kinds of error). Type I problems turn out to behave in a degenerate way. Generally the M.R. procedure is not unique (Situation 1, see Theorem 3.1 and Sections 5, 11 and 12); when a unique M.R. procedure exists, it is trivial and useless (Situation 2, see Corollary 3.1 and Sections 6–10). We try to remedy this by applying Lehmann's unbiasedness condition (Section 4). This has to be done cautiously because the unbiasedness condition might be too restrictive: sometimes the class $W$ of all unbiased procedures is so small (Corollary 4.1) that $W = \varnothing$ (Theorem 8.1(i)) or contains only poor and useless procedures (Theorem 8.1(iii) and Section 12). Fortunately we can show for some problems in Situation 1 that $W \subset M$, where $M$ denotes the class of all M.R. procedures (cf. Theorem 4.1; the unbiasedness restriction seems to be very attractive when the sufficient conditions of Lemma 4.2 are satisfied, see Theorem 5.1 where $W = M$, Lemma 7.1(ii) and Theorem 11.1; in Section 12 we also have $W \subset M$ but nevertheless the unbiasedness restriction is not attractive). For problems in Situation 2 the unique M.R. procedure $\delta^\ast$ is trivial and useless.
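For reference, the two criteria can be stated in their standard textbook form (these are the usual definitions, not quoted from the paper): a procedure $\delta^\ast$ is minimax risk when it attains the smallest possible maximum risk, and a procedure is unbiased in Lehmann's sense when, in expectation under each $\theta$, the loss evaluated at the true parameter is no larger than at any other parameter value.

```latex
\[
  \sup_{\theta}\, R(\theta,\delta^{\ast}) \;=\; \inf_{\delta}\,\sup_{\theta}\, R(\theta,\delta),
  \qquad R(\theta,\delta) = \mathbb{E}_{\theta}\, L\bigl(\theta,\delta(X)\bigr),
\]
\[
  \mathbb{E}_{\theta}\, L\bigl(\theta',\delta(X)\bigr) \;\ge\; \mathbb{E}_{\theta}\, L\bigl(\theta,\delta(X)\bigr)
  \quad \text{for all } \theta,\theta'.
\]
```

The second display is what makes $W$, the class of unbiased procedures, potentially very small: it imposes one inequality for every pair $(\theta, \theta')$.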
Nevertheless we regard it as an advantage of the unbiasedness restriction when $\delta^\ast \in W$ (Sections 6–10), whereas $\delta^\ast \not\in W$ is regarded as an indication that $W$ might be too small (Theorem 8.1(iv), Remark 8.1 and Lemma 10.1). We always try to obtain the procedure with "the most attractive appropriate optimum property." As shown by Lehmann, ordinary two-decision testing problems (Section 5) and products of such problems (Section 11) do not present extensive difficulties because the unbiasedness restriction is attractive (Theorems 5.1 and 11.1) and reduces our m.d.p.'s to problems in the Neyman-Pearson formulation. Many three-decision two-sided problems are solved (Section 6), though we have to be content with criteria which are not very compelling. Similar results are obtained for some slippage problems (Sections 7–9), where the "optimum" procedure turns out not to be of the "natural" form considered in the literature. For many m.d.p.'s of Type I (except those in Sections 5, 6 and 11) the actual construction of the "optimum" procedure is forbidding and we have to content ourselves with results relating the classes $W$ and $M$ (cf. Sections 7–10 and 12; the problems can be solved only in very simple special cases).
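The interplay between the classes $W$ and $M$ can be made concrete in a toy two-decision problem (a hypothetical illustration constructed here, not an example from the paper): one Bernoulli observation, two simple hypotheses, and constant 0-1 loss. Enumerating all four nonrandomized rules shows that exactly one rule is both minimax and unbiased, so $W = M$, as in the favorable case of Theorem 5.1.

```python
from itertools import product

# Toy two-decision problem (hypothetical illustration): X ~ Bernoulli(p),
# p in {0.3, 0.7}; decision d in {0, 1} means "p = 0.3" / "p = 0.7".
# Constant 0-1 loss: L(theta_i, d_j) = 0 if i == j else 1.
thetas = [0.3, 0.7]

def risk(theta_idx, rule):
    """Risk R(theta, delta) = E_theta[L(theta, delta(X))] for a nonrandomized rule."""
    p = thetas[theta_idx]
    px = [1 - p, p]  # P(X = 0), P(X = 1)
    return sum(px[x] for x in (0, 1) if rule[x] != theta_idx)

rules = list(product((0, 1), repeat=2))  # a rule delta is the pair (delta(0), delta(1))

# Minimax risk criterion: minimize the maximum risk over the parameter space.
max_risk = {r: max(risk(i, r) for i in range(2)) for r in rules}
M = [r for r in rules if max_risk[r] == min(max_risk.values())]

# Lehmann unbiasedness with 0-1 loss and two states: E_theta L(theta', delta)
# must be minimized at theta' = theta, i.e. R(theta, delta) <= 1/2 for every theta.
W = [r for r in rules if all(risk(i, r) <= 0.5 for i in range(2))]

print("minimax rules M:", M)
print("unbiased rules W:", W)
```

Here the rule $\delta(x) = x$ has risk 0.3 under both hypotheses and is the unique member of both $M$ and $W$; the constant rules and the "reversed" rule are biased, having risk above 1/2 under at least one hypothesis. The degeneracies the paper studies arise in far richer Type I problems, where such direct enumeration is impossible.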
Willem Schaafsma. "Minimax Risk and Unbiasedness for Multiple Decision Problems of Type I." Ann. Math. Statist. 40 (5): 1684–1720, October 1969. https://doi.org/10.1214/aoms/1177697382