Open Access
Optimum Designs in Regression Problems, II
J. Kiefer
Ann. Math. Statist. 32(1): 298-325 (March, 1961). DOI: 10.1214/aoms/1177705160

Abstract

Extending the results of Kiefer and Wolfowitz [10], [11], methods are obtained for characterizing and computing optimum regression designs in various settings, and examples are given where $D$-optimum designs are computed. In Section 1 we introduce the main definitions and notation which will be used in the paper, and discuss briefly the roles of invariance, randomization, number of points at which observations are taken, and nonlinearity of the model, in our results. In Section 2 we prove the main theoretical results. We are concerned with the estimation of $s$ out of the $k$ parameters, extending an approach developed in [10] and [11] in the case $s = k$. There is no direct way of ascertaining whether or not a given design $\xi^*$ is $D$-optimum for (minimizes the generalized variance of the best linear estimators of) the $s$ chosen parameters, and Theorems 1 and 2 provide algorithms for determining whether or not a given $\xi^*$ is $D$-optimum. If all $k$ parameters are estimable under $\xi^*$, we can use (2.7) to decide whether $\xi^*$ is $D$-optimum, while if not all $k$ parameters are estimable we must use the somewhat more complicated condition (2.17) (of which part (a) or (b) is necessary for optimality, while (a), (c), or (d) is sufficient). An addition to Theorem 2 near the end of Section 3 provides assistance in using (2.17)(b). Theorem 3 of Section 2 characterizes the set of information matrices of the $D$-optimum designs. In Section 3 we give a geometric interpretation of the results of Section 2, and compare the present approach with that of [10]. In the case $s = k$, the present approach reduces to that of Section 5 of [10] and of [11]. When $1 < s < k$, we obtain an algorithm which differs from that of Section 4 of [10] and which appears to be computationally easier to use. When $s = 1$, the results of the present paper are shown to reduce to those of Section 2 of [10]; in particular, we obtain the game-theoretic results without using the game-theoretic machinery of [10]. In Section 4 we determine $D$-optimum designs for the problems of quadratic regression on a $q$-cube and polynomial regression on a real interval with $1 < s < k$. Part II of the paper is devoted entirely to the determination of $D$-optimum designs for various problems in the setting of simplex designs considered by Scheffé [12]. Various unsolved problems are mentioned throughout the paper. Further examples will be published elsewhere.
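As an illustration of the kind of check the abstract describes, the following is a minimal numerical sketch for the case $s = k$, where the present approach reduces to that of [11]: by the Kiefer-Wolfowitz equivalence theorem, a design $\xi^*$ with information matrix $M(\xi^*)$ is $D$-optimum if and only if $\max_x f(x)' M(\xi^*)^{-1} f(x) = k$. The conditions (2.7) and (2.17) for estimating $s < k$ of the parameters are not reproduced here. The quadratic-regression example, the grid, and the function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch (not from the paper): check the s = k D-optimality condition of
# Kiefer-Wolfowitz [11]:  xi* is D-optimum  iff  max_x f(x)' M(xi*)^{-1} f(x) = k.

def info_matrix(points, weights, f):
    """M(xi) = sum_i w_i f(x_i) f(x_i)^T for a discrete design xi."""
    vecs = np.array([f(x) for x in points])          # shape (n, k)
    return vecs.T @ (weights[:, None] * vecs)

def max_variance(points, weights, f, grid):
    """Maximum over the grid of d(x, xi) = f(x)' M(xi)^{-1} f(x)."""
    M_inv = np.linalg.inv(info_matrix(points, weights, f))
    return max(f(x) @ M_inv @ f(x) for x in grid)

# Illustrative example (assumed): quadratic regression on [-1, 1],
# f(x) = (1, x, x^2), k = 3.  The design placing mass 1/3 at -1, 0, 1 is the
# classical D-optimum design, so the maximum below should be approximately 3.
f = lambda x: np.array([1.0, x, x * x])
points = np.array([-1.0, 0.0, 1.0])
weights = np.array([1 / 3, 1 / 3, 1 / 3])
grid = np.linspace(-1.0, 1.0, 2001)

d_max = max_variance(points, weights, f, grid)
print(f"max_x d(x, xi*) = {d_max:.6f}  (k = 3)")     # close to 3 => D-optimum
```

A design for which this maximum exceeds $k$ is not $D$-optimum, and the maximizing point indicates where design weight should be moved; this is the geometric picture developed in Section 3 of the paper.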

Citation


J. Kiefer. "Optimum Designs in Regression Problems, II." Ann. Math. Statist. 32 (1) 298 - 325, March, 1961. https://doi.org/10.1214/aoms/1177705160

Information

Published: March, 1961
First available in Project Euclid: 27 April 2007

zbMATH: 0099.13502
MathSciNet: MR123408
Digital Object Identifier: 10.1214/aoms/1177705160

Rights: Copyright © 1961 Institute of Mathematical Statistics
