Abstract
Optimal Bayesian experimental designs for estimation and prediction in linear models are discussed. The designs are optimal for estimating a linear combination of the regression parameters $\mathbf{c}^T\theta$, or for prediction at a point where the expected response is $\mathbf{c}^T\theta$, under squared error loss. A distribution on $\mathbf{c}$ is introduced to represent the interest in particular linear combinations of the parameters. In the usual notation for linear models, minimizing the preposterior expected loss leads to minimizing the quantity $\mathrm{tr}\,\psi(R + X^T X)^{-1}$, where the matrix $\psi$ is defined to be $E(\mathbf{c}\mathbf{c}^T)$ and the matrix $R$ is the prior precision matrix of $\theta$. A geometric interpretation of the optimal designs is given, which leads to a parallel of Elfving's theorem for $\mathbf{c}$-optimality. A bound is given for the minimum number of points at which it is necessary to take observations. Some examples of optimal Bayesian designs are given, and optimal designs for prediction in polynomial regression are derived. The optimality of rounding non-integer designs to integer designs is discussed.
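As a concrete illustration of the criterion stated in the abstract, the following is a minimal sketch (not from the paper) that evaluates $\mathrm{tr}\,\psi(R + X^T X)^{-1}$ for a candidate design. The particular design matrix, prior precision $R$, and vector $\mathbf{c}$ used below are illustrative assumptions chosen only to make the computation runnable.

```python
import numpy as np

def bayes_design_criterion(X, R, psi):
    """Evaluate tr[psi (R + X^T X)^{-1}] for a candidate design.

    X   : (n, k) design matrix, one row per design point
    R   : (k, k) prior precision matrix of theta
    psi : (k, k) matrix E[c c^T] weighting the linear combinations of interest
    """
    posterior_precision = R + X.T @ X
    # Form the inverse via a linear solve rather than explicit inversion.
    inv_precision = np.linalg.solve(posterior_precision, np.eye(X.shape[1]))
    return np.trace(psi @ inv_precision)

# Illustrative example (assumed, not from the paper): simple linear regression
# with four observations at x = -1 and x = 1, a weak prior, and interest in
# prediction at x = 0.5, so c = (1, 0.5) and psi = c c^T.
X = np.column_stack([np.ones(4), np.array([-1.0, -1.0, 1.0, 1.0])])
R = 0.1 * np.eye(2)
c = np.array([1.0, 0.5])
psi = np.outer(c, c)
print(bayes_design_criterion(X, R, psi))
```

A smaller criterion value corresponds to a smaller preposterior expected loss, so candidate designs can be compared by evaluating this quantity for each.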
Citation
Kathryn Chaloner. "Optimal Bayesian Experimental Design for Linear Models." Ann. Statist. 12(1): 283-300, March 1984. https://doi.org/10.1214/aos/1176346407