Abstract
Let $M_0$ be a normal linear regression model and let $M_1, \ldots, M_K$ be distinct proper linear submodels of $M_0$. Let $\hat k \in \{0, \ldots, K\}$ be a model selection rule based on observed data from the true model. Given $\hat k$, let the unknown parameters of the selected model $M_{\hat k}$ be fitted by the maximum likelihood method. A loss function is introduced which depends additively on two parts: (i) a measure of the difference between the fitted model $M_{\hat k}$ and the true model; and (ii) a measure $C_{\hat k}$ of the "complexity" of the selected model. A natural model selection rule $\bar{k}$, which minimizes an empirical version of this loss, is shown to be admissible and very nearly Bayes.
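The selection rule described above can be illustrated with a minimal sketch: fit each candidate submodel by least squares (the maximum likelihood fit under normal errors), form an empirical criterion combining a scaled lack-of-fit term with the complexity penalty $C_k$, and select the minimizer. The sketch below is an illustration under assumed choices, not Stone's exact loss; the names `select_model`, `submodels`, and `complexity` are hypothetical, and the use of the full model $M_0$ to estimate $\sigma^2$ is one common convention.

```python
import numpy as np

def select_model(X, y, submodels, complexity):
    """Choose the candidate minimizing an empirical penalized loss.

    X          : (n, p) design matrix of the full model M_0.
    y          : (n,) response vector.
    submodels  : list of column-index arrays; submodels[0] is M_0 itself.
    complexity : penalties C_0, ..., C_K, one per candidate model.
    """
    n = len(y)
    # Estimate sigma^2 from the full model M_0 (an assumed convention;
    # the paper's exact construction may differ).
    X0 = X[:, submodels[0]]
    beta0, *_ = np.linalg.lstsq(X0, y, rcond=None)
    sigma2 = np.sum((y - X0 @ beta0) ** 2) / (n - X0.shape[1])

    crit = []
    for cols, C_k in zip(submodels, complexity):
        Xk = X[:, cols]
        beta, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        rss = np.sum((y - Xk @ beta) ** 2)
        # Empirical loss: scaled lack of fit plus complexity C_k.
        crit.append(rss / sigma2 + C_k)
    return int(np.argmin(crit))
```

For example, taking `complexity = [2 * len(cols) for cols in submodels]` (twice the parameter count) makes the criterion equivalent, up to an additive constant, to Mallows' $C_p$; other choices of $C_k$ trade accuracy against parsimony differently.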
Citation
Charles J. Stone. "Admissible Selection of an Accurate and Parsimonious Normal Linear Regression Model." Ann. Statist. 9(3): 475–485, May 1981. https://doi.org/10.1214/aos/1176345452