Abstract
We study estimation of the parameters of a Gaussian linear model $\mathscr{M}_0$ when we entertain the possibility that $\mathscr{M}_0$ is invalid and a larger model $\mathscr{M}_1$ should be assumed. Estimates are called robust if their maximum risk over $\mathscr{M}_1$ is finite; the most robust estimate is the least squares estimate under $\mathscr{M}_1$. We apply notions of Hodges and Lehmann (1952) and Efron and Morris (1971) to obtain (biased) estimates that do well under $\mathscr{M}_0$ at a small price in robustness. Extensions to confidence intervals, simultaneous estimation of several parameters, and large-sample approximations applying to nested parametric models are also discussed.
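The bias-robustness tradeoff the abstract describes can be seen in a small simulation. The following is a minimal Monte Carlo sketch, not the paper's construction: it uses a toy two-regressor design and a fixed convex combination of the restricted and full least squares estimates in place of the Hodges-Lehmann and Efron-Morris compromises developed in the paper. The design, sample size, and shrinkage weight `lam` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, lam = 50, 10_000, 0.5                 # lam: assumed shrinkage weight

x1 = rng.standard_normal(n)                    # regressor kept by both models
x2 = 0.7 * x1 + 0.7 * rng.standard_normal(n)   # correlated regressor dropped by M0
X_small = x1[:, None]                          # design under M0 (beta2 = 0 assumed)
X_full = np.column_stack([x1, x2])             # design under M1
P0 = np.linalg.pinv(X_small)                   # maps y -> LS estimate under M0
P1 = np.linalg.pinv(X_full)                    # maps y -> LS estimates under M1
beta1 = 1.0                                    # true coefficient of interest

def mse_of_beta1(beta2):
    """Monte Carlo MSE of three estimates of beta1 when the true second
    coefficient is beta2 (beta2 = 0 means M0 is valid)."""
    Y = (beta1 * x1 + beta2 * x2)[:, None] + rng.standard_normal((n, reps))
    b0 = (P0 @ Y)[0]                # restricted LS: best under M0, biased otherwise
    b1 = (P1 @ Y)[0]                # full-model LS: unbiased, i.e. robust over M1
    bc = lam * b0 + (1 - lam) * b1  # simple compromise estimate
    return [float(np.mean((b - beta1) ** 2)) for b in (b0, b1, bc)]

for b2 in (0.0, 0.5, 2.0):
    print(f"beta2={b2}: MSE(M0-LS, M1-LS, combo) =", mse_of_beta1(b2))
```

Under this setup the restricted estimate wins when $\mathscr{M}_0$ holds but its risk grows without bound in $\beta_2$ (omitted-variable bias), the full-model estimate has constant risk over $\mathscr{M}_1$, and the compromise sits in between, illustrating why a small bias under $\mathscr{M}_0$ can buy bounded maximum risk.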
Citation
P. J. Bickel. "Parametric Robustness: Small Biases can be Worthwhile." Ann. Statist. 12 (3) 864–879, September 1984. https://doi.org/10.1214/aos/1176346707