Abstract
Robustness and efficiency of a parameter estimate $T$ can be assessed by comparing the fitted parametric distribution $P_T$ with the actual distribution, which is assumed to lie near the parametric family $\{P_\theta:\theta\in\Theta\}$. Asymptotic lower bounds are established for the minimax risk over distributions near the parametric model, taking as loss function a monotone increasing function of the Hellinger distance between the actual distribution of the sample and the fitted distribution determined by $T$. The set of marginal distributions considered in the minimax calculation is a subset of the Hellinger ball of radius $O(n^{-1/2})$ centered at $P_\theta$, $n$ being the sample size. When the loss function is bounded, the lower bound on maximum risk can be attained asymptotically. However, an estimator of $\theta$ which is asymptotically minimax for bounded loss functions may be far from optimal when the loss function is unbounded. Such divergent behavior is exhibited, for instance, by the sample mean in nearly normal models.
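For orientation, a common normalization of the Hellinger distance between distributions $P$ and $Q$ with densities $p$ and $q$ relative to a dominating measure $\mu$ is (the paper's convention may differ by a constant factor)

$$H(P, Q) = \left[\tfrac{1}{2}\int \bigl(p^{1/2} - q^{1/2}\bigr)^{2}\, d\mu\right]^{1/2},$$

so the minimax calculation described above runs over marginal distributions $P$ satisfying $H(P, P_\theta) \le c\, n^{-1/2}$ for some constant $c > 0$, a Hellinger ball that shrinks as the sample size grows.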
Citation
Rudolf Beran. "Asymptotic Lower Bounds for Risk in Robust Estimation." Ann. Statist. 8(6): 1252–1264, November 1980. https://doi.org/10.1214/aos/1176345198