We consider the problem of robust inference for the binomial $(m, \pi)$ model. The discreteness of the data and the boundedness of both the parameter space and the sample space mean that standard robustness theory gives surprising results. For example, the maximum likelihood estimator (MLE) is quite robust: it cannot be improved on for $m=1$, but it can be for $m>1$. We discuss four other classes of estimators: $M$-estimators, minimum disparity estimators, optimal MGP estimators, and a new class of estimators which we call $E$-estimators. We show that $E$-estimators have a non-standard asymptotic theory which challenges the accepted relationships between robustness concepts and thereby provides new perspectives on these concepts.
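To make the robustness contrast concrete, the following is a minimal numerical sketch, not taken from the paper: it compares the binomial MLE $\hat\pi = \bar{X}/m$ with a minimum Hellinger distance estimate, one member of the minimum disparity class the abstract mentions. The grid-search implementation, the contamination scheme, and all variable names are illustrative assumptions, not the paper's construction.

```python
import numpy as np
from math import comb

def binom_pmf(m, pi):
    """Binomial(m, pi) pmf over the full support {0, ..., m}."""
    k = np.arange(m + 1)
    coeffs = np.array([comb(m, j) for j in k], dtype=float)
    return coeffs * pi**k * (1 - pi) ** (m - k)

def mle(x, m):
    """Maximum likelihood estimate: the sample mean scaled by m."""
    return x.mean() / m

def min_hellinger(x, m, n_grid=999):
    """Minimum Hellinger distance estimate via grid search over pi.

    Equivalent to maximizing the affinity sum_k sqrt(f_n(k) * p_pi(k)),
    where f_n is the empirical pmf of the sample.
    """
    f_n = np.bincount(x, minlength=m + 1) / len(x)
    grid = np.linspace(1e-3, 1 - 1e-3, n_grid)
    affinities = [np.sum(np.sqrt(f_n * binom_pmf(m, pi))) for pi in grid]
    return grid[int(np.argmax(affinities))]

# Illustrative contamination: replace 10% of a Binomial(10, 0.3) sample
# with the extreme value m, then compare the two estimates.
rng = np.random.default_rng(0)
m, pi_true, n = 10, 0.3, 500
x = rng.binomial(m, pi_true, size=n)
x[: n // 10] = m
print("MLE:", mle(x, m), "  min-Hellinger:", min_hellinger(x, m))
```

Under this kind of point contamination the MLE is pulled toward the contaminating value, while the square root in the affinity sharply down-weights cells where the model assigns negligible probability, so the minimum Hellinger estimate stays close to the uncontaminated value.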
"Robust fitting of the binomial model." Ann. Statist. 29 (4) 1117 - 1136, August 2001. https://doi.org/10.1214/aos/1013699996