We analyze the asymptotic behavior of maximum likelihood estimators (MLE) in convex dominated models when the true distribution generating the independent data does not necessarily belong to the model. Inspired by the Hellinger distance and its properties, we introduce a family of divergences (contrast functions) which allow a unified treatment of well- and misspecified convex models. Convergence and rates of convergence of the MLE with respect to our divergences are obtained from inequalities satisfied by these divergences and results from empirical process theory (uniform laws of large numbers and maximal inequalities). As a particular case we recover existing results for Hellinger convergence of MLE in well-specified convex models. Four examples are considered: mixtures of discrete distributions, monotone densities, decreasing failure rate distributions and a finite-dimensional parametric model.
"Convex Models, MLE and Misspecification." Ann. Statist. 29 (1) 94-123, February 2001. https://doi.org/10.1214/aos/996986503
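The first example class in the abstract, mixtures of discrete distributions, lends itself to a small numerical illustration. The sketch below is not the paper's method; it is a hypothetical demonstration of the setting: the model is the convex hull of two fixed pmfs, the data-generating pmf lies outside that hull (misspecification), the mixture weights are fit by maximum likelihood via a standard EM iteration, and the Hellinger distance from the fitted mixture to the truth stays bounded away from zero. All component pmfs and the truth are invented for illustration.

```python
import numpy as np

def hellinger(p, q):
    # Hellinger distance between two pmfs on the same finite support
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def mle_mixture_weights(counts, components, n_iter=500):
    """EM iteration for the MLE of the mixture weights in the convex
    model { sum_k w_k * components[k] }, given counts per support point.
    `components` is a (K, S) array of K fixed pmfs on S support points."""
    K = components.shape[0]
    w = np.full(K, 1.0 / K)          # start from uniform weights
    for _ in range(n_iter):
        mix = w @ components          # current mixture pmf on the support
        # responsibility of component k at each support point
        resp = (w[:, None] * components) / mix[None, :]
        w = resp @ counts             # expected counts per component
        w /= w.sum()                  # renormalize to the simplex
    return w

# Hypothetical misspecified example: the model fixes prob. 0.2 on the
# middle point, but the truth puts 0.6 there, so no mixture fits exactly.
components = np.array([[0.7, 0.2, 0.1],
                       [0.1, 0.2, 0.7]])
truth = np.array([0.2, 0.6, 0.2])     # outside the convex hull
counts = 1000 * truth                 # idealized sample counts

w = mle_mixture_weights(counts, components)
fit = w @ components
print("MLE weights:", w)              # symmetric setup -> weights near (0.5, 0.5)
print("Hellinger(fit, truth):", hellinger(fit, truth))
```

Because the truth is outside the model, the MLE converges to the pseudo-true mixture (here `(0.4, 0.2, 0.4)`) and the Hellinger distance to the truth remains strictly positive, which is the misspecified regime the abstract's divergences are built to handle.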