Many works in statistics aim to design a universal estimation procedure, that is, an estimator that converges to the best approximation of the (unknown) data-generating distribution in a model, without any assumption on this distribution. This question is of major interest, in particular because universality leads to robustness of the estimator. In this paper, we tackle the problem of universal estimation using a minimum distance estimator presented in Briol et al. (2019) based on the Maximum Mean Discrepancy. We show that the estimator is robust both to dependence and to the presence of outliers in the dataset. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, and we support our findings with numerical simulations.
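To give a concrete sense of the approach, here is a minimal sketch (not the paper's implementation) of MMD-based minimum distance estimation by stochastic gradient descent, for the toy case of a Gaussian location model N(theta, 1) with a Gaussian kernel. The function names, step sizes, and bandwidth are illustrative choices. For a translation-invariant kernel and a location family, the within-model term E[k(X, X')] of the squared MMD does not depend on theta, so only the cross term with the data contributes to the gradient; outliers far from the bulk contribute essentially nothing because the kernel vanishes there, which illustrates the robustness discussed above.

```python
import numpy as np

def gaussian_kernel_grad(x, y, bw):
    """Pairwise gradient d k(x_i, y_j) / d x_i for the Gaussian kernel
    k(x, y) = exp(-(x - y)^2 / (2 * bw^2))."""
    d = x[:, None] - y[None, :]
    k = np.exp(-d**2 / (2 * bw**2))
    return -(d / bw**2) * k

def mmd_sgd_location(data, steps=2000, m=50, lr=0.5, bw=1.0, seed=0):
    """Illustrative sketch: fit the location theta of a N(theta, 1) model by
    SGD on the squared MMD, using the reparameterization x = theta + z,
    z ~ N(0, 1). All hyperparameters here are ad hoc choices."""
    rng = np.random.default_rng(seed)
    theta = float(np.median(data))  # robust initialization
    for t in range(steps):
        z = rng.standard_normal(m)
        x = theta + z                 # model samples (reparameterized)
        y = rng.choice(data, size=m)  # data minibatch
        # Only the cross term -2 E[k(X, Y)] of the squared MMD depends on
        # theta here, since k(theta + z, theta + z') is free of theta.
        g = -2.0 * gaussian_kernel_grad(x, y, bw).mean()
        theta -= lr / np.sqrt(t + 1) * g  # decaying step size
    return theta
```

Running this on Gaussian data contaminated by a cluster of distant outliers recovers the location of the clean bulk, whereas the sample mean would be pulled toward the outliers.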
We would like to thank Guillaume Lecué (ENSAE Paris) for his helpful comments, Mathieu Gerber (University of Bristol), who fixed a mistake in the constants in Proposition 4.1, and George Wynne (Imperial College) for his very informative comments on the coefficients. We would also like to thank the anonymous Referees and the Associate Editor for their insightful comments, which helped to improve the structure of the paper. All the remaining mistakes are ours.
"Finite sample properties of parametric MMD estimation: Robustness to misspecification and dependence." Bernoulli 28 (1) 181 - 213, February 2022. https://doi.org/10.3150/21-BEJ1338