The Annals of Statistics

Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy

Yannis G. Yatracos

Abstract

Let $(\mathscr{X}, \mathscr{A})$ be a space with a $\sigma$-field, $M = \{P_s; s \in \Theta\}$ a family of probability measures on $\mathscr{A}$ with $\Theta$ arbitrary, and $X_1, \cdots, X_n$ i.i.d. observations from $P_\theta.$ Define $\mu_n(A) = (1/n) \sum^n_{i = 1} I_A(X_i),$ the empirical measure indexed by $A \in \mathscr{A}.$ Assume $\Theta$ is totally bounded when metrized by the $L_1$ distance between measures. Robust minimum distance estimators $\hat{\theta}_n$ are constructed for $\theta$, and the resulting rate of convergence is shown to depend naturally on an entropy function for $\Theta$.
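To make the objects in the abstract concrete, the following is a minimal sketch (not the paper's actual construction, which works over a class of sets derived from the family $M$): it computes the empirical measure $\mu_n$ from an i.i.d. sample and picks, from a finite grid standing in for a totally bounded $\Theta$, the parameter whose law is closest to $\mu_n$ in $L_1$ distance. The binomial family and the grid are illustrative assumptions.

```python
import random
from collections import Counter
from math import comb

def empirical_measure(sample):
    """mu_n(A) = (1/n) sum I_A(X_i), stored as a pmf over observed outcomes."""
    n = len(sample)
    return {x: c / n for x, c in Counter(sample).items()}

def binom_pmf(k, m, p):
    """Binomial(m, p) probability mass at k (illustrative model family P_theta)."""
    return comb(m, k) * p**k * (1 - p)**(m - k)

def l1_distance(pmf_model, mu_n, support):
    """L1 distance between a model pmf and the empirical measure on a support."""
    return sum(abs(pmf_model(k) - mu_n.get(k, 0.0)) for k in support)

def min_distance_estimate(sample, theta_grid, m):
    """Minimum distance estimate: the grid point whose Binomial(m, theta)
    law is L1-closest to the empirical measure of the sample."""
    mu_n = empirical_measure(sample)
    support = range(m + 1)
    return min(
        theta_grid,
        key=lambda t: l1_distance(lambda k: binom_pmf(k, m, t), mu_n, support),
    )
```

As $n$ grows, the estimate concentrates near the true parameter; the paper's point is that the achievable rate is governed by how finely $\Theta$ can be covered at each scale, i.e. by Kolmogorov's entropy of $\Theta$ under the $L_1$ metric.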

Article information

Source
Ann. Statist., Volume 13, Number 2 (1985), 768-774.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176349553

Digital Object Identifier
doi:10.1214/aos/1176349553

Mathematical Reviews number (MathSciNet)
MR790571

Zentralblatt MATH identifier
0576.62057
