The Annals of Statistics

Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy

Yannis G. Yatracos


Abstract

Let $(\mathscr{X}, \mathscr{A})$ be a space with a $\sigma$-field, $M = \{P_s; s \in \Theta\}$ be a family of probability measures on $\mathscr{A}$ with $\Theta$ arbitrary, and $X_1, \cdots, X_n$ i.i.d. observations from $P_\theta$. Define $\mu_n(A) = (1/n) \sum^n_{i=1} I_A(X_i)$, the empirical measure indexed by $A \in \mathscr{A}$. Assume $\Theta$ is totally bounded when metrized by the $L_1$ distance between measures. Robust minimum distance estimators $\hat{\theta}_n$ are constructed for $\theta$, and the resulting rate of convergence is shown to depend naturally on an entropy function for $\Theta$.
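To make the construction concrete, here is a minimal numerical sketch in the spirit of the abstract: over a finite $\epsilon$-net of candidate densities (total boundedness guarantees such a net exists), pick the candidate whose probabilities best match the empirical measure uniformly over the comparison sets $\{x : p_i(x) > p_j(x)\}$. This is an illustrative toy, not the paper's exact construction; the function names, the Gaussian net, and the grid-based integration are all assumptions made for the example.

```python
import numpy as np

def minimum_distance_estimate(samples, candidates, grid):
    """Toy minimum distance estimator: choose the candidate s minimizing
    sup_A |P_s(A) - mu_n(A)|, where A ranges over the comparison sets
    {x : p_i(x) > p_j(x)} induced by the finite net of candidate densities.
    (Illustrative sketch; integrals are approximated on a fixed grid.)"""
    dx = grid[1] - grid[0]
    dens_grid = np.array([p(grid) for p in candidates])     # p_s on the grid
    dens_samp = np.array([p(samples) for p in candidates])  # p_s at the data
    k = len(candidates)

    # Build the comparison sets A_ij = {x : p_i(x) > p_j(x)} once.
    sets_grid, sets_samp = [], []
    for i in range(k):
        for j in range(k):
            if i != j:
                sets_grid.append(dens_grid[i] > dens_grid[j])
                sets_samp.append(dens_samp[i] > dens_samp[j])

    best, best_disc = 0, np.inf
    for s in range(k):
        disc = 0.0
        for A_g, A_x in zip(sets_grid, sets_samp):
            P_s_A = dens_grid[s][A_g].sum() * dx  # candidate probability of A
            mu_n_A = A_x.mean()                   # empirical measure of A
            disc = max(disc, abs(P_s_A - mu_n_A))
        if disc < best_disc:
            best, best_disc = s, disc
    return best

# Hypothetical usage: a net of unit-variance Gaussians indexed by their means.
def gaussian(mu):
    return lambda x: np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

rng = np.random.default_rng(0)
means = np.linspace(-2.0, 2.0, 9)       # finite epsilon-net for Theta
net = [gaussian(m) for m in means]
data = rng.normal(0.7, 1.0, size=500)   # i.i.d. draws from the true P_theta
grid = np.linspace(-8.0, 8.0, 4001)
print("estimated mean:", means[minimum_distance_estimate(data, net, grid)])
```

The key point the sketch illustrates is the trade-off the paper quantifies: a finer net (smaller $\epsilon$) reduces approximation error but enlarges the class of comparison sets, and the rate of convergence balances these through the entropy of $\Theta$.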

Article information

Source
Ann. Statist., Volume 13, Number 2 (1985), 768-774.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176349553

Digital Object Identifier
doi:10.1214/aos/1176349553

Mathematical Reviews number (MathSciNet)
MR790571

Zentralblatt MATH identifier
0576.62057

JSTOR
links.jstor.org

Subjects
Primary: 62G05: Estimation
Secondary: 62G30: Order statistics; empirical distribution functions

Keywords
Minimum distance estimation; rates of convergence; Kolmogorov's entropy; density estimation

Citation

Yatracos, Yannis G. Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy. Ann. Statist. 13 (1985), no. 2, 768-774. doi:10.1214/aos/1176349553. https://projecteuclid.org/euclid.aos/1176349553
