## The Annals of Statistics

### On Moderate Deviation Theory in Estimation

Wilbert C. M. Kallenberg

#### Abstract

The performance of a sequence of estimators $\{T_n\}$ of $\theta$ can be measured by the probability concentration of the estimator in an $\varepsilon_n$-neighborhood of $\theta$. Classical choices of $\varepsilon_n$ are $\varepsilon_n = cn^{-1/2}$ (the contiguous case) and $\varepsilon_n = \varepsilon$ fixed for all $n$ (the non-local case). In this article all sequences $\{\varepsilon_n\}$ with $\lim_{n\rightarrow\infty} \varepsilon_n = 0$ and $\lim_{n\rightarrow\infty} \varepsilon_n n^{1/2} = \infty$ are considered. In this way the statistically important choices of small $\varepsilon$'s are investigated in a uniform sense; classical results on local and non-local efficiency gain strength by extension to larger regions of neighborhoods; and one can investigate where optimality passes into non-optimality when, for instance, an estimator is locally efficient but non-locally inefficient. The theory of moderate deviation and Cramér-type large deviation probabilities plays an important role in this context. Examples of the performance of, in particular, maximum likelihood estimators are presented for $k$-parameter exponential families, a curved exponential family and the double-exponential family.
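
The intermediate regime described above can be illustrated with a concrete family of sequences (this example is not from the paper itself, only a sketch of the condition). Any power sequence strictly between the contiguous rate $n^{-1/2}$ and a fixed $\varepsilon$ qualifies:

```latex
% Illustrative example (not from the paper): take, for a constant c > 0,
%   \varepsilon_n = c\, n^{-\beta}, \qquad 0 < \beta < 1/2.
% Then both defining conditions of the moderate deviation regime hold:
%   \lim_{n\to\infty} \varepsilon_n = \lim_{n\to\infty} c\, n^{-\beta} = 0,
%   \lim_{n\to\infty} \varepsilon_n n^{1/2} = \lim_{n\to\infty} c\, n^{1/2-\beta} = \infty,
% since 1/2 - \beta > 0. The boundary cases \beta = 1/2 (contiguous) and
% \beta = 0 (non-local, fixed \varepsilon) are excluded.
```

As $\beta$ ranges over $(0, 1/2)$, such sequences sweep out the whole spectrum of neighborhood sizes between the two classical cases, which is what allows local and non-local optimality statements to be connected.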

#### Article information

**Source**
Ann. Statist., Volume 11, Number 2 (1983), 498–504.

**Dates**
First available in Project Euclid: 12 April 2007

**Permanent link**
https://projecteuclid.org/euclid.aos/1176346156

**Digital Object Identifier**
doi:10.1214/aos/1176346156

**Mathematical Reviews number (MathSciNet)**
MR696062

**Zentralblatt MATH identifier**
0515.62027
