Abstract
In this paper we consider estimation of the location parameter $\theta \in R^d$ based on a random sample from $(\theta + X, Y),$ where $X$ is a $d$-dimensional random vector, $Y$ is a random element of some measure space $\mathscr{Y},$ and $(X, Y)$ has a known distribution. We first define the Fisher information $\mathscr{J}(\theta + X, Y)$ and the inverse information $\mathscr{J}^-(\theta + X, Y)$ under no regularity conditions, and investigate the properties of these quantities. Supposing that $E|X|^\delta < \infty$ for some $\delta > 0$, we show that for $n$ sufficiently large the Pitman estimator $\hat{\theta}_n$ of $\theta$ based on a random sample of size $n$ is well defined and unbiased, and that its covariance, which is independent of $\theta$, satisfies the inequality $n \operatorname{Cov} \hat{\theta}_n \geqq \mathscr{J}^-(\theta + X, Y)$. Moreover, $\lim_{n\rightarrow \infty} n \operatorname{Cov} \hat{\theta}_n = \mathscr{J}^-(\theta + X, Y)$ and $n^\frac{1}{2}(\hat{\theta}_n - \theta)$ is asymptotically normal with mean zero and covariance $\mathscr{J}^-(\theta + X, Y)$.
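For a known density $f$ of $X$ (and ignoring the auxiliary component $Y$), the Pitman estimator of a one-dimensional location parameter can be written as $\hat{\theta}_n = \int \theta \prod_i f(x_i - \theta)\, d\theta \big/ \int \prod_i f(x_i - \theta)\, d\theta$. The following is a minimal numerical sketch of that formula; the function names and the grid-based quadrature are illustrative choices, not part of the paper. For standard normal data the Pitman estimator coincides with the sample mean, which gives a simple sanity check.

```python
import numpy as np

def pitman_estimator(sample, logf, grid):
    """Numerically approximate the Pitman (minimum-risk equivariant)
    estimator of a location parameter, given the log-density logf of X
    and a grid of candidate shift values theta."""
    # Log-likelihood of each candidate shift theta on the grid.
    ll = np.array([np.sum(logf(sample - t)) for t in grid])
    # Subtract the max before exponentiating for numerical stability.
    w = np.exp(ll - ll.max())
    # Weighted average of theta over the grid approximates the ratio
    # of integrals defining the Pitman estimator.
    return np.sum(grid * w) / np.sum(w)

# Example: standard normal X, for which the Pitman estimator
# reduces to the sample mean.
rng = np.random.default_rng(0)
sample = 3.0 + rng.standard_normal(200)  # true theta = 3.0
logf = lambda x: -0.5 * x**2             # N(0,1) log-density up to a constant
center = sample.mean()
grid = center + np.linspace(-1.0, 1.0, 4001)
theta_hat = pitman_estimator(sample, logf, grid)
```

Here `theta_hat` agrees with `sample.mean()` up to discretization error of the grid; for non-normal densities the two generally differ.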
Citation
Sidney C. Port. Charles J. Stone. "Fisher Information and the Pitman Estimator of a Location Parameter." Ann. Statist. 2 (2) 225 - 247, March, 1974. https://doi.org/10.1214/aos/1176342660