Fisher Information and the Pitman Estimator of a Location Parameter
Sidney C. Port, Charles J. Stone
Ann. Statist. 2(2): 225-247 (March, 1974). DOI: 10.1214/aos/1176342660


In this paper we consider estimation of the location parameter $\theta \in R^d$ based on a random sample from $(\theta + X, Y),$ where $X$ is a $d$-dimensional random vector, $Y$ is a random element of some measure space $\mathscr{Y},$ and $(X, Y)$ has a known distribution. We first define the Fisher information $\mathscr{J}(\theta + X, Y)$ and the inverse information $\mathscr{J}^-(\theta + X, Y)$ under no regularity conditions. The properties of these quantities are investigated. Supposing that $E|X|^\delta < \infty$ for some $\delta > 0$ we show that for $n$ sufficiently large the Pitman estimator $\hat{\theta}_n$ of $\theta$ based on a random sample of size $n$ is well defined, unbiased, and its covariance, which is independent of $\theta$, satisfies the inequality $n \operatorname{Cov} \hat{\theta}_n \geqq \mathscr{J}^-(\theta + X, Y)$. Moreover, $\lim_{n\rightarrow \infty} n \operatorname{Cov} \hat{\theta}_n = \mathscr{J}^-(\theta + X, Y)$ and $n^\frac{1}{2}(\hat{\theta}_n - \theta)$ is asymptotically normal with mean zero and covariance $\mathscr{J}^-(\theta + X, Y)$.
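The Pitman estimator described above is the posterior mean of $\theta$ under a flat prior on $R^d$. Below is a minimal numerical sketch for the case $d = 1$ with no auxiliary component $Y$; the function name, grid width, and grid resolution are illustrative assumptions, not part of the paper, and the grid-based integration is just one way to approximate the defining ratio of integrals $\hat{\theta}_n = \int \theta \prod_i f(x_i - \theta)\, d\theta \big/ \int \prod_i f(x_i - \theta)\, d\theta$.

```python
import numpy as np

def pitman_estimator(sample, density, grid_half_width=10.0, grid_points=4001):
    """Approximate the one-dimensional Pitman estimator of a location
    parameter by numerical integration over a grid of candidate thetas.

    `density` is the known density f of the noise X, so the likelihood of
    a candidate theta is prod_i f(x_i - theta).
    """
    sample = np.asarray(sample, dtype=float)
    center = np.median(sample)
    thetas = np.linspace(center - grid_half_width,
                         center + grid_half_width, grid_points)
    # Log-likelihood of each candidate theta (rows) against the sample
    # (columns), summed over observations.
    log_lik = np.sum(np.log(density(sample[None, :] - thetas[:, None])),
                     axis=1)
    log_lik -= log_lik.max()          # stabilize before exponentiating
    weights = np.exp(log_lik)
    # Posterior mean under a flat prior: the Pitman estimator.
    return np.sum(thetas * weights) / np.sum(weights)
```

For standard normal noise the Pitman estimator coincides with the sample mean, which gives a simple numerical check of the approximation; for other known noise densities the same routine applies with a different `density` argument.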




Published: March, 1974
First available in Project Euclid: 12 April 2007

zbMATH: 0297.62016
MathSciNet: MR362665
Digital Object Identifier: 10.1214/aos/1176342660

Primary: 62F10
Secondary: 62F20

Keywords: asymptotic normality, Cramér-Rao inequality, Fisher information, location parameter, Pitman estimator

Rights: Copyright © 1974 Institute of Mathematical Statistics

