On Minimax Estimation in the Presence of Side Information About Remote Data
R. Ahlswede, M. V. Burnashev
Ann. Statist. 18(1): 141-171 (March, 1990). DOI: 10.1214/aos/1176347496


We analyze the following model: one person, called the "helper," observes an outcome $x^n = (x_1, \cdots, x_n) \in \mathscr{X}^n$ of the sequence $X^n = (X_1, \cdots, X_n)$ of i.i.d. RV's, and the statistician gets a sample $y^n = (y_1, \cdots, y_n)$ of the sequence $Y^n(\theta, x^n)$ of RV's with density $\prod^n_{t = 1} f(y_t \mid \theta, x_t)$. The helper can give some (side) information about $x^n$ to the statistician via an encoding function $s_n: \mathscr{X}^n \rightarrow \mathbb{N}$ with $\operatorname{rate}(s_n) \stackrel{\text{def}}{=} (1/n)\log \#\operatorname{range}(s_n) \leq R$. Based on the knowledge of $s_n(x^n)$ and $y^n$, the statistician tries to estimate $\theta$ by an estimator $\hat{\theta}_n$. For the maximal mean square error $e_n(R) \stackrel{\text{def}}{=} \inf_{\hat{\theta}_n} \inf_{s_n:\, \operatorname{rate}(s_n) \leq R} \sup_{\theta \in \Theta} E_\theta|\hat{\theta}_n - \theta|^2$ we establish a Cramér-Rao type bound and, in the case of a finite $\mathscr{X}$, prove asymptotic achievability of this bound under certain conditions. The proof involves a nonobvious combination of results (some of which are novel) for both coding and estimation.
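As a concrete illustration of the model above, the following minimal simulation sketch uses distributions of our own choosing (not taken from the paper): a binary alphabet $\mathscr{X} = \{0, 1\}$ with $X_t \sim \text{Bernoulli}(1/2)$, and a Gaussian location family $Y_t \mid \theta, x_t \sim N(\theta + x_t, 1)$. At rate $R = 1$ bit per symbol the helper can describe $x^n$ exactly, so the statistician can subtract the nuisance $x_t$ before averaging; with no helper, only the known mean $E[X_t] = 1/2$ can be subtracted, inflating the mean square error.

```python
import random

# Toy instance of the remote-side-information model (illustrative only;
# the specific distributions here are our assumptions, not the paper's):
#   X_t ~ Bernoulli(1/2) on the finite alphabet {0, 1},
#   Y_t | theta, x_t ~ Normal(theta + x_t, 1).

def simulate(theta, n, rng):
    """Draw one sample (x^n, y^n) from the model."""
    x = [rng.randint(0, 1) for _ in range(n)]
    y = [theta + xt + rng.gauss(0.0, 1.0) for xt in x]
    return x, y

def estimate_with_side_info(x, y):
    # Full-rate encoding s_n(x^n) = x^n: remove each x_t, then average.
    return sum(yt - xt for xt, yt in zip(x, y)) / len(y)

def estimate_without_side_info(y):
    # No helper: subtract only the known mean E[X_t] = 1/2.
    return sum(y) / len(y) - 0.5

rng = random.Random(0)
theta, n, trials = 1.0, 200, 2000
mse_with = mse_without = 0.0
for _ in range(trials):
    x, y = simulate(theta, n, rng)
    mse_with += (estimate_with_side_info(x, y) - theta) ** 2
    mse_without += (estimate_without_side_info(y) - theta) ** 2
mse_with /= trials
mse_without /= trials
print(mse_with, mse_without)
```

Under these assumptions the side-information estimator has risk about $1/n$ while the helperless estimator pays an extra $\operatorname{Var}(X_t)/n = 1/(4n)$, so the simulation should show the gap that rate-limited encodings interpolate between.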




Published: March, 1990
First available in Project Euclid: 12 April 2007

zbMATH: 0712.62023
MathSciNet: MR1041389
Digital Object Identifier: 10.1214/aos/1176347496

Primary: 62A99
Secondary: 62F12, 62N99, 94A15

Rights: Copyright © 1990 Institute of Mathematical Statistics

