Open Access
On Minimax Estimation of a Sparse Normal Mean Vector
Iain M. Johnstone
Ann. Statist. 22(1): 271-289 (March 1994). DOI: 10.1214/aos/1176325368

Abstract

Mallows has conjectured that among distributions which are Gaussian but for occasional contamination by additive noise, the one having least Fisher information has (two-sided) geometric contamination. A very similar problem arises in estimation of a nonnegative vector parameter in Gaussian white noise when it is known also that most [i.e., $(1 - \varepsilon)$] components are zero. We provide a partial asymptotic expansion of the minimax risk as $\varepsilon \rightarrow 0$. While the conjecture seems unlikely to be exactly true for finite $\varepsilon$, we verify it asymptotically up to the accuracy of the expansion. Numerical work suggests the expansion is accurate for $\varepsilon$ as large as 0.05. The best $l_1$-estimation rule is first- but not second-order minimax. The results bear on an earlier study of maximum entropy estimation and various questions in robustness and function estimation using wavelet bases.
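As a rough numerical illustration of the sparse-means setting above (not the paper's own derivation), the sketch below simulates the sequence model $y_i = \theta_i + z_i$ with a fraction $\varepsilon$ of nonzero nonnegative components, applies soft thresholding (the estimator arising from $l_1$ penalization) at the standard sparsity-adapted threshold $\sqrt{2\log(1/\varepsilon)}$, and compares the empirical per-coordinate risk with the first-order term $2\varepsilon\log(1/\varepsilon)$. The sample size, sparsity level, signal magnitude, and threshold choice are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200_000   # number of coordinates (illustrative)
eps = 0.01    # fraction of nonzero means, i.e., (1 - eps) components are zero
mu_val = 7.0  # magnitude of the nonzero (nonnegative) means (illustrative)

# Sparse nonnegative mean vector: most components are exactly zero.
theta = np.zeros(n)
support = rng.random(n) < eps
theta[support] = mu_val

# Observations in the Gaussian white-noise (sequence) model: y_i = theta_i + z_i.
y = theta + rng.standard_normal(n)

# Soft thresholding, the rule arising from l_1 penalization.
lam = np.sqrt(2 * np.log(1 / eps))  # sparsity-adapted threshold (assumed choice)
theta_hat = np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Empirical per-coordinate risk vs. the first-order minimax term 2*eps*log(1/eps).
risk = np.mean((theta_hat - theta) ** 2)
print(f"empirical risk   : {risk:.5f}")
print(f"2*eps*log(1/eps) : {2 * eps * np.log(1 / eps):.5f}")
```

Consistent with the abstract, the soft-threshold ($l_1$) rule tracks the leading term of the minimax risk but is only first-order minimax, so the two printed values should agree roughly, not exactly, for small $\varepsilon$.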

Citation

Iain M. Johnstone. "On Minimax Estimation of a Sparse Normal Mean Vector." Ann. Statist. 22(1): 271-289, March 1994. https://doi.org/10.1214/aos/1176325368

Information

Published: March 1994
First available in Project Euclid: 11 April 2007

zbMATH: 0816.62007
MathSciNet: MR1272083
Digital Object Identifier: 10.1214/aos/1176325368

Subjects:
Primary: 62C20
Secondary: 62C10, 62G05

Keywords: Fisher information, least favorable prior, minimax decision theory, nearly black object, robustness, white noise model

Rights: Copyright © 1994 Institute of Mathematical Statistics
