The Annals of Statistics

Geometrizing Rates of Convergence, III

David L. Donoho and Richard C. Liu


Abstract

We establish upper and lower bounds on the asymptotic minimax risk in estimating (1) a density at a point when the density is known to be decreasing with a Lipschitz condition; (2) a density at a point when the density satisfies a local second-order smoothness (Sacks-Ylvisaker) condition; and (3) the $k$th derivative of the density at a point, when the density satisfies a local $L_p$ constraint on the $m$th derivative. In (1), (2) and (3) the upper and lower bounds differ asymptotically by less than 18%, 24.3% and 25%, respectively. Our bounds on the asymptotic minimax risk come from a simple formula. Let $\omega(\varepsilon)$ denote the modulus of continuity, with respect to Hellinger distance, of the functional to be estimated; in the previous cases this has the form $\omega(\varepsilon) = A\varepsilon^r(1 + o(1))$ for certain constants $A$ and $r$. Then, in all these cases, the minimax risk is not larger asymptotically than $r^r(1 - r)^{1 - r}\omega^2(n^{-1/2})/4$ and is at best a few percent smaller. The modulus of continuity of the functional, and hence the geometry of the problem, determines the difficulty of estimation. At a technical level, two interesting aspects of our work are (1) derivation of minimax affine estimates of a linear functional in the white noise model with general convex asymmetric a priori class and (2) the use of Le Cam's theory of convergence of experiments to show that the density model is asymptotically just as hard as the white noise model. At a conceptual level, an interesting aspect of our work is the use of the hardest one-dimensional subproblem heuristic. Our method works because in these cases, the difficulty of the hardest one-dimensional subproblem is essentially equal to the difficulty of the full infinite-dimensional problem.
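The risk bound quoted in the abstract is a simple closed-form expression in the modulus constants $A$ and $r$. As a minimal sketch (the values of `A`, `r`, and `n` below are placeholders for illustration, not constants taken from the paper), the upper bound can be evaluated directly:

```python
def minimax_risk_upper_bound(A, r, n):
    """Evaluate r^r * (1-r)^(1-r) * omega(n^(-1/2))^2 / 4,
    using the leading-order modulus omega(eps) = A * eps^r."""
    omega = A * n ** (-r / 2)  # omega(n^{-1/2}) = A * (n^{-1/2})^r
    return (r ** r) * ((1 - r) ** (1 - r)) * omega ** 2 / 4

# Illustrative call only; A and r here are hypothetical choices.
print(minimax_risk_upper_bound(A=1.0, r=0.4, n=10_000))
```

The abstract's claim is that the true asymptotic minimax risk lies between this quantity and a value at most a few percent below it, so the bound pins down the difficulty of estimation up to a small constant factor.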

Article information

Source
Ann. Statist., Volume 19, Number 2 (1991), 668-701.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176348115

Digital Object Identifier
doi:10.1214/aos/1176348115

Mathematical Reviews number (MathSciNet)
MR1105839

Zentralblatt MATH identifier
0754.62029


Subjects
Primary: 62G20: Asymptotic properties
Secondary: 62G05: Estimation; 62F35: Robustness and adaptive procedures

Keywords
White noise model; density estimation; rates of convergence; modulus of continuity; minimax risk; estimating a bounded normal mean; optimal kernels; convergence of experiments; geodesic experiments; Ibragimov-Has'minskii constant

Citation

Donoho, David L.; Liu, Richard C. Geometrizing Rates of Convergence, III. Ann. Statist. 19 (1991), no. 2, 668--701. doi:10.1214/aos/1176348115. https://projecteuclid.org/euclid.aos/1176348115


See also

  • Part II: David L. Donoho, Richard C. Liu. Geometrizing Rates of Convergence, II. Ann. Statist., Volume 19, Number 2 (1991), 633--667.