The Annals of Statistics

Information Inequality Bounds on the Minimax Risk (with an Application to Nonparametric Regression)

Lawrence D. Brown and Mark G. Low

Full-text: Open access

Abstract

This paper compares three methods for producing lower bounds on the minimax risk under quadratic loss. The first uses the information-inequality bounds of Brown and Gajek. The second method also uses the information inequality and yields bounds that are always at least as good as those from the first method. The third is the hardest-linear-family method described by Donoho and Liu. These methods are applied in four examples, the last of which relates to a frequently considered problem in nonparametric regression.
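The information inequality underlying the first two methods is the classical Cramér-Rao bound: for unbiased estimation of a normal mean from n observations, the quadratic risk of any estimator is at least the reciprocal of the Fisher information. The sketch below is only a minimal numerical illustration of that baseline bound, not the paper's construction; the sample size, parameter value, and simulation settings are arbitrary choices.

```python
import numpy as np

# Cramer-Rao illustration for X_1, ..., X_n i.i.d. N(theta, 1):
# the Fisher information is I(theta) = n, so every unbiased estimator
# has quadratic risk at least 1/n.  The sample mean attains this bound.
rng = np.random.default_rng(0)
n, theta, reps = 25, 0.3, 200_000          # arbitrary illustrative settings

x = rng.normal(theta, 1.0, size=(reps, n))
risk = np.mean((x.mean(axis=1) - theta) ** 2)  # Monte Carlo quadratic risk
bound = 1.0 / n                                # information-inequality lower bound

print(risk, bound)  # the simulated risk should be close to the bound
```

For biased estimators (the case relevant to minimax lower bounds over a restricted parameter set, such as a bounded normal mean), the bound acquires bias and bias-derivative terms, and sharpening it is exactly where methods like those of Brown and Gajek come in.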

Article information

Source
Ann. Statist., Volume 19, Number 1 (1991), 329-337.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176347985

Digital Object Identifier
doi:10.1214/aos/1176347985

Mathematical Reviews number (MathSciNet)
MR1091854

Zentralblatt MATH identifier
0736.62019


Subjects
Primary: 62F10: Point estimation
Secondary: 62F15: Bayesian inference; 62C99: None of the above, but in this section; 60E15: Inequalities; stochastic orderings

Keywords
Information inequality (Cramér-Rao inequality); minimax risk; density estimation; nonparametric regression; estimating a bounded normal mean

Citation

Brown, Lawrence D.; Low, Mark G. Information Inequality Bounds on the Minimax Risk (with an Application to Nonparametric Regression). Ann. Statist. 19 (1991), no. 1, 329--337. doi:10.1214/aos/1176347985. https://projecteuclid.org/euclid.aos/1176347985
