The Annals of Statistics

On Kullback-Leibler Loss and Density Estimation

Peter Hall

Abstract

"Discrimination information," or Kullback-Leibler loss, is an appropriate measure of distance in problems of discrimination. We examine it in the context of nonparametric kernel density estimation and show that its asymptotic properties are profoundly influenced by tail properties of the kernel and of the unknown density. We suggest ways of choosing the kernel so as to reduce loss, and describe the extent to which likelihood cross-validation asymptotically minimises loss. Likelihood cross-validation generally leads to selection of a window width of the correct order of magnitude, but not necessarily to a window with the correct first-order properties. However, if the kernel is chosen appropriately, then likelihood cross-validation does result in asymptotic minimisation of Kullback-Leibler loss.

Article information

Source
Ann. Statist. Volume 15, Number 4 (1987), 1491-1519.

Dates
First available: 12 April 2007

Permanent link to this document
http://projecteuclid.org/euclid.aos/1176350606


Digital Object Identifier
doi:10.1214/aos/1176350606

Mathematical Reviews number (MathSciNet)
MR913570

Zentralblatt MATH identifier
0678.62045

Subjects
Primary: 62G99: None of the above, but in this section
Secondary: 62H99: None of the above, but in this section

Keywords
Density estimation; discrimination; kernel method; Kullback-Leibler loss; likelihood cross-validation

Citation

Hall, Peter. On Kullback-Leibler Loss and Density Estimation. The Annals of Statistics 15 (1987), no. 4, 1491-1519. doi:10.1214/aos/1176350606. http://projecteuclid.org/euclid.aos/1176350606.
