Open Access
On Kullback-Leibler Loss and Density Estimation
Peter Hall
Ann. Statist. 15(4): 1491-1519 (December, 1987). DOI: 10.1214/aos/1176350606

Abstract

"Discrimination information," or Kullback-Leibler loss, is an appropriate measure of distance in problems of discrimination. We examine it in the context of nonparametric kernel density estimation and show that its asymptotic properties are profoundly influenced by tail properties of the kernel and of the unknown density. We suggest ways of choosing the kernel so as to reduce loss, and describe the extent to which likelihood cross-validation asymptotically minimises loss. Likelihood cross-validation generally leads to selection of a window width of the correct order of magnitude, but not necessarily to a window with the correct first-order properties. However, if the kernel is chosen appropriately, then likelihood cross-validation does result in asymptotic minimisation of Kullback-Leibler loss.
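The likelihood cross-validation criterion discussed in the abstract chooses the window width (bandwidth) maximising the leave-one-out log-likelihood of the kernel estimate. A minimal sketch of this procedure for a Gaussian kernel, using a simulated sample and a simple grid search (the sample size, grid, and variable names are illustrative, not from the paper):

```python
import numpy as np

def loo_log_likelihood(x, h):
    """Leave-one-out log-likelihood of a Gaussian-kernel density
    estimate with bandwidth h, evaluated at the sample points x."""
    n = len(x)
    # Pairwise kernel evaluations K((x_i - x_j)/h) / h
    d = (x[:, None] - x[None, :]) / h
    k = np.exp(-0.5 * d**2) / (np.sqrt(2 * np.pi) * h)
    np.fill_diagonal(k, 0.0)          # leave x_i out of its own estimate
    f_loo = k.sum(axis=1) / (n - 1)   # leave-one-out density at each x_i
    return np.log(f_loo).sum()

rng = np.random.default_rng(0)
x = rng.standard_normal(200)

# Grid search for the bandwidth maximising the leave-one-out likelihood
hs = np.linspace(0.05, 1.5, 60)
scores = np.array([loo_log_likelihood(x, h) for h in hs])
h_cv = hs[int(np.argmax(scores))]
```

The paper's point is that this criterion's behaviour under Kullback-Leibler loss depends delicately on the tails of the kernel and of the true density; the sketch above only shows the mechanics of the selection rule, not those asymptotics.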

Citation


Peter Hall. "On Kullback-Leibler Loss and Density Estimation." Ann. Statist. 15 (4) 1491 - 1519, December, 1987. https://doi.org/10.1214/aos/1176350606

Information

Published: December, 1987
First available in Project Euclid: 12 April 2007

zbMATH: 0678.62045
MathSciNet: MR913570
Digital Object Identifier: 10.1214/aos/1176350606

Subjects:
Primary: 62G99
Secondary: 62H99

Keywords: Density estimation, discrimination, kernel method, Kullback-Leibler loss, likelihood cross-validation

Rights: Copyright © 1987 Institute of Mathematical Statistics
