The Annals of Statistics
From ɛ-entropy to KL-entropy: Analysis of minimum information complexity density estimation
Abstract
We consider an extension of ɛ-entropy to a KL-divergence-based complexity measure for randomized density estimation methods. Based on this extension, we develop a general information-theoretical inequality that measures the statistical complexity of some deterministic and randomized density estimators. We present consequences of the new inequality; in particular, we show that this technique can lead to improvements of some classical results concerning the convergence of minimum description length and Bayesian posterior distributions. Moreover, we derive clean finite-sample convergence bounds that are not obtainable using previous approaches.
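For orientation, here is a minimal sketch of the objects named above, in illustrative notation rather than the paper's own statement. The KL divergence between densities p and q is

\[
D_{\mathrm{KL}}(p \,\|\, q) \;=\; \int p(x)\,\ln\frac{p(x)}{q(x)}\,dx ,
\]

and a minimum-information-complexity estimator selects a randomized estimator, that is, a posterior \rho over candidate densities p_\theta relative to a prior \pi, by minimizing a KL-penalized empirical log-loss of the Gibbs form

\[
\hat{\pi} \;=\; \operatorname*{arg\,min}_{\rho}\;\Bigl[\, -\frac{1}{n}\sum_{i=1}^{n}\mathbb{E}_{\theta\sim\rho}\,\ln p_{\theta}(X_i) \;+\; \frac{\lambda}{n}\,D_{\mathrm{KL}}(\rho \,\|\, \pi)\Bigr],
\]

where \lambda > 0 is a regularization parameter and X_1, \dots, X_n is the sample. Read this way, the penalty D_{\mathrm{KL}}(\rho\|\pi) takes over the role that ɛ-entropy (the metric entropy of the model class) plays in classical analyses, which is the sense of the move "from ɛ-entropy to KL-entropy" in the title.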
Article information
Source
Ann. Statist. Volume 34, Number 5 (2006), 2180-2210.
Dates
First available in Project Euclid: 23 January 2007
Permanent link to this document
http://projecteuclid.org/euclid.aos/1169571794
Digital Object Identifier
doi:10.1214/009053606000000704
Mathematical Reviews number (MathSciNet)
MR2291497
Zentralblatt MATH identifier
1106.62005
Subjects
Primary: 62C10 (Bayesian problems; characterization of Bayes procedures); 62G07 (Density estimation)
Keywords
Bayesian posterior distribution; minimum description length; density estimation
Citation
Zhang, Tong. From ɛ-entropy to KL-entropy: Analysis of minimum information complexity density estimation. Ann. Statist. 34 (2006), no. 5, 2180-2210. doi:10.1214/009053606000000704. http://projecteuclid.org/euclid.aos/1169571794.