Open Access
Mutual information, metric entropy and cumulative relative entropy risk
David Haussler, Manfred Opper
Ann. Statist. 25(6): 2451-2492 (December 1997). DOI: 10.1214/aos/1030741081

Abstract

Assume $\{P_{\theta}: \theta \in \Theta\}$ is a set of probability distributions with a common dominating measure on a complete separable metric space Y. A state $\theta^* \in \Theta$ is chosen by Nature. A statistician obtains n independent observations $Y_1, \dots, Y_n$ from Y distributed according to $P_{\theta^*}$. For each time t between 1 and n, based on the observations $Y_1, \dots, Y_{t-1}$, the statistician produces an estimated distribution $\hat{P}_t$ for $P_{\theta^*}$ and suffers a loss $L(P_{\theta^*}, \hat{P}_t)$. The cumulative risk for the statistician is the average total loss up to time n. Of special interest in information theory, data compression, mathematical finance, computational learning theory and statistical mechanics is the special case when the loss $L(P_{\theta^*}, \hat{P}_t)$ is the relative entropy between the true distribution $P_{\theta^*}$ and the estimated distribution $\hat{P}_t$. Here the cumulative Bayes risk from time 1 to n is the mutual information between the random parameter $\Theta^*$ and the observations $Y_1, \dots, Y_n$.
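As a brief sketch of why the cumulative Bayes risk equals the mutual information (standard definitions assumed; the notation below is illustrative and not necessarily the paper's): when the loss is the relative entropy $D(P \,\|\, Q) = \int \log(dP/dQ)\, dP$ and $\hat{P}_t$ is the Bayes predictive distribution $\hat{P}_t(\cdot) = E[P_{\Theta^*}(\cdot) \mid Y_1, \dots, Y_{t-1}]$, the chain rule for mutual information gives

$$\sum_{t=1}^{n} E\big[D(P_{\Theta^*} \,\|\, \hat{P}_t)\big] \;=\; \sum_{t=1}^{n} I(\Theta^*; Y_t \mid Y_1, \dots, Y_{t-1}) \;=\; I(\Theta^*; Y_1, \dots, Y_n),$$

using that the observations are conditionally i.i.d. given $\Theta^*$.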

New bounds on this mutual information are given in terms of the Laplace transform of the Hellinger distance between pairs of distributions indexed by parameters in $\Theta$. From these, bounds on the cumulative minimax risk are given in terms of the metric entropy of $\Theta$ with respect to the Hellinger distance. The assumptions required for these bounds are very general and do not depend on the choice of the dominating measure. They apply to both finite- and infinite-dimensional $\Theta$. They apply in some cases where Y is infinite dimensional, in some cases where Y is not compact, in some cases where the distributions are not smooth and in some parametric cases where asymptotic normality of the posterior distribution fails.
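For reference (standard definitions; the precise bounds, constants and regularity conditions are developed in the paper itself): the Hellinger distance between two distributions with densities $p$ and $q$ relative to the dominating measure $\mu$ is

$$d_H(P, Q) = \Big( \int \big(\sqrt{p} - \sqrt{q}\,\big)^2 \, d\mu \Big)^{1/2},$$

and the metric entropy of $\Theta$ at scale $\varepsilon$ is $\log \mathcal{N}(\varepsilon)$, where $\mathcal{N}(\varepsilon)$ is the smallest number of Hellinger balls of radius $\varepsilon$ needed to cover $\{P_\theta : \theta \in \Theta\}$. Bounds of the kind described above are, roughly speaking, of the shape

$$I(\Theta^*; Y_1, \dots, Y_n) \;\lesssim\; \inf_{\varepsilon > 0} \big( \log \mathcal{N}(\varepsilon) + c\, n\, \varepsilon^2 \big)$$

for some constant $c$; this is a schematic statement only, not the paper's exact theorem.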

Citation


David Haussler, Manfred Opper. "Mutual information, metric entropy and cumulative relative entropy risk." Ann. Statist. 25(6): 2451-2492, December 1997. https://doi.org/10.1214/aos/1030741081

Information

Published: December 1997
First available in Project Euclid: 30 August 2002

zbMATH: 0920.62007
MathSciNet: MR1604481
Digital Object Identifier: 10.1214/aos/1030741081

Subjects:
Primary: 62G07
Secondary: 62B10, 62C20, 94A29

Keywords: Bayes risk, density estimation, Hellinger distance, Kullback-Leibler distance, metric entropy, minimax risk, mutual information, relative entropy

Rights: Copyright © 1997 Institute of Mathematical Statistics
