The Annals of Mathematical Statistics

Certain Inequalities in Information Theory and the Cramer-Rao Inequality

S. Kullback

Abstract

The Cramer-Rao inequality provides, under certain regularity conditions, a lower bound for the variance of an estimator [7], [15]. Various generalizations, extensions, and improvements of the bound have been made by Barankin [1], [2], Bhattacharyya [3], Chapman and Robbins [5], Fraser and Guttman [11], Kiefer [12], and Wolfowitz [16], among others. Further consideration of certain inequality properties of a measure of information, discussed by Kullback and Leibler [14], yields a greater lower bound for the information measure (formula (4.11)) and leads to a result which may be considered a generalization of the Cramer-Rao inequality, the latter following as a special case. The results are used to define discrimination efficiency and estimation efficiency at a point in parameter space.

Article information

Source
Ann. Math. Statist. Volume 25, Number 4 (1954), 745-751.

Dates
First available in Project Euclid: 28 April 2007

Permanent link to this document
http://projecteuclid.org/euclid.aoms/1177728660

Digital Object Identifier
doi:10.1214/aoms/1177728660

Mathematical Reviews number (MathSciNet)
MR65856

Zentralblatt MATH identifier
0057.35402


Citation

Kullback, S. Certain Inequalities in Information Theory and the Cramer-Rao Inequality. Ann. Math. Statist. 25 (1954), no. 4, 745-751. doi:10.1214/aoms/1177728660. http://projecteuclid.org/euclid.aoms/1177728660.
