Open Access
Certain Inequalities in Information Theory and the Cramer-Rao Inequality
S. Kullback
Ann. Math. Statist. 25(4): 745-751 (December, 1954). DOI: 10.1214/aoms/1177728660

Abstract

The Cramer-Rao inequality provides, under certain regularity conditions, a lower bound for the variance of an estimator [7], [15]. Various generalizations, extensions, and improvements in the bound have been made by Barankin [1], [2], Bhattacharyya [3], Chapman and Robbins [5], Fraser and Guttman [11], Kiefer [12], and Wolfowitz [16], among others. Further consideration of certain inequality properties of a measure of information, discussed by Kullback and Leibler [14], yields a greater lower bound for the information measure (formula (4.11)) and leads to a result which may be considered a generalization of the Cramer-Rao inequality, the latter following as a special case. The results are used to define discrimination efficiency and estimation efficiency at a point in parameter space.
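For context, here is a minimal statement of the two classical inequalities the abstract builds on, written in standard modern notation (the densities f_1, f_2, the dominating measure \lambda, and the bias function b(\theta) are notational choices, not the paper's; the sharper bound of formula (4.11) is given in the paper itself and is not reproduced here). The information measure of Kullback and Leibler [14] satisfies

\[
I(1:2) \;=\; \int f_1(x)\,\log\frac{f_1(x)}{f_2(x)}\,d\lambda(x) \;\ge\; 0 ,
\]

with equality if and only if f_1 = f_2 almost everywhere (\lambda). Under the usual regularity conditions, the Cramer-Rao inequality states that an estimator \hat\theta of \theta with bias b(\theta) satisfies

\[
\operatorname{Var}_\theta\big(\hat\theta\big) \;\ge\;
\frac{\big(1 + b'(\theta)\big)^2}
{\mathbb{E}_\theta\!\left[\left(\dfrac{\partial \log f(x;\theta)}{\partial \theta}\right)^{\!2}\right]} ,
\]

where the denominator is the Fisher information. The paper replaces the trivial bound I(1:2) \ge 0 with a strictly greater lower bound and recovers the variance inequality above as a special case.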

Citation

S. Kullback. "Certain Inequalities in Information Theory and the Cramer-Rao Inequality." Ann. Math. Statist. 25(4): 745-751, December, 1954. https://doi.org/10.1214/aoms/1177728660

Information

Published: December, 1954
First available in Project Euclid: 28 April 2007

zbMATH: 0057.35402
MathSciNet: MR65856
Digital Object Identifier: 10.1214/aoms/1177728660

Rights: Copyright © 1954 Institute of Mathematical Statistics
