Open Access
Epsilon Entropy and Data Compression
Edward C. Posner, Eugene R. Rodemich
Ann. Math. Statist. 42(6): 2079-2125 (December, 1971). DOI: 10.1214/aoms/1177693077

Abstract

This article studies efficient data transmission, or "data compression", from the standpoint of the theory of epsilon entropy. The notion of the entropy of a "data source" is defined. This quantity gives a precise measure of the amount of channel capacity necessary to describe a data source to within a given fidelity, epsilon, with probability one, when each separate "experiment" must be transmitted without storage from experiment to experiment. We also define the absolute epsilon entropy of a source, which is the amount of capacity needed when storage of experiments is allowed before transmission. The absolute epsilon entropy is shown to be equal to Shannon's rate distortion function evaluated for zero distortion, when suitable identifications are made. The main result is that the absolute epsilon entropy and the epsilon entropy have ratio close to one if either is large. Thus, very little can be saved by storing the results of independent experiments before transmission.
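The quantities named in the abstract can be summarized, in notation standard in the epsilon-entropy literature, roughly as follows. The symbols H_epsilon and I_epsilon, the metric-space setting, and the particular distortion measure are assumptions made here for illustration; the paper's own identifications may differ in detail.

% Illustrative sketch (standard forms; not the paper's exact statement).
% One "experiment" is modeled as a random outcome X in a metric probability
% space (X, \rho, \mu).
%
% Epsilon entropy (no storage from experiment to experiment): each outcome must
% be described to within fidelity \epsilon with probability one, which amounts
% to choosing an \epsilon-partition and reporting the cell containing X:
\[
  H_{\epsilon} \;=\; \inf_{\mathcal{P}} H(\mathcal{P}),
\]
% where the infimum is over measurable partitions \mathcal{P} of X into sets of
% diameter at most \epsilon, and H(\mathcal{P}) is the Shannon entropy of the
% partition.
%
% Absolute epsilon entropy (storage of experiments allowed before transmission):
\[
  I_{\epsilon} \;=\; \inf\bigl\{\, I(X;Y) \;:\; \rho(X,Y) \le \epsilon
  \text{ with probability one} \,\bigr\},
\]
% which coincides with Shannon's rate-distortion function R(D) evaluated at
% D = 0 for the distortion d(x,y) = 0 if \rho(x,y) \le \epsilon and
% d(x,y) = 1 otherwise.
%
% The main result compares the two quantities:
\[
  I_{\epsilon} \;\le\; H_{\epsilon},
  \qquad
  \frac{H_{\epsilon}}{I_{\epsilon}} \;\to\; 1
  \quad \text{when either quantity is large},
\]
% so, as the abstract states, very little is saved by storing the results of
% independent experiments before transmission.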

Citation


Edward C. Posner, Eugene R. Rodemich. "Epsilon Entropy and Data Compression." Ann. Math. Statist. 42(6): 2079-2125, December, 1971. https://doi.org/10.1214/aoms/1177693077

Information

Published: December, 1971
First available in Project Euclid: 27 April 2007

zbMATH: 0232.94007
MathSciNet: MR297458
Digital Object Identifier: 10.1214/aoms/1177693077

Rights: Copyright © 1971 Institute of Mathematical Statistics
