The Annals of Mathematical Statistics

Epsilon Entropy and Data Compression

Edward C. Posner and Eugene R. Rodemich


Abstract

This article studies efficient data transmission, or "data compression", from the standpoint of the theory of epsilon entropy. The notion of the epsilon entropy of a "data source" is defined. This quantity gives a precise measure of the amount of channel capacity necessary to describe a data source to within a given fidelity, epsilon, with probability one, when each separate "experiment" must be transmitted without storage from experiment to experiment. We also define the absolute epsilon entropy of a source, which is the amount of capacity needed when storage of experiments is allowed before transmission. The absolute epsilon entropy is shown to be equal to Shannon's rate distortion function evaluated at zero distortion, when suitable identifications are made. The main result is that the absolute epsilon entropy and the epsilon entropy have a ratio close to one if either is large. Thus, very little can be saved by storing the results of independent experiments before transmission.
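The abstract's central quantities can be written compactly. The following LaTeX sketch uses one common formalization; the symbols H_epsilon, I_epsilon, R(D), the metric rho, and the partition-diameter convention are assumptions for illustration, not fixed by this page:

% Epsilon entropy: the cheapest noiseless description of X to within
% fidelity epsilon, one experiment at a time (no storage between
% experiments). One common formalization takes the infimum of partition
% entropies over measurable partitions into sets of diameter at most
% epsilon (conventions such as strict vs. weak inequality vary).
\[
  H_\epsilon(X) \;=\;
  \inf_{\substack{\{A_i\}\ \text{a measurable partition} \\ \operatorname{diam} A_i \,\le\, \epsilon}}
  \Bigl( -\sum_i \Pr[X \in A_i]\,\log \Pr[X \in A_i] \Bigr).
\]

% Absolute epsilon entropy: storage of experiments before transmission is
% allowed, so the operative quantity is the smallest mutual information
% over reproductions Y staying within epsilon of X with probability one.
\[
  I_\epsilon(X) \;=\;
  \inf\bigl\{\, I(X;Y) \;:\; \rho(X,Y) \le \epsilon \ \text{a.s.} \,\bigr\}.
\]

% With the distortion measure d(x,y) = 0 when rho(x,y) <= epsilon and
% d(x,y) = +infinity otherwise, this infimum is Shannon's rate
% distortion function evaluated at zero distortion:
\[
  I_\epsilon(X) \;=\; R(0).
\]

On this reading, the main theorem says that the one-shot cost H_epsilon and the with-storage cost I_epsilon are nearly equal whenever either is large, which is the precise sense in which "very little can be saved" by storing independent experiments before transmission.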

Article information

Source
Ann. Math. Statist., Volume 42, Number 6 (1971), 2079-2125.

Dates
First available in Project Euclid: 27 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aoms/1177693077

Digital Object Identifier
doi:10.1214/aoms/1177693077

Mathematical Reviews number (MathSciNet)
MR297458

Zentralblatt MATH identifier
0232.94007


Citation

Posner, Edward C.; Rodemich, Eugene R. Epsilon Entropy and Data Compression. Ann. Math. Statist. 42 (1971), no. 6, 2079--2125. doi:10.1214/aoms/1177693077. https://projecteuclid.org/euclid.aoms/1177693077

