Abstract
This article studies efficient data transmission, or "data compression", from the standpoint of the theory of epsilon entropy. The notion of the epsilon entropy of a "data source" is defined. This quantity gives a precise measure of the amount of channel capacity necessary to describe a data source to within a given fidelity, epsilon, with probability one, when each separate "experiment" must be transmitted without storage from experiment to experiment. We also define the absolute epsilon entropy of a source, which is the amount of capacity needed when storage of experiments is allowed before transmission. The absolute epsilon entropy is shown to be equal to Shannon's rate distortion function evaluated at zero distortion, when suitable identifications are made. The main result is that the absolute epsilon entropy and the epsilon entropy have a ratio close to one whenever either is large. Thus, very little can be saved by storing the results of independent experiments before transmission.
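To make the two quantities in the abstract concrete, the following is a minimal sketch in standard information-theoretic notation; the symbols, the exact partition convention (diameter at most epsilon), and the indicator distortion measure are assumptions for illustration, not quoted from the paper.

```latex
% Hedged sketch of the quantities named in the abstract.
% Notation and conventions are assumed, not taken verbatim from Posner--Rodemich.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Epsilon entropy: cheapest noiseless description of a source X in a metric
% space (metric rho) to within epsilon, one experiment at a time; here taken
% as the infimal entropy over measurable partitions into small sets.
\[
  H_{\varepsilon}(X) \;=\; \inf_{\mathcal{P}} H(\mathcal{P}),
  \qquad
  \mathcal{P}\ \text{a measurable partition with}\ \operatorname{diam}(A)\le\varepsilon
  \ \text{for all}\ A\in\mathcal{P}.
\]

% Absolute epsilon entropy: capacity needed when experiments may be stored
% before transmission; an infimum of mutual information over joint
% distributions that reproduce X to within epsilon with probability one.
\[
  I_{\varepsilon}(X) \;=\; \inf_{\,p(y\mid x):\ \rho(X,Y)\le\varepsilon\ \text{a.s.}} I(X;Y).
\]

% With the (assumed) indicator distortion measure, this coincides with
% Shannon's rate distortion function evaluated at zero distortion:
\[
  I_{\varepsilon}(X) \;=\; R(0)
  \quad\text{for}\quad
  d(x,y)=\mathbf{1}\{\rho(x,y)>\varepsilon\}.
\]

\end{document}
```

Under these identifications, the abstract's main result says that \(I_{\varepsilon}(X)\) and \(H_{\varepsilon}(X)\) have a ratio close to one when either is large.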
Citation
Edward C. Posner and Eugene R. Rodemich. "Epsilon Entropy and Data Compression." Ann. Math. Statist. 42(6): 2079–2125, December 1971. https://doi.org/10.1214/aoms/1177693077