Abstract
This paper introduces the concept of $\epsilon, \delta$ entropy for "probabilistic metric spaces." The concept arises in the study of efficient data transmission, that is, in data compression. In a case of particular interest, the space is the space of paths of a stochastic process, for example $L_2[0, 1]$ under the probability distribution induced by a mean-continuous process on the unit interval. For any $\epsilon > 0$ and $\delta > 0$, the $\epsilon, \delta$ entropy of any probabilistic metric space is finite. However, when $\delta = 0$, the resulting entropy, called simply the $\epsilon$ entropy of the space, can be infinite. We give a simple condition on the eigenvalues of a process on $L_2[0, 1]$ such that any process satisfying that condition has finite $\epsilon$ entropy for every $\epsilon > 0$. Conversely, for any set of eigenvalues not satisfying the given condition, we produce a mean-continuous process on the unit interval having infinite $\epsilon$ entropy for every $\epsilon > 0$. The condition is merely that $\sum n\sigma_n^2$ be finite, where $\sigma_1^2 \geq \sigma_2^2 \geq \cdots$ are the eigenvalues of the process.
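As an illustration of the eigenvalue condition (not part of the original abstract), suppose the eigenvalues decay like a power law, say $\sigma_n^2 = c\,n^{-p}$ for assumed constants $c > 0$ and $p > 0$. Then
$$\sum_{n=1}^{\infty} n\sigma_n^2 = c\sum_{n=1}^{\infty} n^{1-p},$$
which is finite exactly when $p > 2$. Under this assumed power-law model, eigenvalue decay strictly faster than $n^{-2}$ guarantees finite $\epsilon$ entropy for every mean-continuous process with that spectrum, whereas decay at rate $n^{-2}$ or slower admits some mean-continuous process with infinite $\epsilon$ entropy for every $\epsilon > 0$.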
Citation
Edward C. Posner, Eugene R. Rodemich, Howard Rumsey Jr. "Epsilon Entropy of Stochastic Processes." Ann. Math. Statist. 38 (4): 1000–1020, August 1967. https://doi.org/10.1214/aoms/1177698768