Open Access
Approximations for the Entropy for Functions of Markov Chains
John J. Birch
Ann. Math. Statist. 33(3): 930-938 (September, 1962). DOI: 10.1214/aoms/1177704462


If $\{Y_n\}$ is a stationary ergodic Markov process taking values in a finite set $\{1, 2, \cdots, A\}$, then its entropy can be calculated directly. If $\phi$ is a function defined on $1, 2, \cdots, A$, with values $1, 2, \cdots, D$, no comparable formula is available for the entropy of the process $\{X_n = \phi(Y_n)\}$. However, the entropy of this functional process can be approximated by the monotone sequences of conditional entropies $\bar{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1)$, which decreases to the entropy $H$, and $\underline{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1, Y_0)$, which increases to $H$. Furthermore, if the underlying Markov process $\{Y_n\}$ has strictly positive transition probabilities, these two approximations converge exponentially to $H$, with the rates of convergence given by $0 \leqq \bar{G}_n - H \leqq B\rho^{n-1}$ and $0 \leqq H - \underline{G}_n \leqq B\rho^{n-1}$, where $0 < \rho < 1$ and $\rho$ is independent of the function $\phi$.
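For small chains the two bounds in the abstract can be computed exactly by enumerating all underlying sequences. The Python sketch below does this for an illustrative example; the transition matrix `P`, the lumping map `phi`, and the horizons `n` are assumptions chosen for demonstration, not taken from the paper, and the lower bound conditions on the first underlying state (standing in for the paper's $Y_0$).

```python
# Sketch: exact computation of Birch's bounds for the entropy rate of a
# function of a Markov chain, by brute-force enumeration.
#   upper_n = H(X_n | X_{n-1},...,X_1)        -- decreases to H
#   lower_n = H(X_n | X_{n-1},...,X_1, Y_1)   -- increases to H
# The chain P, map phi, and horizons below are illustrative assumptions.
from itertools import product
from math import log2
from collections import defaultdict

def entropy(dist):
    """Shannon entropy (bits) of a dict {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def birch_bounds(P, pi, phi, n):
    """Return (upper, lower) bounds on H for the process X_k = phi(Y_k)."""
    A = len(P)
    jx = defaultdict(float)   # joint distribution of (X_1, ..., X_n)
    jxy = defaultdict(float)  # joint distribution of (Y_1, X_2, ..., X_n)
    for ys in product(range(A), repeat=n):
        p = pi[ys[0]]
        for a, b in zip(ys, ys[1:]):
            p *= P[a][b]
        xs = tuple(phi(y) for y in ys)
        jx[xs] += p
        # X_1 is a function of Y_1, so (Y_1, X_2..X_n) carries (X_1..X_n, Y_1)
        jxy[(ys[0],) + xs[1:]] += p

    def drop_last(d):  # marginalize out the final coordinate
        m = defaultdict(float)
        for k, p in d.items():
            m[k[:-1]] += p
        return m

    upper = entropy(jx) - entropy(drop_last(jx))    # H(X_n | X_{n-1..1})
    lower = entropy(jxy) - entropy(drop_last(jxy))  # H(X_n | X_{n-1..1}, Y_1)
    return upper, lower

# Example: 3-state chain with strictly positive transitions, lumped to 2 symbols.
P = [[0.6, 0.2, 0.2],
     [0.3, 0.4, 0.3],
     [0.2, 0.3, 0.5]]
pi = [21/55, 16/55, 18/55]         # stationary distribution: pi P = pi
phi = lambda y: 0 if y < 2 else 1  # merge states 0 and 1

for n in (2, 4, 6):
    up, lo = birch_bounds(P, pi, phi, n)
    print(n, round(lo, 6), round(up, 6))
```

Because the transition probabilities are strictly positive, the printed gap `up - lo` shrinks geometrically in `n`, squeezing the entropy $H$ as the abstract states.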




Published: September, 1962
First available in Project Euclid: 27 April 2007

zbMATH: 0109.36301
MathSciNet: MR141162
Digital Object Identifier: 10.1214/aoms/1177704462

Rights: Copyright © 1962 Institute of Mathematical Statistics
