Abstract
If $\{Y_n\}$ is a stationary ergodic Markov process taking values in a finite set $\{1, 2, \cdots, A\}$, then its entropy can be calculated directly. If $\phi$ is a function defined on $\{1, 2, \cdots, A\}$ with values in $\{1, 2, \cdots, D\}$, no comparable formula is available for the entropy of the process $\{X_n = \phi(Y_n)\}$. However, the entropy of this functional process can be approximated by the monotonic sequences of conditional entropies $\bar{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1)$ and $\underline{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1, Y_0)$. Furthermore, if the underlying Markov process $\{Y_n\}$ has strictly positive transition probabilities, these two approximations converge exponentially to the entropy $H$, with $0 \leqq \bar{G}_n - H \leqq B\rho^{n-1}$ and $0 \leqq H - \underline{G}_n \leqq B\rho^{n-1}$, where $0 < \rho < 1$ and $\rho$ is independent of the function $\phi$.
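The two bounds can be evaluated numerically for a small chain. The sketch below is not from the paper: it computes $\bar{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1)$ and $\underline{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1, Y_0)$ by brute-force enumeration of paths of the underlying chain started from its stationary distribution. The transition matrix `P`, the map `phi`, the range of `n`, and the helper names (`birch_bounds`, `stationary`) are illustrative assumptions.

```python
# A minimal numerical sketch (not from the paper): it brackets the entropy H of
# X_n = phi(Y_n) with Birch's bounds
#   bar_G_n   = H(X_n | X_{n-1}, ..., X_1)        (upper bound, nonincreasing)
#   under_G_n = H(X_n | X_{n-1}, ..., X_1, Y_0)   (lower bound, nondecreasing)
# via brute-force enumeration of Y-paths; P, phi, and n are illustrative choices.
import itertools
from collections import defaultdict

import numpy as np

def stationary(P):
    """Stationary distribution pi satisfying pi @ P = pi, sum(pi) = 1."""
    A = P.shape[0]
    M = np.vstack([P.T - np.eye(A), np.ones(A)])
    b = np.zeros(A + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(M, b, rcond=None)
    return pi

def entropy(dist):
    """Shannon entropy (in nats) of a dict mapping outcomes to probabilities."""
    return -sum(p * np.log(p) for p in dist.values() if p > 0)

def birch_bounds(P, phi, n):
    """Return (bar_G_n, under_G_n) for the functional process X_k = phi(Y_k)."""
    A = P.shape[0]
    pi = stationary(P)
    # Joint laws of (X_1..X_n) and of (Y_0, X_1..X_n), summing over Y-paths.
    px, pxy0 = defaultdict(float), defaultdict(float)
    for path in itertools.product(range(A), repeat=n + 1):   # (y_0, ..., y_n)
        prob = pi[path[0]] * np.prod([P[path[i - 1], path[i]] for i in range(1, n + 1)])
        xs = tuple(phi[y] for y in path[1:])
        px[xs] += prob
        pxy0[(path[0],) + xs] += prob
    # Marginals over (X_1..X_{n-1}) and (Y_0, X_1..X_{n-1}).
    px_m, pxy0_m = defaultdict(float), defaultdict(float)
    for xs, p in px.items():
        px_m[xs[:-1]] += p
    for key, p in pxy0.items():
        pxy0_m[key[:-1]] += p
    bar_G = entropy(px) - entropy(px_m)          # H(X_n | X_{n-1},...,X_1)
    under_G = entropy(pxy0) - entropy(pxy0_m)    # H(X_n | X_{n-1},...,X_1, Y_0)
    return bar_G, under_G

if __name__ == "__main__":
    # Example: A = 3 states, strictly positive transitions, D = 2 output symbols.
    P = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.3, 0.3, 0.4]])
    phi = [0, 0, 1]            # states 0 and 1 map to symbol 0, state 2 to symbol 1
    for n in range(1, 7):
        bar_G, under_G = birch_bounds(P, phi, n)
        print(f"n={n}: lower={under_G:.6f}  upper={bar_G:.6f}")
```

Because this example's transition matrix is strictly positive, the printed lower and upper values should close on a common limit, consistent with the exponential rate described in the abstract; the enumeration cost grows as $A^{n+1}$, so this sketch is only practical for small $A$ and $n$.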
Citation
John J. Birch. "Approximations for the Entropy for Functions of Markov Chains." Ann. Math. Statist. 33(3): 930-938, September 1962. https://doi.org/10.1214/aoms/1177704462