The present paper is partly expository and assumes no previous knowledge of information theory or coding theory. It is meant to be the first in a series of papers on coding theory for noisy channels, a series replacing the author's widely circulated report. A few of the results were reported earlier. In Sections 2 and 3 we present certain refinements and generalizations of known methods of Shannon, Fano, Feinstein and Gallager. A discussion of some other methods, such as those of Khintchine, McMillan and Wolfowitz, may be found in the surveys by Kotz and Wolfowitz. Section 4 contains a number of applications; most of this section is devoted to a certain memoryless channel with additive noise. Some of the proofs have been collected in Section 5. Finally, Section 6 describes some new results on the relative entropy $H(\mu_1 \mid \mu_2)$ of one measure $\mu_1$ relative to another measure $\mu_2$, and its relation to the total variation $\|\mu_1 - \mu_2\|$.
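The two quantities named in the last sentence can be illustrated numerically. The sketch below, which assumes finite discrete measures and natural logarithms, computes the relative entropy and the total variation of two probability vectors and checks one standard relation between them, Pinsker's inequality $\|\mu_1 - \mu_2\|^2 \le 2\,H(\mu_1 \mid \mu_2)$; the function names and the example distributions are illustrative, not taken from the paper.

```python
import math

def relative_entropy(mu1, mu2):
    # H(mu1 | mu2) = sum over x of mu1(x) * log(mu1(x) / mu2(x)),
    # with the convention 0 * log 0 = 0 (terms with mu1(x) = 0 are skipped).
    return sum(p * math.log(p / q) for p, q in zip(mu1, mu2) if p > 0)

def total_variation(mu1, mu2):
    # ||mu1 - mu2|| = sum over x of |mu1(x) - mu2(x)|  (L1 distance).
    return sum(abs(p - q) for p, q in zip(mu1, mu2))

# Two illustrative probability vectors on a three-point space.
mu1 = [0.5, 0.3, 0.2]
mu2 = [0.4, 0.4, 0.2]

H = relative_entropy(mu1, mu2)
tv = total_variation(mu1, mu2)

# Pinsker's inequality: ||mu1 - mu2||^2 <= 2 * H(mu1 | mu2).
assert tv ** 2 <= 2 * H
```

Note that $H(\mu_1 \mid \mu_2)$ requires $\mu_1$ to be absolutely continuous with respect to $\mu_2$; if some $\mu_2(x) = 0$ while $\mu_1(x) > 0$, the relative entropy is $+\infty$.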
J. H. B. Kemperman, "On the Optimum Rate of Transmitting Information," Ann. Math. Statist. 40 (6), 2156–2177, December 1969. https://doi.org/10.1214/aoms/1177697293