The problem of obtaining a satisfactory measure of association between two random variables is closely allied to that of obtaining a measure of the amount of information about one contained in the other: the more closely associated the random variables are, the more information about one ought to be given by an observation on the other, and vice versa. It is not surprising, therefore, that there have been several suggestions for basing coefficients of association on the now celebrated measure of information introduced by Shannon in the context of communication theory. (See Bell for certain of these and for references to others.) Shannon's measure of information is based on the notion of entropy, which seems to be much more meaningful for finite probability spaces than for infinite ones; and while Gel'fand and Yaglom have suggested a generalisation of Shannon's measure for infinite spaces, there remain difficulties, as indicated by Bell, about deriving from it coefficients of association or dependence between random variables taking infinite sets of values.

In the present paper, by adopting an attitude to information slightly different from that of communication theory, we shall obtain a general measure of information which yields a fairly natural coefficient of dependence between two continuous random variables or, more generally, between two non-atomic measures. The next section provides the motivation for the introduction of this measure of information, and a general definition is given in Section 3. In Section 4 we discuss some of the properties of this measure regarded as a coefficient of association, along the lines suggested by Rényi. Finally, in Section 5, we indicate the relevance of this measure to estimation theory.
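For orientation, the generalisation referred to may be stated in its standard form (this is the usual textbook formulation, not a quotation from the works cited): for jointly continuous random variables $X$ and $Y$ with joint density $p(x, y)$ and marginal densities $p_1(x)$ and $p_2(y)$, the Gel'fand–Yaglom form of Shannon's information is

```latex
I(X; Y) = \iint p(x, y) \,
    \log \frac{p(x, y)}{p_1(x)\, p_2(y)} \, dx \, dy .
```

One source of the difficulties mentioned is that $I(X; Y)$ need not be finite; for instance, when $Y = X$ the integral diverges in the continuous case, so a coefficient of dependence normalised to lie between 0 and 1 cannot be obtained from $I$ in any immediate way.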
"On a Measure of Association." Ann. Math. Statist. 35 (3) 1157 - 1166, September, 1964. https://doi.org/10.1214/aoms/1177703273