Open Access
May, 1985
Mutual Dependence of Random Variables and Maximum Discretized Entropy
Carlo Bertoluzza, Bruno Forte
Ann. Probab. 13(2): 630-637 (May, 1985). DOI: 10.1214/aop/1176993016

Abstract

In connection with a random vector $(X, Y)$ in the unit square $Q$ and a pair $(m, n)$ of positive integers, we consider all discretizations of the continuous probability distribution of $(X, Y)$ obtained by an $m \times n$ Cartesian decomposition of $Q$. We prove that $Y$ is a (continuous and invertible) function of $X$ if and only if, for each $m, n$, the maximum entropy of these finite distributions equals $\log(m + n - 1)$.
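
As a concrete illustration of the "if" direction (not taken from the paper), take $X$ uniform on $[0, 1]$ and $Y = X$: the diagonal $y = x$ meets at most $m + n - 1$ cells of any $m \times n$ Cartesian grid, and the cut points can be chosen so that each of those cells carries probability $1/(m + n - 1)$. The following Python sketch, with the hypothetical helper name discretized_entropy, checks numerically that this choice attains entropy $\log(m + n - 1)$.

import numpy as np

def discretized_entropy(x_cuts, y_cuts):
    """Entropy (natural log) of the discretization of (X, X), X ~ U[0, 1],
    induced by the grid with interior cut points x_cuts (columns) and
    y_cuts (rows).  The identity below holds for any base of the logarithm."""
    x_edges = np.concatenate(([0.0], np.sort(x_cuts), [1.0]))
    y_edges = np.concatenate(([0.0], np.sort(y_cuts), [1.0]))
    probs = []
    for i in range(len(x_edges) - 1):
        for j in range(len(y_edges) - 1):
            # Mass of cell (i, j) = length of the overlap of the two intervals,
            # since all probability mass sits on the diagonal y = x.
            lo = max(x_edges[i], y_edges[j])
            hi = min(x_edges[i + 1], y_edges[j + 1])
            if hi > lo:
                probs.append(hi - lo)
    probs = np.array(probs)
    return -np.sum(probs * np.log(probs))

m, n = 4, 6
# Split the m + n - 2 equally spaced interior points between the two axes:
# any assignment of m - 1 of them to X and the remaining n - 1 to Y works,
# producing m + n - 1 nonempty cells of equal probability 1/(m + n - 1).
points = np.arange(1, m + n - 1) / (m + n - 1)
x_cuts, y_cuts = points[: m - 1], points[m - 1 :]

print(discretized_entropy(x_cuts, y_cuts))  # ~ 2.1972, i.e. log(m + n - 1)
print(np.log(m + n - 1))

The paper's theorem is the stronger converse: if this maximum is attained for every $m, n$, then $Y$ must be a continuous, invertible function of $X$.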

Citation


Carlo Bertoluzza, Bruno Forte. "Mutual Dependence of Random Variables and Maximum Discretized Entropy." Ann. Probab. 13(2): 630-637, May, 1985. https://doi.org/10.1214/aop/1176993016

Information

Published: May, 1985
First available in Project Euclid: 19 April 2007

zbMATH: 0563.60023
MathSciNet: MR781430
Digital Object Identifier: 10.1214/aop/1176993016

Subjects:
Primary: 60E99
Secondary: 62B10

Keywords: 62-07, discretization, entropy, mutual dependence

Rights: Copyright © 1985 Institute of Mathematical Statistics
