Bernoulli


Relative log-concavity and a pair of triangle inequalities

Yaming Yu

Full-text: Open access

Abstract

The relative log-concavity ordering ≤lc between probability mass functions (pmf’s) on the non-negative integers is studied. Given three pmf’s f, g, h that satisfy f ≤lc g ≤lc h, we present a pair of (reverse) triangle inequalities: if ∑_i i f_i = ∑_i i g_i < ∞, then

D(f|h) ≥ D(f|g) + D(g|h)

and if ∑_i i g_i = ∑_i i h_i < ∞, then

D(h|f) ≥ D(h|g) + D(g|f),

where D(⋅|⋅) denotes the Kullback–Leibler divergence. These inequalities, interesting in themselves, are also applied to several problems, including maximum entropy characterizations of Poisson and binomial distributions and the best binomial approximation in relative entropy. We also present parallel results for continuous distributions and discuss the behavior of ≤lc under convolution.
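The first inequality can be checked numerically. The sketch below is an illustrative example chosen by the editor, not one taken from the paper; it assumes the standard chain Bin(n, λ/n) ≤lc Bin(m, λ/m) ≤lc Poisson(λ) for n ≤ m, so that f = Bin(10, 0.3), g = Bin(20, 0.15) and h = Poisson(3) all have mean 3 and satisfy f ≤lc g ≤lc h:

```python
from math import comb, exp, factorial, log

def binom_pmf(n, p):
    """pmf of Binomial(n, p) on {0, ..., n}."""
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def poisson_pmf(lam, kmax):
    """pmf of Poisson(lam) on {0, ..., kmax} (untruncated values)."""
    return [exp(-lam) * lam**k / factorial(k) for k in range(kmax + 1)]

def kl(f, h):
    """Kullback-Leibler divergence D(f|h) = sum_k f_k log(f_k / h_k), in nats."""
    return sum(fk * log(fk / h[k]) for k, fk in enumerate(f) if fk > 0)

# f <=lc g <=lc h, all with mean 3 (so the equal-means condition holds)
f = binom_pmf(10, 0.30)
g = binom_pmf(20, 0.15)
h = poisson_pmf(3.0, 20)   # covers the supports of f and g

lhs = kl(f, h)
rhs = kl(f, g) + kl(g, h)
print(lhs >= rhs)          # the reverse triangle inequality of the paper
```

Since f and g have finite support contained in {0, …, 20}, each divergence is a finite sum over the support of its first argument, so no truncation of the Poisson tail is needed.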

Article information

Source
Bernoulli, Volume 16, Number 2 (2010), 459–470.

Dates
First available in Project Euclid: 25 May 2010

Permanent link to this document
https://projecteuclid.org/euclid.bj/1274821079

Digital Object Identifier
doi:10.3150/09-BEJ216

Mathematical Reviews number (MathSciNet)
MR2668910

Zentralblatt MATH identifier
1248.60028

Keywords
Bernoulli sum; binomial approximation; Hoeffding’s inequality; maximum entropy; minimum entropy; negative binomial approximation; Poisson approximation; relative entropy

Citation

Yu, Yaming. Relative log-concavity and a pair of triangle inequalities. Bernoulli 16 (2010), no. 2, 459–470. doi:10.3150/09-BEJ216. https://projecteuclid.org/euclid.bj/1274821079

