The Annals of Probability

Concentration of the information in data with log-concave distributions

Sergey Bobkov and Mokshay Madiman


A concentration property of the functional −log f(X) is demonstrated when a random vector X has a log-concave density f on ℝn. This concentration property implies in particular an extension of the Shannon–McMillan–Breiman strong ergodic theorem to the class of discrete-time stochastic processes with log-concave marginals.
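As a rough numerical illustration (not from the paper), the sketch below uses the standard Gaussian on ℝn, the simplest log-concave density, for which −log f(x) = (n/2)log 2π + |x|²/2 and the differential entropy is h = (n/2)log 2πe. The Monte Carlo estimate shows that the deviation of −log f(X) from h stays of order √n as n grows, i.e., the information per coordinate concentrates; the sample size and the normalization by √n are choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def info_content(x):
    # -log f(x) for the standard Gaussian density on R^n:
    # (n/2) log(2*pi) + |x|^2 / 2
    n = x.shape[-1]
    return 0.5 * n * np.log(2 * np.pi) + 0.5 * np.sum(x**2, axis=-1)

for n in (10, 100, 1000):
    x = rng.standard_normal((5000, n))          # 5000 samples of X in R^n
    h = 0.5 * n * np.log(2 * np.pi * np.e)       # differential entropy h(X)
    # standard deviation of (-log f(X) - h), normalized by sqrt(n);
    # for the Gaussian this is 1/sqrt(2) ~ 0.707 for every n
    dev = np.std(info_content(x) - h) / np.sqrt(n)
    print(f"n={n:5d}  sd of (-log f(X) - h)/sqrt(n) ~ {dev:.3f}")
```

The normalized deviation stays near a constant across n, so −log f(X) fluctuates on the scale √n around h(X), which is the O(√n) concentration the abstract describes, specialized to the Gaussian case.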

Article information

Ann. Probab. Volume 39, Number 4 (2011), 1528-1543.

First available in Project Euclid: 5 August 2011

Primary: 60G07: General theory of processes; 94A15: Information theory, general [See also 62B10, 81P94]

Keywords: concentration; entropy; log-concave distributions; asymptotic equipartition property; Shannon–McMillan–Breiman theorem


Bobkov, Sergey; Madiman, Mokshay. Concentration of the information in data with log-concave distributions. Ann. Probab. 39 (2011), no. 4, 1528--1543. doi:10.1214/10-AOP592.


  • [1] Algoet, P. H. and Cover, T. M. (1988). A sandwich proof of the Shannon–McMillan–Breiman theorem. Ann. Probab. 16 899–909.
  • [2] Barlow, R. E., Marshall, A. W. and Proschan, F. (1963). Properties of probability distributions with monotone hazard rate. Ann. Math. Statist. 34 375–389.
  • [3] Barron, A. R. (1985). The strong ergodic theorem for densities: Generalized Shannon–McMillan–Breiman theorem. Ann. Probab. 13 1292–1303.
  • [4] Bobkov, S. (1996). Extremal properties of half-spaces for log-concave distributions. Ann. Probab. 24 35–48.
  • [5] Bobkov, S. G. (2003). Spectral gap and concentration for some spherically symmetric probability measures. In Geometric Aspects of Functional Analysis. Lecture Notes in Math. 1807 37–43. Springer, Berlin.
  • [6] Bobkov, S. G. and Madiman, M. (2010). The entropy per coordinate of a random vector is highly constrained under convexity conditions. IEEE Trans. Inform. Theory. To appear.
  • [7] Bobkov, S. G. and Madiman, M. (2010). When can one invert Hölder’s inequality? (and why one may want to). Preprint.
  • [8] Borell, C. (1973). Complements of Lyapunov’s inequality. Math. Ann. 205 323–331.
  • [9] Borell, C. (1974). Convex measures on locally convex spaces. Ark. Mat. 12 239–252.
  • [10] Breiman, L. (1957). The individual ergodic theorem of information theory. Ann. Math. Statist. 28 809–811. [See also the correction: Ann. Math. Statist. 31 (1960) 809–810.]
  • [11] Cover, T. M. and Pombra, S. (1989). Gaussian feedback capacity. IEEE Trans. Inform. Theory 35 37–43.
  • [12] Kannan, R., Lovász, L. and Simonovits, M. (1995). Isoperimetric problems for convex bodies and a localization lemma. Discrete Comput. Geom. 13 541–559.
  • [13] Karlin, S., Proschan, F. and Barlow, R. E. (1961). Moment inequalities of Pólya frequency functions. Pacific J. Math. 11 1023–1033.
  • [14] Kieffer, J. C. (1974). A simple proof of the Moy–Perez generalization of the Shannon–McMillan theorem. Pacific J. Math. 51 203–206.
  • [15] Klartag, B. and Milman, V. D. (2005). Geometry of log-concave functions and measures. Geom. Dedicata 112 169–182.
  • [16] Lovász, L. and Simonovits, M. (1993). Random walks in a convex body and an improved volume algorithm. Random Structures Algorithms 4 359–412.
  • [17] McMillan, B. (1953). The basic theorems of information theory. Ann. Math. Statist. 24 196–219.
  • [18] Moy, S.-t. C. (1961). Generalizations of Shannon–McMillan theorem. Pacific J. Math. 11 705–714.
  • [19] Orey, S. (1985). On the Shannon–Perez–Moy theorem. In Particle Systems, Random Media and Large Deviations (Brunswick, Maine, 1984). Contemporary Mathematics 41 319–327. Amer. Math. Soc., Providence, RI.
  • [20] Perez, A. (1964). Extensions of Shannon–McMillan’s limit theorem to more general stochastic processes. In Trans. Third Prague Conf. Information Theory, Statist. Decision Functions, Random Processes (Liblice, 1962) 545–574. Publ. House Czech. Acad. Sci., Prague.
  • [21] Shannon, C. E. (1948). A mathematical theory of communication. Bell System Tech. J. 27 379–423, 623–656.