The Annals of Probability

Rényi divergence and the central limit theorem

S. G. Bobkov, G. P. Chistyakov, and F. Götze



We explore properties of the $\chi^{2}$ and Rényi distances to the normal law and, in particular, give necessary and sufficient conditions under which these distances tend to zero in the central limit theorem (with exact rates in the growing number of summands).
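For reference, the distances named in the abstract have standard definitions (notation here is ours, following the usual conventions as in [40] and [49], not quoted from the paper): the Rényi divergence of order $\alpha$ between distributions $P$ and $Q$ with densities $p$ and $q$, and the $\chi^{2}$-divergence, which is the $\alpha=2$ case up to a logarithm.

```latex
% Renyi divergence of order \alpha (for \alpha > 0, \alpha \neq 1):
D_\alpha(P\|Q) \;=\; \frac{1}{\alpha-1}\,
  \log \int_{-\infty}^{\infty} p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx .

% The chi-square divergence and its relation to the order-2 case:
\chi^2(P\|Q) \;=\; \int_{-\infty}^{\infty} \frac{p(x)^2}{q(x)}\, dx \;-\; 1
  \;=\; e^{D_2(P\|Q)} \,-\, 1 .
```

In particular, $\chi^2(P\|Q) \to 0$ if and only if $D_2(P\|Q) \to 0$, so statements about the $\chi^2$ distance to the normal law translate directly into statements about the Rényi divergence of order 2.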

Article information

Ann. Probab., Volume 47, Number 1 (2019), 270-323.

Received: October 2016
Revised: January 2018
First available in Project Euclid: 13 December 2018


Primary: 60E; 60F15 (strong theorems)

Keywords: $\chi^{2}$-divergence; Rényi and Tsallis entropies; central limit theorem
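The convergence described in the abstract can be illustrated numerically in a case where everything is explicit. The sum of $n$ i.i.d. Uniform$(0,1)$ variables has the Irwin–Hall density, so the $\chi^{2}$-distance of the standardized sum to the standard normal can be computed by quadrature; since the summands are bounded, this distance is finite for every $n$. The sketch below (function names and grid size are our illustrative choices, not from the paper) simply evaluates $\chi^2(p_n\|\varphi) = \int p_n^2/\varphi - 1$ on a fine grid and lets one watch it shrink as $n$ grows.

```python
import math

import numpy as np


def irwin_hall_pdf(x, n):
    """Density of the sum of n i.i.d. Uniform(0,1) variables (Irwin-Hall)."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    inside = (x >= 0.0) & (x <= n)
    xs = x[inside]
    acc = np.zeros_like(xs)
    for k in range(n + 1):
        # Terms of the alternating-sum formula; each term vanishes for xs < k.
        acc += np.where(
            xs >= k,
            (-1.0) ** k * math.comb(n, k) * np.maximum(xs - k, 0.0) ** (n - 1),
            0.0,
        )
    out[inside] = acc / math.factorial(n - 1)
    return out


def chi2_to_normal(n, grid=200_001):
    """Estimate chi^2(p_n || phi), where p_n is the density of the
    standardized sum of n Uniform(0,1) variables and phi is N(0,1)."""
    sigma = math.sqrt(n / 12.0)   # standard deviation of the sum is sqrt(n/12)
    L = math.sqrt(3.0 * n)        # the standardized sum is supported on [-L, L]
    z = np.linspace(-L, L, grid)
    p = sigma * irwin_hall_pdf(n / 2.0 + sigma * z, n)
    phi = np.exp(-z ** 2 / 2.0) / math.sqrt(2.0 * math.pi)
    f = p ** 2 / phi
    # Trapezoidal rule by hand (avoids np.trapz/np.trapezoid naming changes).
    return float(np.sum((f[1:] + f[:-1]) * np.diff(z)) / 2.0) - 1.0


if __name__ == "__main__":
    for n in (2, 4, 8, 16):
        print(n, chi2_to_normal(n))
```

This only probes one symmetric, bounded example; the point of the paper is to characterize exactly when such convergence holds and at which rate, which no single numerical experiment can settle.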


Bobkov, S. G.; Chistyakov, G. P.; Götze, F. Rényi divergence and the central limit theorem. Ann. Probab. 47 (2019), no. 1, 270--323. doi:10.1214/18-AOP1261.



  • [1] Amosova, N. N. (1990). Narrow zones of local normal attraction. Teor. Veroyatn. Primen. 35 138–143. Translation in: Theory Probab. Appl. 35 (1990) 140–145 (1991).
  • [2] Amosova, N. N. (1990). A remark on a local limit theorem for large deviations. Teor. Veroyatn. Primen. 35 754–756. Translation in: Theory Probab. Appl. 35 (1990), 758–760 (1991).
  • [3] Artstein, S., Ball, K. M., Barthe, F. and Naor, A. (2004). On the rate of convergence in the entropic central limit theorem. Probab. Theory Related Fields 129 381–390.
  • [4] Artstein, S., Ball, K. M., Barthe, F. and Naor, A. (2004). Solution of Shannon’s problem on the monotonicity of entropy. J. Amer. Math. Soc. 17 975–982.
  • [5] Bally, V. and Caramellino, L. (2016). Asymptotic development for the CLT in total variation distance. Bernoulli 22 2442–2485.
  • [6] Barron, A. R. (1986). Entropy and the central limit theorem. Ann. Probab. 14 336–342.
  • [7] Bhattacharya, R. N. and Ranga Rao, R. (1976). Normal Approximation and Asymptotic Expansions. Wiley, New York.
  • [8] Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2011). Non-uniform bounds in local limit theorems in case of fractional moments. I. Math. Methods Statist. 20 171–191.
  • [9] Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2013). Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem. Ann. Probab. 41 2479–2512.
  • [10] Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2014). Berry–Esseen bounds in the entropic central limit theorem. Probab. Theory Related Fields 159 435–478.
  • [11] Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2014). Fisher information and the central limit theorem. Probab. Theory Related Fields 159 1–59.
  • [12] Bobkov, S. G., Chistyakov, G. P. and Kösters, H. (2015). The entropic Erdös–Kac limit theorem. J. Theoret. Probab. 28 1520–1555.
  • [13] Bobkov, S. G. and Götze, F. (1999). Exponential integrability and transportation cost related to logarithmic Sobolev inequalities. J. Funct. Anal. 163 1–28.
  • [14] Bobkov, S. G., Houdré, C. and Tetali, P. (2006). The subgaussian constant and concentration inequalities. Israel J. Math. 156 255–283.
  • [15] Borland, L., Plastino, A. R. and Tsallis, C. (1998). Information gain within nonextensive thermostatistics. J. Math. Phys. 39 6490–6501.
  • [16] Carlen, E. A. (1991). Superadditivity of Fisher’s information and logarithmic Sobolev inequalities. J. Funct. Anal. 101 194–211.
  • [17] Cramér, H. (1925). On some classes of series used in mathematical statistics. In Proc. 6th Scand. Math. Congr. Copenhagen 399–425. Also: Harald Cramér (1994) Collected Works, Vol. I (A. Martin-Löf, ed.) 438–464. Springer, Berlin.
  • [18] Csiszár, I. (1967). Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 299–318.
  • [19] Dembo, A., Cover, T. M. and Thomas, J. A. (1991). Information-theoretic inequalities. IEEE Trans. Inform. Theory 37 1501–1518.
  • [20] Fomin, S. V. (1982). The central limit theorem: Convergence in the norm $\|u\|=(\int_{-\infty}^{\infty}u^{2}(x)e^{x^{2}/2}\,dx)^{1/2}$. Zap. Nauchn. Sem. Leningrad. Otdel. Mat. Inst. Steklov. (LOMI) 119 218–229, 242, 245. Problems of the theory of probability distribution, VII.
  • [21] Gibbs, A. L. and Su, F. E. (2002). On choosing and bounding probability metrics. Int. Stat. Rev. 70 419–435.
  • [22] Gilardoni, G. L. (2010). On Pinsker’s and Vajda’s type inequalities for Csiszár’s $f$-divergences. IEEE Trans. Inform. Theory 56 5377–5386.
  • [23] Hirschman, I. I. and Widder, D. V. (1955). The Convolution Transform. Princeton Univ. Press, Princeton, NJ.
  • [24] Ibragimov, I. A. and Linnik, Yu. V. (1971). Independent and Stationary Sequences of Random Variables. Wolters-Noordhoff Publishing, Groningen.
  • [25] Johnson, O. (2004). Information Theory and the Central Limit Theorem. Imperial College Press, London.
  • [26] Johnson, O. and Barron, A. (2004). Fisher information inequalities and the central limit theorem. Probab. Theory Related Fields 129 391–409.
  • [27] Kullback, S. (1967). A lower bound for discrimination in terms of variation. IEEE Trans. Inform. Theory 13 126–127.
  • [28] Kullback, S. and Leibler, R. A. (1951). On information and sufficiency. Ann. Math. Stat. 22 79–86.
  • [29] Le Cam, L. (1986). Asymptotic Methods in Statistical Decision Theory. Springer, New York.
  • [30] Lieb, E. H. (1975). Some convexity and subadditivity properties of entropy. Bull. Amer. Math. Soc. 81 1–13.
  • [31] Liese, F. and Vajda, I. (1987). Convex Statistical Distances. Teubner-Texte zur Mathematik [Teubner Texts in Mathematics] 95. BSB B. G. Teubner Verlagsgesellschaft, Leipzig.
  • [32] Linnik, Ju. V. (1959). An information-theoretic proof of the central limit theorem with Lindeberg conditions. Theory Probab. Appl. 4 288–299.
  • [33] Madiman, M. and Barron, A. (2007). Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inform. Theory 53 2317–2329.
  • [34] Nielsen, F. (2014). On the Chi square and higher-order Chi distances for approximating $f$-divergences. IEEE Signal Process. Lett. 21 10–13.
  • [35] Otto, F. and Villani, C. (2000). Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality. J. Funct. Anal. 173 361–400.
  • [36] Petrov, V. V. (1964). Local limit theorems for sums of independent random variables. Teor. Veroyatn. Primen. 9 343–352.
  • [37] Petrov, V. V. (1975). Sums of Independent Random Variables. Ergebnisse der Mathematik und ihrer Grenzgebiete 82. Springer, New York.
  • [38] Pinsker, M. S. (1964). Information and Information Stability of Random Variables and Processes. Holden-Day, Inc., San Francisco, CA. Translated and edited by Amiel Feinstein.
  • [39] Prokhorov, Yu. V. (1952). A local theorem for densities. Dokl. Akad. Nauk SSSR (N.S.) 83 797–800.
  • [40] Rényi, A. (1961). On measures of entropy and information. In Proc. 4th Berkeley Sympos. Math. Statist. and Prob., Vol. I 547–561. Univ. California Press, Berkeley, CA.
  • [41] Shiryaev, A. N. (1996). Probability, 2nd ed. Graduate Texts in Mathematics 95. Springer, New York. Translated from the first (1980) Russian edition by R. P. Boas.
  • [42] Siraždinov, S. H. and Mamatov, M. (1962). On mean convergence for densities. Theory Probab. Appl. 7 424–428.
  • [43] Szegö, G. Orthogonal Polynomials, 3rd ed. American Math. Soc. Publications 23. Amer. Math. Soc., Providence, RI.
  • [44] Toscani, G. (2016). Entropy inequalities for stable densities and strengthened central limit theorems. J. Stat. Phys. 165 371–389.
  • [45] Toscani, G. (2016). The fractional Fisher information and the central limit theorem for stable laws. Ric. Mat. 65 71–91.
  • [46] Tsallis, C. (1998). Generalized entropy-biased criterion for consistent testing. Phys. Rev. E 58 1442–1445.
  • [47] Tulino, A. M. and Verdú, S. (2006). Monotonic decrease of the non-Gaussianness of the sum of independent random variables: A simple proof. IEEE Trans. Inform. Theory 52 4295–4297.
  • [48] Vajda, I. (1989). Theory of Statistical Inference and Information. Kluwer Academic, Dordrecht.
  • [49] van Erven, T. and Harremoës, P. (2014). Rényi divergence and Kullback–Leibler divergence. IEEE Trans. Inform. Theory 60 3797–3820.