Annales de l'Institut Henri Poincaré, Probabilités et Statistiques

Existence of Stein kernels under a spectral gap, and discrepancy bounds

Thomas A. Courtade, Max Fathi, and Ashwin Pananjady

Abstract

We establish existence of Stein kernels for probability measures on $\mathbb{R}^{d}$ satisfying a Poincaré inequality, and obtain bounds on the Stein discrepancy of such measures. Applications to quantitative central limit theorems are discussed, including a new central limit theorem in the Kantorovich–Wasserstein distance $W_{2}$ with optimal rate and dependence on the dimension. As a byproduct, we obtain a stable version of an estimate of the Poincaré constant of probability measures under a second moment constraint. The results extend more generally to the setting of converse weighted Poincaré inequalities. The proof is based on simple arguments of functional analysis.
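For readers meeting these objects for the first time, the following standard definitions may help fix ideas; they are not spelled out in the abstract itself, and the notation $C_{P}$, $\tau_{\nu}$ and $S(\nu\mid\gamma)$ is introduced here only for convenience. A probability measure $\nu$ on $\mathbb{R}^{d}$ satisfies a Poincaré inequality with constant $C_{P}$ if
$$\operatorname{Var}_{\nu}(f)\;\le\; C_{P}\int |\nabla f|^{2}\,d\nu$$
for all smooth $f$. In one common formulation, a matrix-valued map $\tau_{\nu}$ is a Stein kernel for a centered measure $\nu$ if
$$\int \langle x, f(x)\rangle\,d\nu(x)\;=\;\int \langle \tau_{\nu}(x), \nabla f(x)\rangle_{\mathrm{HS}}\,d\nu(x)$$
for all smooth vector fields $f:\mathbb{R}^{d}\to\mathbb{R}^{d}$, and the Stein discrepancy of $\nu$ with respect to the standard Gaussian $\gamma$ is
$$S(\nu\mid\gamma)^{2}\;=\;\inf_{\tau_{\nu}}\int \|\tau_{\nu}-\mathrm{Id}\|_{\mathrm{HS}}^{2}\,d\nu,$$
the infimum running over all Stein kernels of $\nu$. The discrepancy vanishes exactly when the constant kernel $\mathrm{Id}$ works, which by Gaussian integration by parts characterizes the standard Gaussian.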

Further, we establish two general properties enjoyed by the Stein discrepancy, holding whenever a Stein kernel exists: Stein discrepancy is strictly decreasing along the CLT, and it controls the third moments of a random vector.
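To see why a discrepancy bound yields a central limit theorem at rate $n^{-1/2}$, here is a heuristic sketch under illustrative assumptions (i.i.d. $X_{1},\dots,X_{n}$, centered, with identity covariance, admitting a Stein kernel $\tau$); the exact statements and constants are in the full text. Writing $S_{n}=n^{-1/2}\sum_{i=1}^{n}X_{i}$,
$$W_{2}(\mathrm{law}(S_{n}),\gamma)\;\le\; S(\mathrm{law}(S_{n})\mid\gamma)\;\le\;\frac{S(\mathrm{law}(X_{1})\mid\gamma)}{\sqrt{n}}.$$
The first inequality is the known comparison between the Wasserstein distance $W_{2}$ and the Stein discrepancy. The second follows because the conditional average $\mathbb{E}\bigl[\tfrac{1}{n}\sum_{i}\tau(X_{i})\mid S_{n}\bigr]$ is a Stein kernel for $S_{n}$, and Jensen's inequality together with independence (and $\mathbb{E}[\tau(X_{1})]=\mathrm{Id}$ in the isotropic case) gives the $1/\sqrt{n}$ decay. The contribution described in the abstract is a bound on $S(\mathrm{law}(X_{1})\mid\gamma)$ in terms of the Poincaré constant and the dimension, which plugs directly into this chain.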

Résumé

We prove the existence of Stein kernels for probability measures on $\mathbb{R}^{d}$ satisfying a Poincaré inequality, and obtain bounds on the Stein discrepancy of such measures. Applications to the central limit theorem are given, including a new bound on the rate of convergence in the Kantorovich–Wasserstein distance $W_{2}$, with optimal rate and optimal dependence on the dimension. As a corollary, we obtain a quantitative version of a bound on the Poincaré constant of probability measures satisfying a second moment constraint. The results hold more generally for measures satisfying a converse weighted Poincaré inequality. The proof is based on simple arguments from functional analysis.

Moreover, we prove two general properties of the Stein discrepancy, valid whenever a Stein kernel exists: the Stein discrepancy is strictly decreasing along the CLT, and it controls the third moment of a random vector.

Article information

Source
Ann. Inst. H. Poincaré Probab. Statist., Volume 55, Number 2 (2019), 777–790.

Dates
Received: 14 April 2017
Revised: 26 February 2018
Accepted: 15 March 2018
First available in Project Euclid: 14 May 2019

Permanent link to this document
https://projecteuclid.org/euclid.aihp/1557820831

Digital Object Identifier
doi:10.1214/18-AIHP898

Subjects
Primary: 60F05: Central limit and other weak theorems
Secondary: 60B10: Convergence of probability measures
           60E15: Inequalities; stochastic orderings

Keywords
Stein kernels; Quantitative central limit theorems; Poincaré inequalities

Citation

Courtade, Thomas A.; Fathi, Max; Pananjady, Ashwin. Existence of Stein kernels under a spectral gap, and discrepancy bounds. Ann. Inst. H. Poincaré Probab. Statist. 55 (2019), no. 2, 777–790. doi:10.1214/18-AIHP898. https://projecteuclid.org/euclid.aihp/1557820831

