Open Access
Concentration of information content for convex measures
Matthieu Fradelizi, Jiange Li, Mokshay Madiman
Electron. J. Probab. 25: 1-22 (2020). DOI: 10.1214/20-EJP416

Abstract

We establish sharp exponential deviation estimates of the information content as well as a sharp bound on the varentropy for the class of convex measures on Euclidean spaces. This generalizes a similar development for log-concave measures in the recent work of Fradelizi, Madiman and Wang (2016). In particular, our results imply that convex measures in high dimension are concentrated in an annulus between two convex sets (as in the log-concave case) despite their possibly having much heavier tails. Various tools and consequences are developed, including a sharp comparison result for Rényi entropies, inequalities of Kahane-Khinchine type for convex measures that extend those of Koldobsky, Pajor and Yaskin (2008) for log-concave measures, and an extension of Berwald’s inequality (1947).
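For context, the "information content" and "varentropy" discussed in the abstract are standard information-theoretic quantities; for a random vector $X$ with density $f$ on $\mathbb{R}^n$, they are commonly defined as follows (the notation here is illustrative and may differ from the paper's):

```latex
% Information content (a random variable), entropy, and varentropy
\tilde{h}(X) = -\log f(X), \qquad
h(X) = \mathbb{E}\,\tilde{h}(X) = -\int_{\mathbb{R}^n} f \log f, \qquad
V(X) = \operatorname{Var}\big(\tilde{h}(X)\big).
```

Concentration of information content then refers to exponential deviation bounds for $\tilde{h}(X)$ around $h(X)$, with the varentropy $V(X)$ controlling the fluctuation scale.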

Citation


Matthieu Fradelizi, Jiange Li, Mokshay Madiman. "Concentration of information content for convex measures." Electron. J. Probab. 25: 1-22, 2020. https://doi.org/10.1214/20-EJP416

Information

Received: 10 February 2019; Accepted: 10 January 2020; Published: 2020
First available in Project Euclid: 6 February 2020

zbMATH: 1445.60028
MathSciNet: MR4073681
Digital Object Identifier: 10.1214/20-EJP416

Subjects:
Primary: 60F10, 62B10

Keywords: concentration, convex measures, entropy, information content, log-concave
