An optimal uniform concentration inequality for discrete entropies on finite alphabets in the high-dimensional setting
Yunpeng Zhao
Bernoulli 28(3): 1892-1911 (August 2022). DOI: 10.3150/21-BEJ1403

Abstract

We prove an exponential-decay concentration inequality to bound the tail probability of the difference between the log-likelihood of discrete random variables on a finite alphabet and the negative entropy. The concentration bound we derive holds uniformly over all parameter values. The new result improves the convergence rate in an earlier result of Zhao (2020), from $(K^2 \log K)/n = o(1)$ to $(\log K)^2/n = o(1)$, where $n$ is the sample size and $K$ is the size of the alphabet. We further prove that the rate $(\log K)^2/n = o(1)$ is optimal. The result is extended to misspecified log-likelihoods for grouped random variables. We give applications of the new result in information theory.
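
To fix ideas, here is a minimal illustrative simulation, not the paper's method; the alphabet size, the Dirichlet-generated distribution, and the sample sizes below are arbitrary choices. For i.i.d. draws $X_1,\dots,X_n$ from a distribution $p$ on a $K$-letter alphabet, the normalized log-likelihood $\frac{1}{n}\sum_{i} \log p(X_i)$ has expectation $-H(p)$, where $H(p) = -\sum_{k} p_k \log p_k$, so the deviation between the two shrinks as $n$ grows:

```python
import numpy as np

# Illustrative simulation of the concentration described in the abstract:
# the normalized log-likelihood of an i.i.d. sample concentrates around
# the negative entropy -H(p). All numeric choices are arbitrary.
rng = np.random.default_rng(0)

K = 50                              # alphabet size (arbitrary)
p = rng.dirichlet(np.ones(K))       # a distribution on the K-letter alphabet
H = -np.sum(p * np.log(p))          # Shannon entropy H(p)

for n in (100, 10_000, 1_000_000):
    x = rng.choice(K, size=n, p=p)  # i.i.d. sample X_1, ..., X_n from p
    loglik = np.mean(np.log(p[x]))  # (1/n) * sum_i log p(X_i)
    print(f"n={n:>9}: loglik={loglik:+.4f}, -H(p)={-H:+.4f}, "
          f"|diff|={abs(loglik + H):.4f}")
```

The simulation only illustrates the law-of-large-numbers behavior underlying the result; the paper's contribution is a nonasymptotic exponential tail bound on this deviation that holds uniformly over all parameter values.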

Funding Statement

This research was supported by the National Science Foundation grant DMS-1840203.

Acknowledgements

The author thanks the editor, the associate editor, and an anonymous referee for their constructive feedback.

Citation

Yunpeng Zhao. "An optimal uniform concentration inequality for discrete entropies on finite alphabets in the high-dimensional setting." Bernoulli 28(3): 1892-1911, August 2022. https://doi.org/10.3150/21-BEJ1403

Information

Received: 1 July 2020; Published: August 2022
First available in Project Euclid: 25 April 2022

MathSciNet: MR4411515
zbMATH: 1489.60035
Digital Object Identifier: 10.3150/21-BEJ1403

Keywords: concentration inequality, entropy, log-likelihood, non-convex optimization, source coding theorem, typical set
