We prove an exponential decay concentration inequality to bound the tail probability of the difference between the log-likelihood of discrete random variables on a finite alphabet and the negative entropy. The concentration bound we derive holds uniformly over all parameter values. The new result improves the convergence rate in an earlier result of Zhao (2020), where n is the sample size and K is the size of the alphabet, and we further prove that the improved rate is optimal. The result is extended to misspecified log-likelihoods for grouped random variables. We give applications of the new result in information theory.
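To fix notation, the following is a minimal schematic sketch of the object of study under one natural reading of the abstract; it is not the paper's exact statement. The constants C and c, the square in the exponent, and the admissible range of epsilon are illustrative assumptions, while the supremum over p reflects the abstract's claim that the bound holds uniformly over all parameter values.

% Schematic sketch only; C, c, and the form of the exponent are
% illustrative assumptions, not the paper's constants or rates.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $X_1,\dots,X_n$ be i.i.d.\ draws from a distribution
$p=(p_1,\dots,p_K)$ on the finite alphabet $\{1,\dots,K\}$.
The normalized log-likelihood is
\[
  \ell_n(p) = \frac{1}{n}\sum_{i=1}^{n}\log p_{X_i},
  \qquad
  \mathbb{E}\,\ell_n(p) = \sum_{k=1}^{K} p_k\log p_k = -H(p),
\]
where $H(p)$ denotes the Shannon entropy. A uniform
exponential-decay concentration inequality then takes the
schematic form
\[
  \sup_{p}\,
  \mathbb{P}\bigl(\lvert \ell_n(p) + H(p)\rvert \ge \varepsilon\bigr)
  \le C\exp\bigl(-c\,n\,\varepsilon^{2}\bigr)
\]
for constants $C,c>0$ and $\varepsilon$ in a range depending on $K$
and $n$; the precise rate conditions are given in the paper.
\end{document}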
This research was supported by the National Science Foundation under grant DMS-1840203.
The author thanks the editor, the associate editor, and an anonymous referee for their constructive feedback.
"An optimal uniform concentration inequality for discrete entropies on finite alphabets in the high-dimensional setting." Bernoulli 28 (3) 1892 - 1911, August 2022. https://doi.org/10.3150/21-BEJ1403