The Annals of Probability
Ann. Probab., Volume 24, Number 2 (1996), 857-866.
Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration
Abstract
There is a simple inequality by Pinsker between variational distance and informational divergence of probability measures defined on arbitrary probability spaces. We shall consider probability measures on sequences taken from countable alphabets, and derive, from Pinsker's inequality, bounds on the $\bar{d}$-distance by informational divergence. Such bounds can be used to prove the "concentration of measure" phenomenon for some nonproduct distributions.
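As background for the abstract, Pinsker's inequality in one standard normalization bounds total variation distance by informational divergence; the $\bar{d}$-bounds the paper derives are of the same square-root type. The second display below shows the classical product-measure form of such a bound as an illustrative sketch only; the constant is the standard one from the literature, not quoted from this paper:

```latex
% Pinsker's inequality: total variation vs. informational divergence
\[
  \lVert P - Q \rVert_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, D(P \,\Vert\, Q)}.
\]
% For measures on n-fold product spaces, bounds on the \bar{d}-distance
% take the analogous form (background sketch; classical product-measure case):
\[
  \bar{d}(P, Q) \;\le\; \sqrt{\tfrac{1}{2n}\, D(P \,\Vert\, Q)}.
\]
```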
Article information
Dates
First available in Project Euclid: 11 December 2002
Permanent link to this document
http://projecteuclid.org/euclid.aop/1039639365
Digital Object Identifier
doi:10.1214/aop/1039639365
Mathematical Reviews number (MathSciNet)
MR1404531
Zentralblatt MATH identifier
0865.60017
Subjects
Primary:
- 60F10: Large deviations
- 60G70: Extreme value theory; extremal processes
- 60G05: Foundations of stochastic processes
Keywords
Measure concentration; isoperimetric inequality; Markov chains; $\bar{d}$-distance; informational divergence
Citation
Marton, K. Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration. Ann. Probab. 24 (1996), no. 2, 857--866. doi:10.1214/aop/1039639365. http://projecteuclid.org/euclid.aop/1039639365.

