Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration
K. Marton
Ann. Probab. 24(2): 857-866 (April 1996). DOI: 10.1214/aop/1039639365

Abstract

There is a simple inequality, due to Pinsker, between the variational distance and the informational divergence of probability measures defined on an arbitrary probability space. We shall consider probability measures on sequences drawn from countable alphabets and derive, from Pinsker's inequality, bounds on the $\bar{d}$-distance in terms of informational divergence. Such bounds can be used to prove the "concentration of measure" phenomenon for some nonproduct distributions.
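For context (not part of the original abstract, and with constants that vary by normalization across references), Pinsker's inequality in one common form reads

\[
d_{TV}(P,Q) \;=\; \sup_{A}\,\bigl|P(A)-Q(A)\bigr| \;\le\; \sqrt{\tfrac{1}{2}\,D(P\,\|\,Q)},
\]

and the prototypical bound of the kind described here, stated in the product-measure case, is

\[
\bar{d}(P,Q) \;\le\; \sqrt{\tfrac{1}{2n}\,D(P\,\|\,Q)}, \qquad Q = q_{1}\times\cdots\times q_{n},
\]

where $\bar{d}(P,Q)$ denotes the minimal expected normalized Hamming distance over couplings of $P$ and $Q$ on the length-$n$ sequence space. The paper's contribution is to extend bounds of this type beyond product measures, e.g. to Markov chains (per the keywords), from which measure concentration follows.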

Citation


K. Marton. "Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration." Ann. Probab. 24(2): 857-866, April 1996. https://doi.org/10.1214/aop/1039639365

Information

Published: April 1996
First available in Project Euclid: 11 December 2002

zbMATH: 0865.60017
MathSciNet: MR1404531
Digital Object Identifier: 10.1214/aop/1039639365

Subjects:
Primary: 60F10, 60G05, 60G70

Keywords: $\bar{d}$-distance, informational divergence, isoperimetric inequality, Markov chains, measure concentration

Rights: Copyright © 1996 Institute of Mathematical Statistics
