## The Annals of Applied Probability

### Comparison inequalities and fastest-mixing Markov chains

#### Abstract

We introduce a new partial order on the class of stochastically monotone Markov kernels having a given stationary distribution $\pi$ on a given finite partially ordered state space $\mathcal{X}$. When $K\preceq L$ in this partial order we say that $K$ and $L$ satisfy a comparison inequality. We establish that if $K_{1},\ldots,K_{t}$ and $L_{1},\ldots,L_{t}$ are reversible and $K_{s}\preceq L_{s}$ for $s=1,\ldots,t$, then $K_{1}\cdots K_{t}\preceq L_{1}\cdots L_{t}$. In particular, in the time-homogeneous case we have $K^{t}\preceq L^{t}$ for every $t$ if $K$ and $L$ are reversible and $K\preceq L$, and using this we show that (for suitable common initial distributions) the Markov chain $Y$ with kernel $K$ mixes faster than the chain $Z$ with kernel $L$, in the strong sense that at every time $t$ the discrepancy between the law of $Y_{t}$ and $\pi$ (measured by total variation distance, separation, or $L^{2}$-distance) is smaller than that between the law of $Z_{t}$ and $\pi$.
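The three discrepancy measures named above are all elementary to compute for a small chain. The following is an illustrative numerical sketch (assuming Python with NumPy; the particular kernel and all names are hypothetical examples chosen for this sketch, not taken from the paper): a reversible birth-and-death kernel on $\{0,1,2\}$ with uniform stationary distribution, and the total variation, separation, and $L^{2}(\pi)$ distances between the time-$t$ law started at $0$ and $\pi$.

```python
import numpy as np

# A small reversible birth-and-death kernel on {0, 1, 2} with uniform
# stationary distribution pi: probability 1/2 across each edge, holding
# probability 1/2 at each endpoint (an instance of the paper's uniform chain).
K = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])
pi = np.full(3, 1 / 3)  # uniform stationary distribution

def discrepancies(K, pi, t, start=0):
    """Return (total variation, separation, L2(pi)) distance between
    the time-t law of the chain started at `start` and pi."""
    p = np.linalg.matrix_power(K, t)[start]            # law of Y_t
    tv = 0.5 * np.abs(p - pi).sum()                    # total variation
    sep = np.max(1 - p / pi)                           # separation
    l2 = np.sqrt(np.sum((p / pi - 1) ** 2 * pi))       # L2(pi) distance
    return tv, sep, l2

for t in (0, 1, 5, 20):
    print(t, discrepancies(K, pi, t))
```

All three distances decay to $0$ as $t$ grows, since this kernel is aperiodic and irreducible with the stated stationary distribution.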

Using comparison inequalities together with specialized arguments to remove the stochastic monotonicity restriction, we answer a question of Persi Diaconis by showing that, among all symmetric birth-and-death kernels on the path $\mathcal{X}=\{0,\ldots,n\}$, the one (we call it the uniform chain) that produces fastest convergence from initial state $0$ to the uniform distribution has transition probability $1/2$ in each direction along each edge of the path, with holding probability $1/2$ at each endpoint.
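As a hedged numerical illustration of this result (Python with NumPy assumed; the path length and the lazier competitor kernel below are arbitrary choices made for this sketch, not examples from the paper), one can construct the uniform chain explicitly and observe that, started at $0$, its total variation distance from the uniform distribution is never larger than that of another symmetric birth-and-death chain:

```python
import numpy as np

def bd_kernel(n, p):
    """Symmetric birth-and-death kernel on {0, ..., n}: probability p of
    moving in each direction along each incident edge; the leftover mass
    is the holding probability at that state."""
    K = np.zeros((n + 1, n + 1))
    for i in range(n + 1):
        if i > 0:
            K[i, i - 1] = p
        if i < n:
            K[i, i + 1] = p
        K[i, i] = 1.0 - K[i].sum()
    return K

def tv_from_zero(K, t):
    """Total variation distance between the time-t law started at 0
    and the uniform distribution on the state space."""
    m = K.shape[0]
    p = np.linalg.matrix_power(K, t)[0]
    return 0.5 * np.abs(p - 1.0 / m).sum()

n = 10
uniform_chain = bd_kernel(n, 0.5)   # the paper's uniform chain
lazy_chain = bd_kernel(n, 1 / 3)    # an arbitrary slower competitor

for t in (1, 10, 50, 100):
    print(t, tv_from_zero(uniform_chain, t), tv_from_zero(lazy_chain, t))
```

Note that `bd_kernel(n, 0.5)` has holding probability $1/2$ exactly at the endpoints and $0$ at interior states, matching the description of the uniform chain; the two chains' distances coincide at very small $t$ (while both laws still dominate the uniform measure on their common support) and separate thereafter.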

We also use comparison inequalities:

(i) to identify, when $\pi$ is a given log-concave distribution on the path, the fastest-mixing stochastically monotone birth-and-death chain started at $0$, and

(ii) to recover and extend a result of Peres and Winkler that extra updates do not delay mixing for monotone spin systems.

Among the fastest-mixing chains in (i), we show that the chain for uniform $\pi$ is slowest in the sense of maximizing separation at every time.

#### Article information

Source
Ann. Appl. Probab., Volume 23, Number 5 (2013), 1778–1816.

Dates
First available in Project Euclid: 28 August 2013

https://projecteuclid.org/euclid.aoap/1377696298

Digital Object Identifier
doi:10.1214/12-AAP886

Mathematical Reviews number (MathSciNet)
MR3114917

Zentralblatt MATH identifier
1288.60089

#### Citation

Fill, James Allen; Kahn, Jonas. Comparison inequalities and fastest-mixing Markov chains. Ann. Appl. Probab. 23 (2013), no. 5, 1778–1816. doi:10.1214/12-AAP886. https://projecteuclid.org/euclid.aoap/1377696298

#### References

• [1] Aldous, D. and Diaconis, P. (1987). Strong uniform times and finite random walks. Adv. in Appl. Math. 8 69–97.
• [2] Aldous, D. J. and Fill, J. A. Reversible Markov chains and random walks on graphs. Chapter drafts available at http://www.stat.Berkeley.EDU/users/aldous/book.html.
• [3] Benjamini, I., Berger, N., Hoffman, C. and Mossel, E. (2005). Mixing times of the biased card shuffling and the asymmetric exclusion process. Trans. Amer. Math. Soc. 357 3013–3029 (electronic).
• [4] Boyd, S., Diaconis, P., Parrilo, P. and Xiao, L. (2009). Fastest mixing Markov chain on graphs with symmetries. SIAM J. Optim. 20 792–819.
• [5] Boyd, S., Diaconis, P., Sun, J. and Xiao, L. (2006). Fastest mixing Markov chain on a path. Amer. Math. Monthly 113 70–74.
• [6] Boyd, S., Diaconis, P. and Xiao, L. (2004). Fastest mixing Markov chain on a graph. SIAM Rev. 46 667–689.
• [7] Caputo, P., Liggett, T. M. and Richthammer, T. (2010). Proof of Aldous’ spectral gap conjecture. J. Amer. Math. Soc. 23 831–851.
• [8] Diaconis, P. and Fill, J. A. (1990). Strong stationary times via a new form of duality. Ann. Probab. 18 1483–1522.
• [9] Diaconis, P. and Saloff-Coste, L. (1993). Comparison theorems for reversible Markov chains. Ann. Appl. Probab. 3 696–730.
• [10] Diekmann, R., Muthukrishnan, S. and Nayakkankuppam, M. V. (1997). Engineering diffusive load balancing algorithms using experiments. In Solving Irregularly Structured Problems in Parallel. Lecture Notes in Computer Science 1253 111–122. Springer, New York.
• [11] Feller, W. (1968). An Introduction to Probability Theory and Its Applications. Vol. I, 3rd ed. Wiley, New York.
• [12] Fiedler, M. (1990). Absolute algebraic connectivity of trees. Linear Multilinear Algebra 26 85–106.
• [13] Fill, J. A. (1988). Bounds on the coarseness of random sums. Ann. Probab. 16 1644–1664.
• [14] Fill, J. A. (1991). Time to stationarity for a continuous-time Markov chain. Probab. Engrg. Inform. Sci. 5 61–76.
• [15] Fill, J. A. (1998). An interruptible algorithm for perfect sampling via Markov chains. Ann. Appl. Probab. 8 131–162.
• [16] Fill, J. A., Machida, M., Murdoch, D. J. and Rosenthal, J. S. (2000). Extension of Fill’s perfect rejection sampling algorithm to general chains. Random Structures Algorithms 17 290–316.
• [17] Holroyd, A. E. (2011). Some circumstances where extra updates can delay mixing. J. Stat. Phys. 145 1649–1652.
• [18] Karlin, S. and Taylor, H. M. (1975). A First Course in Stochastic Processes, 2nd ed. Academic Press, New York.
• [19] Lange, L. H. and Miller, J. W. (1992). A random ladder game: Permutations, eigenvalues, and convergence of Markov chains. College Math. J. 23 373–385.
• [20] Lovász, L. and Winkler, P. (1995). Mixing of random walks and other diffusions on a graph. In Surveys in Combinatorics, 1995 (Stirling). London Mathematical Society Lecture Note Series 218 119–154. Cambridge Univ. Press, Cambridge.
• [21] Marshall, A. W. and Olkin, I. (1979). Inequalities: Theory of Majorization and Its Applications. Mathematics in Science and Engineering 143. Academic Press, New York.
• [22] Peres, Y. (2005). Mixing for Markov chains and spin systems. Lecture notes, Univ. British Columbia. Summary available at http://www.stat.berkeley.edu/~peres/ubc.pdf.
• [23] Peres, Y. and Winkler, P. (2011). Can extra updates delay mixing? Preprint. Available at arXiv:1112.0603v1 [math.PR].
• [24] Propp, J. and Wilson, D. (1998). Coupling from the past: A user’s guide. In Microsurveys in Discrete Probability (Princeton, NJ, 1997). DIMACS Series in Discrete Mathematics and Theoretical Computer Science 41 181–192. Amer. Math. Soc., Providence, RI.
• [25] Propp, J. G. and Wilson, D. B. (1996). Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures Algorithms 9 223–252.
• [26] Propp, J. G. and Wilson, D. B. (1998). How to get a perfectly random sample from a generic Markov chain and generate a random spanning tree of a directed graph. J. Algorithms 27 170–217.
• [27] Roch, S. (2005). Bounding fastest mixing. Electron. Commun. Probab. 10 282–296 (electronic).
• [28] Saloff-Coste, L. and Zúñiga, J. (2007). Convergence of some time inhomogeneous Markov chains via spectral techniques. Stochastic Process. Appl. 117 961–979.
• [29] Saloff-Coste, L. and Zúñiga, J. (2009). Merging for time inhomogeneous finite Markov chains. I. Singular values and stability. Electron. J. Probab. 14 1456–1494.
• [30] Saloff-Coste, L. and Zúñiga, J. (2010). Time inhomogeneous Markov chains with wave-like behavior. Ann. Appl. Probab. 20 1831–1853.
• [31] Saloff-Coste, L. and Zúñiga, J. (2011). Merging for inhomogeneous finite Markov chains, Part II: Nash and log-Sobolev inequalities. Ann. Probab. 39 1161–1203.
• [32] Sun, J., Boyd, S., Xiao, L. and Diaconis, P. (2006). The fastest mixing Markov process on a graph and a connection to a maximum variance unfolding problem. SIAM Rev. 48 681–699.
• [33] Wilson, D. B. (1998). Annotated bibliography of perfectly random sampling with Markov chains. In Microsurveys in Discrete Probability (Princeton, NJ, 1997). DIMACS Series in Discrete Mathematics and Theoretical Computer Science 41 209–220. Amer. Math. Soc., Providence, RI. Latest updated version is posted at http://dbwilson.com/exact/.
• [34] Wilson, D. B. (2000). Layered multishift coupling for use in perfect sampling algorithms (with a primer on CFTP). In Monte Carlo Methods (Toronto, ON, 1998). Fields Institute Communications 26 143–179. Amer. Math. Soc., Providence, RI.