RATE OF ESCAPE OF THE MIXER CHAIN

The mixer chain on a graph $G$ is the following Markov chain: place tiles on the vertices of $G$, each tile labeled by its corresponding vertex. A “mixer” moves randomly on the graph, at each step either moving to a randomly chosen neighbor, or swapping the tile at its current position with some randomly chosen adjacent tile. We study the mixer chain on $\mathbb{Z}$, and show that at time $t$ the expected distance to the origin is $t^{3/4}$, up to constants. This is a new example of a random walk on a group with rate of escape strictly between $t^{1/2}$ and $t$.

After formally defining the mixer chain on general groups, we study the mixer chain on $\mathbb{Z}$. Our main result, Theorem 2.1, shows that the mixer chain on $\mathbb{Z}$ has degree of escape 3/4. It is not difficult to show (perhaps using ideas from this note) that on transient groups the mixer chain has degree of escape 1. Since all recurrent groups are essentially $\mathbb{Z}$ and $\mathbb{Z}^2$, it seems that the mixer chain on other groups cannot give examples of other degrees of escape. As for $\mathbb{Z}^2$, one can show that the mixer chain has degree of escape 1. In fact, the ideas in this note suggest that the distance to the origin in the mixer chain on $\mathbb{Z}^2$ is $n \log^{-1/2}(n)$ up to constants (we conjecture that this is the case). For the reader interested in logarithmic corrections to the rate of escape, in [2] Erschler gave examples of rates of escape that are almost linear, with a variety of logarithmic corrections. Logarithmic corrections are interesting because linear rate of escape is equivalent to the existence of non-constant bounded harmonic functions, to a non-trivial Poisson boundary, and to the positivity of the associated entropy; see [3].

After introducing some notation, we provide a formal definition of the mixer chain as a random walk on a Cayley graph. The generalization to general graphs is immediate.
Acknowledgements. I wish to thank Itai Benjamini for suggesting this construction, and for useful discussions. I also wish to thank an anonymous referee for pointing out useful references.

Notation
Let $G$ be a group and $U$ a generating set for $G$, such that if $x \in U$ then $x^{-1} \in U$ ($U$ is called symmetric). The Cayley graph of $G$ with respect to $U$ is the graph with vertex set $G$ and edge set $\{\{g,h\} : g^{-1}h \in U\}$. Let $\mu$ be a distribution on $U$. Then we can define the random walk on $G$ (with respect to $U$ and $\mu$) as the Markov chain with state space $G$ and transition matrix $P(g,h) = \mathbf{1}\{g^{-1}h \in U\}\,\mu(g^{-1}h)$. We follow the convention that such a process starts from the identity element of $G$.

A permutation of $G$ is a bijection from $G$ to $G$. The support of a permutation $\sigma$, denoted $\mathrm{supp}(\sigma)$, is the set of all elements $g \in G$ such that $\sigma(g) \neq g$. Let $\Sigma$ be the group of all permutations of $G$ with finite support (multiplication is composition of functions). By $\langle g, h \rangle$ we denote the transposition of $g$ and $h$; that is, the permutation $\sigma$ with support $\{g, h\}$ such that $\sigma(g) = h$ and $\sigma(h) = g$. By $\langle g_1, g_2, \ldots, g_n \rangle$ we denote the cyclic permutation $\sigma$ with support $\{g_1, \ldots, g_n\}$, such that $\sigma(g_j) = g_{j+1}$ for $j < n$ and $\sigma(g_n) = g_1$.

To an element $g \in G$ we associate a canonical permutation, denoted $\varphi_g$, defined by $\varphi_g(h) = gh$ for all $h \in G$. It is straightforward to verify that the map $g \mapsto \varphi_g$ is a homomorphism of groups, and so we use $g$ to denote $\varphi_g$. Although $g \notin \Sigma$ (for infinite $G$, the support of $\varphi_g$ is infinite), we have that $g \sigma g^{-1} \in \Sigma$ for all $\sigma \in \Sigma$.

We now define a new group, which is in fact the semi-direct product of $G$ and $\Sigma$ with respect to the homomorphism $g \mapsto \varphi_g$ mentioned above. The group is denoted by $G \ltimes \Sigma$, and its elements are the pairs in $G \times \Sigma$. Group multiplication is defined by
$$(g, \sigma)(h, \tau) = (gh,\; \sigma\, g \tau g^{-1}).$$
We leave it to the reader to verify that this is a well-defined group operation. Note that the identity element in this group is $(e, \mathrm{id})$, where $\mathrm{id}$ is the identity permutation in $\Sigma$ and $e$ is the identity element in $G$. Also, the inverse of $(g, \sigma)$ is $(g^{-1}, g^{-1} \sigma^{-1} g)$.

We use $d(g,h) = d_{G,U}(g,h)$ to denote the distance between $g$ and $h$ in the group $G$ with respect to the generating set $U$; i.e., the minimal $k$ such that $g^{-1}h = \prod_{j=1}^{k} u_j$ for some $u_1, \ldots, u_k \in U$. The generating set also provides us with a graph structure: $g$ and $h$ are said to be adjacent if $d(g,h) = 1$, that is, if $g^{-1}h \in U$. A path $\gamma$ in $G$ (with respect to the generating set $U$) is a sequence $(\gamma_0, \gamma_1, \ldots, \gamma_n)$ such that $\gamma_{j-1}$ and $\gamma_j$ are adjacent for all $1 \le j \le n$. $|\gamma|$ denotes the length of the path, which is defined as the length of the sequence minus 1 (in this case $|\gamma| = n$).
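For concreteness, the group $G \ltimes \Sigma$ can be implemented directly for $G = \mathbb{Z}$. The sketch below stores a finite-support permutation as a dict that omits fixed points, and assumes the semi-direct-product multiplication $(g, \sigma)(h, \tau) = (gh,\, \sigma\, g\tau g^{-1})$, which is the rule consistent with the inverse formula $(g,\sigma)^{-1} = (g^{-1}, g^{-1}\sigma^{-1}g)$ stated in the text; all function names are hypothetical.

```python
# Sketch of G ⋉ Σ for G = Z. Finite-support permutations are dicts
# {x: σ(x)} omitting fixed points. Multiplication is assumed to be
# (g, σ)(h, τ) = (g + h, σ ∘ (g τ g⁻¹)), consistent with the stated
# inverse (g, σ)⁻¹ = (g⁻¹, g⁻¹ σ⁻¹ g).

def apply(sigma, x):
    """Apply a finite-support permutation (dict) to x."""
    return sigma.get(x, x)

def compose(sigma, tau):
    """Return σ∘τ as a finite-support dict (fixed points dropped)."""
    keys = set(sigma) | set(tau)
    out = {x: apply(sigma, apply(tau, x)) for x in keys}
    return {x: y for x, y in out.items() if x != y}

def conjugate(g, tau):
    """Return φ_g τ φ_g⁻¹, i.e. the map x ↦ g + τ(x - g)."""
    return {g + x: g + y for x, y in tau.items()}

def mult(a, b):
    """(g, σ)(h, τ) = (g + h, σ ∘ (g τ g⁻¹))  [assumed rule]."""
    (g, sigma), (h, tau) = a, b
    return (g + h, compose(sigma, conjugate(g, tau)))

def inverse(a):
    """(g, σ)⁻¹ = (-g, g⁻¹ σ⁻¹ g), as stated in the text."""
    g, sigma = a
    sigma_inv = {y: x for x, y in sigma.items()}
    return (-g, conjugate(-g, sigma_inv))

if __name__ == "__main__":
    e = (0, {})
    a = (2, {0: 1, 1: 0})          # mixer at 2, tiles 0 and 1 swapped
    b = (-1, {3: 4, 4: 3})
    # check associativity and inverses on these sample elements
    assert mult(mult(a, b), a) == mult(a, mult(b, a))
    assert mult(a, inverse(a)) == e and mult(inverse(a), a) == e
```

The conjugation $g\tau g^{-1}$ simply translates the support of $\tau$ by $g$, which is why the second coordinate stays a finite-support permutation even though $\varphi_g$ itself has infinite support.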

We are now ready to define the mixer chain. Let $G$ be a group with finite symmetric generating set $U$. The mixer chain on $G$ (with respect to $U$) is the random walk on the group $G \ltimes \Sigma$ with respect to the uniform measure on the generating set $\Upsilon = \{(u, \mathrm{id}),\, (e, \langle e, u \rangle) \,:\, u \in U\}$.
An equivalent way of viewing this chain is the following: in the state $(g, \sigma) \in G \ltimes \Sigma$, the first coordinate corresponds to the position of the mixer on $G$, and the second coordinate corresponds to the placement of the different tiles, so the tile marked $x$ is placed on the vertex $\sigma(x)$. By Definition 1.2, the mixer chooses uniformly an adjacent vertex of $G$, say $h$. Then, with probability 1/2 the mixer swaps the tiles on $h$ and $g$, and with probability 1/2 it moves to $h$. The identity element in $G \ltimes \Sigma$ is $(e, \mathrm{id})$, so the mixer starts at $e$ with all tiles on their corresponding vertices (the identity permutation).
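The step just described can be simulated directly. A minimal sketch for $G = \mathbb{Z}$ (function names hypothetical; tile placements are kept in a dict that omits vertices still holding their own tile):

```python
import random

def mixer_step(pos, tiles, rng):
    """One step of the mixer chain on Z.

    pos   -- current position of the mixer
    tiles -- dict mapping vertex -> tile currently placed there
             (vertices absent from the dict hold their own tile)
    Picks a uniform neighbor h of pos; with probability 1/2 the mixer
    moves to h, with probability 1/2 it swaps the tiles at pos and h.
    """
    h = pos + rng.choice([-1, 1])
    if rng.random() < 0.5:
        return h, tiles                      # move
    tiles = dict(tiles)
    tiles[pos], tiles[h] = tiles.get(h, h), tiles.get(pos, pos)
    if tiles[pos] == pos:                    # drop fixed points
        del tiles[pos]
    if tiles.get(h) == h:
        del tiles[h]
    return pos, tiles                        # swap, mixer stays put

def run(t, seed=0):
    """Run the chain for t steps from (0, id); return the final state."""
    rng = random.Random(seed)
    pos, tiles = 0, {}
    for _ in range(t):
        pos, tiles = mixer_step(pos, tiles, rng)
    return pos, tiles
```

Choosing a uniform neighbor and then moving or swapping with probability 1/2 each is exactly the uniform measure on the four generators $(\pm 1, \mathrm{id})$, $(0, \langle e, \pm 1\rangle)$.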

Distance Bounds
In this section we show that the distance of an element in G ⋉ Σ to (e, id) is essentially governed by the sum of the distances of each individual tile to its origin.
The covering number of $(g, \sigma)$, denoted $\mathrm{Cov}(g, \sigma)$, is the minimal length of a path $\gamma$, starting at $g$, that covers $\sigma$; i.e., the minimal $|\gamma|$ over all paths $\gamma$ with $\gamma_0 = g$ and $\mathrm{supp}(\sigma) \subseteq \{\gamma_0, \ldots, \gamma_{|\gamma|}\}$.
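For intuition, the covering number can be computed by brute force on small examples over $\mathbb{Z}$. The sketch below (names hypothetical) runs a breadth-first search over states (mixer position, set of support vertices not yet visited):

```python
from collections import deque

def covering_number(g, support):
    """Minimal length of a path on Z starting at g that visits every
    vertex in `support` (BFS over states (position, uncovered set))."""
    todo = frozenset(support) - {g}
    if not todo:
        return 0
    start = (g, todo)
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        (pos, left), steps = queue.popleft()
        for nxt in (pos - 1, pos + 1):
            nleft = left - {nxt}
            if not nleft:
                return steps + 1
            state = (nxt, nleft)
            if state not in seen:
                seen.add(state)
                queue.append((state, steps + 1))
```

On $\mathbb{Z}$ the minimum is achieved by first sweeping to the nearer end of the support and then to the other end, so for support contained in $[a, b]$ and $g \in [a,b]$ the answer is $(b - a) + \min(g - a,\, b - g)$.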

The Mixer Chain on $\mathbb{Z}$
We now consider the mixer chain on $\mathbb{Z}$, with $\{1, -1\}$ as the symmetric generating set. We denote by $(\omega_t)_{t \ge 0} = (S_t, \sigma_t)_{t \ge 0}$ the mixer chain on $\mathbb{Z}$.
For $\omega \in \mathbb{Z} \ltimes \Sigma$ we denote by $D(\omega)$ the distance of $\omega$ from $(0, \mathrm{id})$ (with respect to the generating set $\Upsilon$; see Definition 1.2). Denote by $D_t = D(\omega_t)$ the distance of the chain at time $t$ from the origin.
As stated above, we show that the mixer chain on $\mathbb{Z}$ has degree of escape 3/4. In fact, we prove slightly stronger bounds on the distance to the origin at time $t$.

The proof of Theorem 2.1 is in Section 3.
For $z \in \mathbb{Z}$, denote by $X_t(z) = |\sigma_t(z) - z|$ the distance of the tile marked $z$ to its origin at time $t$.
Set $X_t = \sum_{z \in \mathbb{Z}} X_t(z)$, which is a finite sum for any given $t$. As shown in Propositions 1.3 and 1.4, $X_t$ approximates $D_t$ up to certain factors.
$V_t(z)$ is the number of times that the mixer visits the tile marked $z$, up to time $t$.
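Both quantities are easy to read off a simulation. The sketch below (names hypothetical, and counting a "visit" as a time step at which the mixer stands on the vertex holding tile $z$) returns the pair $(X_t(z), V_t(z))$:

```python
import random

def track_tile(z, t, seed=0):
    """Run the mixer chain on Z for t steps and return
    (X_t(z), V_t(z)): the distance of tile z from its origin,
    and the number of times the mixer visited tile z."""
    rng = random.Random(seed)
    pos, tile_at = 0, {}            # tile_at: vertex -> tile on it
    visits = 0
    for _ in range(t):
        # the mixer "visits tile z" when it stands on the vertex holding z
        if tile_at.get(pos, pos) == z:
            visits += 1
        h = pos + rng.choice([-1, 1])
        if rng.random() < 0.5:
            pos = h                 # move
        else:                       # swap the tiles at pos and h
            tile_at[pos], tile_at[h] = tile_at.get(h, h), tile_at.get(pos, pos)
    # locate the current position of tile z
    where = next((v for v, x in tile_at.items() if x == z), z)
    return abs(where - z), visits
```

A tile far from the mixer's range is never visited and never moves, which matches the intuition that only roughly the tiles in $[-\sqrt{t}, \sqrt{t}]$ contribute to $X_t$.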

Distribution of $X_t(z)$
The following proposition states that the "mirror image" of the mixer chain has the same distribution as the original chain. We omit a formal proof, as the proposition follows from the symmetry of the walk.
Essentially the proof is as follows. Consider the successive times at which the mixer visits the tile marked $z$. The movement of the tile at these times is a lazy random walk, with the number of steps equal to the number of visits. The difference between the position of the tile at time $t$ and its position at the last visit is at most 1, and the difference between the position of the tile at time 0 and its position at the first visit is at most 1. $B$ is the random variable that accounts for these two differences.

⊓ ⊔
We continue with the proof of Lemma 2.3.
We have the equality of events $\{V_t(z) = k\} = \{T_k(z) \le t < T_{k+1}(z)\}$. Let $t_1 < t_2 < \cdots < t_k < t_{k+1}$, and condition on the event $\{T_1(z) = t_1, \ldots, T_{k+1}(z) = t_{k+1}\}$. Assume further that $t_k \le t < t_{k+1}$, so that $V_t(z) = k$.

Proof. Let $W_t$ be a lazy random walk on $\mathbb{Z}$. Note that $2W_t$ has the same distribution as $S'_{2t}$, where $S'_t$ is a simple random walk on $\mathbb{Z}$. It is well known (see e.g. [5]) that there exist universal constants $c_1, C_1 > 0$ such that for all $t \ge 0$,
$$c_1 \sqrt{t} \;\le\; \mathbb{E}\,|W_t| \;\le\; C_1 \sqrt{t}.$$
By Lemma 2.3, we know that for any $k \ge 0$, $\mathbb{E}[X_t(z) \mid V_t(z) = k]$ differs from $\mathbb{E}\,|W_k|$ by at most 2. Thus, summing over all $k$, there exist constants $c_2, C_2 > 0$ such that
$$c_2\, \mathbb{E}\sqrt{V_t(z)} - 2 \;\le\; \mathbb{E}\,X_t(z) \;\le\; C_2\, \mathbb{E}\sqrt{V_t(z)} + 2. \qquad \sqcap\!\sqcup$$

Lemma 2.6. Let $S'_t$ be a simple random walk on $\mathbb{Z}$ started at $S'_0 = 0$, and let $L_t(z)$ be the number of times $S'$ visits $z$ up to time $t$. Then, for any $z \in \mathbb{Z}$ and any $k \in \mathbb{N}$,
$$\Pr[V_t(z) \ge k] \;\ge\; \Pr[L_{2t}(2z) \ge k].$$

Proof. Observe that $V_t(z)$ is the number of times $M_t$ visits $z$ up to time $t$, where $M_t$ is a Markov chain on $\mathbb{Z}$ with the following step distribution.
Specifically, $M_t$ is simple symmetric when at $z$, lazy symmetric when not adjacent to $z$, and has a drift towards $z$ when adjacent to $z$. Define $N_t$ to be the following Markov chain on $\mathbb{Z}$: $N_0 = 0$, and for all $t \ge 0$, $N_t$ is simple symmetric at $z$, and lazy symmetric when not at $z$. Let $V'_t(z)$ be the number of times $N_t$ visits $z$ up to time $t$. Define inductively $\rho_0 = \rho'_0 = 0$ and, for $j \ge 0$,
$$\rho_{j+1} = \min\{t \ge 1 : M_{\rho_1 + \cdots + \rho_j + t} = z\} \quad \text{and} \quad \rho'_{j+1} = \min\{t \ge 1 : N_{\rho'_1 + \cdots + \rho'_j + t} = z\};$$
that is, $\rho_j$ and $\rho'_j$ are the times between consecutive visits of $M$ and $N$ to $z$.

If $z < M_t \le N_t$, then $M_{t+1}$ moves towards $z$ with probability at least that of $N_{t+1}$, and they both move away from $z$ with probability $1/4$. Thus, we can couple $M_{t+1}$ and $N_{t+1}$ so that $M_{t+1} \le N_{t+1}$. Similarly, if $N_t \le M_t < z$ then $M_{t+1}$ moves towards $z$ with higher probability than $N_{t+1}$, and they both move away from $z$ with probability $1/4$. So we can couple $M_{t+1}$ and $N_{t+1}$ so that $M_{t+1} \ge N_{t+1}$. If $N_t = M_t = z$ then $M_{t+1}$ and $N_{t+1}$ have the same distribution, so they can be coupled so that $N_{t+1} = M_{t+1}$. Thus, we can couple $M_t$ and $N_t$ so that for all $j \ge 0$, $\rho_j \le \rho'_j$ a.s.

Let $S'_t$ be a simple random walk on $\mathbb{Z}$. For $x \in \mathbb{Z}$, let $\tau_{2x}$ be the first time a simple random walk started at $2x$ hits $2z$ (this is necessarily an even number). In [5, Chapter 9] it is shown that $\tau_{2x}$ has the same distribution as $\tau_{2z - 2|z-x|}$. Note that if $N_t \neq z$ then $S'_{2t+2} - S'_{2t}$ has the same distribution as $2(N_{t+1} - N_t)$. Since $|N_{\rho'_1 + \cdots + \rho'_{j-1} + 1} - z| = 1$, we get that for all $j \ge 2$, $\rho'_j$ has the same distribution as $\frac{1}{2}\tau_{2z-2} + 1$. Also, $\rho'_1$ has the same distribution as $\frac{1}{2}\tau_0$ if $z \neq 0$, and the same distribution as $\frac{1}{2}\tau_{2z-2} + 1$ if $z = 0$. Hence, we conclude that for any $k \ge 1$, $\sum_{j=1}^{k} \rho'_j$ has the same distribution as $\frac{1}{2}\sum_{j=1}^{k} \hat\rho_j$, where $(\hat\rho_j)_{j \ge 1}$ are defined for $j \ge 0$ (with the empty sum equal to 0) by
$$\hat\rho_{j+1} = \min\{2t \ge 2 : S'_{\hat\rho_1 + \cdots + \hat\rho_j + 2t} = 2z\}.$$
Finally, note that $V_t(z) \ge k$ if and only if $\sum_{j=1}^{k} \rho_j \le t$; $V'_t(z) \ge k$ if and only if $\sum_{j=1}^{k} \rho'_j \le t$; and $L_t(2z) \ge k$ if and only if $\sum_{j=1}^{k} \hat\rho_j \le t$. Thus, under the above coupling, for all $t \ge 0$, $V_t(z) \ge V'_t(z)$ a.s. Also, $V'_t(z)$ has the same distribution as $L_{2t}(2z)$. The lemma follows. ⊓⊔
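The time-change identity invoked in the proof — a lazy random walk doubled, $2W_t$, has the same law as a simple random walk run at twice the speed, $S'_{2t}$ — can be checked exactly for small $t$ by convolving the one-step kernels. A minimal sketch (function names hypothetical; distributions stored as dicts):

```python
from collections import defaultdict

def step(dist, kernel):
    """One convolution step: dist is {site: prob}, kernel {increment: prob}."""
    out = defaultdict(float)
    for x, p in dist.items():
        for dx, q in kernel.items():
            out[x + dx] += p * q
    return dict(out)

LAZY = {0: 0.5, 1: 0.25, -1: 0.25}   # lazy walk W
SIMPLE = {1: 0.5, -1: 0.5}           # simple walk S'

def lazy_doubled(t):
    """Exact distribution of 2*W_t."""
    d = {0: 1.0}
    for _ in range(t):
        d = step(d, LAZY)
    return {2 * x: p for x, p in d.items()}

def simple_at(t):
    """Exact distribution of S'_t."""
    d = {0: 1.0}
    for _ in range(t):
        d = step(d, SIMPLE)
    return d

if __name__ == "__main__":
    for t in range(6):
        a, b = lazy_doubled(t), simple_at(2 * t)
        assert set(a) == set(b)
        assert all(abs(a[x] - b[x]) < 1e-12 for x in a)
```

The check succeeds because a single lazy step doubled and two simple steps have the same increment law ($0$ with probability $1/2$, $\pm 2$ with probability $1/4$ each), and increments are independent across steps.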