Connectedness of the Free Uniform Spanning Forest as a function of edge weights

Let G be the Cartesian product of a regular tree T and a finite connected transitive graph H. It is shown in [3] that the Free Uniform Spanning Forest (FSF) of this graph may not be connected, but the dependence of this connectedness on H remains somewhat mysterious. We study the case when a positive weight w is put on the edges of the H-copies in G, and conjecture that the connectedness of the FSF exhibits a phase transition. For large enough w we show that the FSF is connected, while for a large family of H and T, the FSF is disconnected when w is small (relying on [3]). Finally, we prove that when H is the graph of one edge, then for any w the FSF is a single tree, and we give an explicit formula for the distribution of the distance between two points within the tree.


Introduction
Consider some finite graph H with a weight function ("conductances") ŵ : E(H) → R_0^+ on its edges. One may take an unweighted graph and view it as one where the weights are constant 1. Choose a spanning tree of H at random, where the probability of a spanning tree T is proportional to ∏_{e∈E(T)} ŵ(e). The probability measure so defined is called the Uniform Spanning Tree (UST) of (H, ŵ). For a given infinite graph G and conductances ŵ, consider some exhaustion of G by a sequence of connected finite graphs G_n, and let UST(G_n) be the UST of the weighted graph (G_n, ŵ|_{G_n}). It is known that the weak limit of UST(G_n) exists [3], meaning that for any e_1, . . . , e_k, f_1, . . . , f_m ∈ E(G), the probability P(e_1, . . . , e_k ∈ UST(G_n), f_1, . . . , f_m ∉ UST(G_n)) converges, and to the same limit for any choice of the sequence G_n. The limiting measure is called the Free Uniform Spanning Forest (FUSF or FSF) of (G, ŵ). See [2] for background, references, and the basic properties of the FSF.
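As a quick illustration of the definition, the weighted UST of a small graph can be computed directly by enumerating its spanning trees. This is a minimal sketch of our own (the example graph, its weights, and the helper names are not from the text): the probability of each tree is proportional to the product of its edge weights.

```python
from itertools import combinations

def spanning_trees(vertices, edges):
    """Yield all spanning trees (as edge tuples) of a graph given as a
    list of (u, v, weight) triples, by brute-force enumeration."""
    n = len(vertices)
    for subset in combinations(edges, n - 1):
        parent = {v: v for v in vertices}
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        merged = 0
        for u, v, _ in subset:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                merged += 1
        if merged == n - 1:          # n-1 merges: acyclic and connected
            yield subset

def ust_distribution(vertices, edges):
    """Probability of each spanning tree, proportional to the product
    of its edge weights."""
    weight = {}
    for tree in spanning_trees(vertices, edges):
        p = 1.0
        for _, _, w in tree:
            p *= w
        weight[tree] = p
    total = sum(weight.values())
    return {t: p / total for t, p in weight.items()}

# Triangle with one heavy edge: the two trees using the heavy edge each
# have probability 4/9, the remaining tree only 1/9.
edges = [("a", "b", 1.0), ("b", "c", 1.0), ("a", "c", 4.0)]
dist = ust_distribution(["a", "b", "c"], edges)
for tree, p in sorted(dist.items(), key=lambda kv: -kv[1]):
    print([u + v for u, v, _ in tree], round(p, 3))
```

On the triangle above the three spanning trees have weights 4, 4 and 1, so the UST picks them with probabilities 4/9, 4/9 and 1/9.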
It was generally expected that the FSF of "tree-like graphs" would consist of a single tree, until Gábor Pete and the last author showed in [4] that for suitably chosen d and connected finite transitive graph H, the Cartesian product T_d □ H of the d-regular tree T_d and H has a disconnected FSF. From the proof, however, it is not clear what happens to the disconnectedness of the FSF if we make some natural changes to d and H, e.g., increase d with a fixed H, or fix d and take a lift of H. No monotonicity result of this type is known. A question in the same spirit is to ask how the connectedness of the FSF changes if we put a constant positive weight w on every edge of the H-copies in T_d □ H and then vary w. What happens if w is very small or large? Does there always exist some w where there is a single FSF component? Is there always some w where there are infinitely many FSF components (with an H that has at least 2 vertices)? Is there any kind of monotonicity in w, and perhaps even a critical value that separates the phases of disconnectedness and connectedness? The present paper contributes to the understanding of these questions. In particular, initial steps are taken towards Conjecture 1.1.
Let H be a finite connected graph and T_d the d-regular tree. For an arbitrary given w > 0, define the weight function ŵ on the edges of G = T_d □ H so that ŵ(e) = 1 if e is of the form {(x_1, y), (x_2, y)}, and ŵ(e) = w if e is of the form {(x, y_1), (x, y_2)}. Define FSF_w(G) as the FSF of (G, ŵ).
Conjecture 1.1. If FSF_{w'} and FSF_{w''} are connected for some w'' > w' > 0, then FSF_w is connected for every w ∈ [w', w'']. A similar statement holds for disconnectedness. Moreover, there exists a γ ∈ [0, ∞] such that FSF_w has a unique component whenever w > γ, and FSF_w has infinitely many components whenever w < γ.
We mention that having more than one component automatically implies having infinitely many for a much wider class of transitive graphs than the ones considered here ([1], [6]; see [4] for a short direct proof for the special product graphs that we consider here).
The simplest nontrivial example of a graph of the form T_d □ H is the case when H = K_2 is a single edge. It is not clear what to expect: on the one hand, the graph may be "too close" to the tree to produce a disconnected FSF; on the other hand, one may speculate that for small enough w the relatively large weighted degree of the tree could cause a phenomenon similar to that in [4] and make the FSF fall apart. Pengfei Tang has shown in [5] that the FSF is connected in the unweighted case for T_d □ K_2. (His proof was worked out for a slightly different graph, but it is mentioned in [5] that a similar argument can be applied to G = T_d □ K_2.) The method of [4], which needs H to be relatively large, did not give an insight into this special case either. We settle this question for H = K_2 through an enumeration, which will also enable us to bound the decay of the distance between two points (Lemma 2.10).
Theorem 1.2. For every w > 0, the forest FSF_w(T_d □ K_2) is almost surely a single tree.
Tang's method in Subsection 5.2 of [5], using effective resistance bounds, seems to be adaptable to show Theorem 1.2, but it would not give as precise a quantitative result on the connectivity within the FSF as Lemma 2.10. Theorem 1.2 shows that the phase transition in Conjecture 1.1 is trivial when H = K_2, in the sense that γ = 0. As we will see in the next part of the paper, such degeneracy can never happen with γ = ∞. Namely, we verify Conjecture 1.1 for large enough w < ∞, with only the assumption that H is regular, finite and connected. The matter of how typical γ = 0 may be is unclear to us; the open questions and Theorem 1.1 in [4] are certainly related to this. As a further contribution towards Conjecture 1.1, we roughly sketch how the arguments in [4] can be adapted to a large family of T_d and H, as in the next theorem.
Theorem 1.3. For every finite connected regular graph H and d-regular tree T_d, there exists a W < ∞ such that for every w > W the forest FSF_w(T_d □ H) is connected. Conversely, if H is transitive, d is large enough compared to the degree in H, and |H| > d^{5/2}, then for every w ≤ 1 the forest FSF_w(T_d □ H) has infinitely many components almost surely.

Notation
Denote by t(G) the (weighted) number of spanning trees of a finite graph G. Let T □ wH be the graph obtained as the Cartesian product of T and H, with weight w on the edges of the form {(x, y_1), (x, y_2)} and weight 1 on the rest of the edges. As a shorthand throughout Section 2, we use T̂ = T □ wK_2 for any graph T. In a product T □ wH we call the subgraphs of the form {v} □ wH, v ∈ V(T), bags. Let T_n denote the ball of radius n around a fixed vertex u in T_d. With a (convenient) slight redundancy, the FSF of T □ wH is the same object as the FSF_w of T □ H.
We will rely on one particular consequence of Wilson's algorithm on finite graphs [7], namely that for a finite connected graph G, the path between points a, b ∈ V(G) within UST(G) has the same distribution as the loop-erased random walk path LERW_G(a → b) from a to b, which is constructed as follows. Run a random walk in G starting from a until it hits b, and erase all the loops in the order of their appearance along the walk, to obtain a simple path from a to b. The same link between the UST and the LERW holds when G has positive edge weights, in which case the random walk on this network, with the weights as conductances, is understood in place of the simple random walk. See Chapter 4.1 of [2] for more details. In general, for an arbitrary walk (X_1, . . . , X_n), LE(X_1, . . . , X_n) will denote its loop-erasure. Let u ∈ T_d be a fixed vertex, and recall that T_n is the ball of radius n around u.
Definition 2.1. Define the perfect (d − 1)-ary tree with height n recursively in the following way. A perfect (d − 1)-ary tree with height 0 is a single vertex, the root. For n > 0, a perfect (d − 1)-ary tree with height n has a root, which is connected with the roots of d − 1 pieces of perfect (d − 1)-ary trees with height n − 1.
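The loop-erasure operation itself is elementary and can be sketched in a few lines of code (the helper name `loop_erase` is ours, not from the paper): whenever the walk revisits a vertex currently on the path, the cycle in between is removed.

```python
def loop_erase(walk):
    """Erase loops from a finite walk in order of appearance: whenever
    the walk revisits a vertex already on the current path, the cycle
    in between is removed.  Returns the resulting simple path."""
    path = []
    position = {}            # vertex -> its index on `path`
    for v in walk:
        if v in position:    # closing a loop: cut back to the first visit
            cut = position[v]
            for u in path[cut + 1:]:
                del position[u]
            path = path[:cut + 1]
        else:
            position[v] = len(path)
            path.append(v)
    return path

# The walk below closes a loop at b and then a larger loop at a,
# so only the final excursion survives.
print(loop_erase(["a", "b", "c", "b", "d", "a", "e"]))   # -> ['a', 'e']
```

To sample LERW_G(a → b) one would feed an actual random walk trajectory (stopped at b) into this function; here a fixed walk suffices to show the mechanics.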
For brevity, from now on we call the perfect (d − 1)-ary tree simply a perfect tree, and we denote the height-n perfect tree by A_n.
An alternative way to define the perfect tree is that if we delete an edge incident with u from T n , then the component containing u is A n (and the other component is an A n−1 ).
Let o be the root of A_n, and let e denote the edge within the bag of o in Â_n. For 1 ≤ i ≤ d − 1, let Â_{n−1}^i be the copy of Â_{n−1} corresponding to the i-th subtree of the root, and let G_i be the subgraph of Â_n spanned by the bag of o, the two edges between this bag and Â_{n−1}^i, and Â_{n−1}^i itself. Note that each edge of Â_n is in exactly one of the G_i's except e, which is contained in all of the G_i's.

Definition 2.2.
As a shorthand for t(Â_n) we use a_n, and we let a'_n denote the weighted number of spanning trees of Â_n containing e.
We prove recursive formulas for these quantities.
Lemma 2.3. a'_n = w (2a_{n−1} + a'_{n−1}/w)^{d−1}.
Proof. The multiplier w comes from the weight w on e. Note that a subgraph T of Â_n is a spanning tree containing e if and only if T ∩ G_i is a spanning tree of G_i containing e for all 1 ≤ i ≤ d − 1. So we need to count the weighted number of spanning trees of G_i containing e (not multiplying by the weight w of e, as we have already counted the weight of e). Let T be such a spanning tree. Consider the two edges between the bag of o and Â_{n−1}^i. If exactly one of them is in T, then T ∩ Â_{n−1}^i must be a spanning tree of Â_{n−1}^i, which gives 2a_{n−1} weighted options. If both of them are in T, then T ∩ Â_{n−1}^i must be a spanning tree of Â_{n−1}^i containing the edge within the bag of the root of Â_{n−1}^i, minus this edge; so this is (1/w) a'_{n−1} weighted options. Therefore, independently for each G_i, we have 2a_{n−1} + (1/w) a'_{n−1} weighted possibilities. The conclusion follows.
Lemma 2.4. a_n = a'_n + (d − 1) a_{n−1} (2a_{n−1} + a'_{n−1}/w)^{d−2}.
Proof. The weighted number of spanning trees that contain e is a'_n. It is easy to see that T is a spanning tree that does not contain e if and only if T ∩ G_i is a spanning tree of G_i not containing e for some i (see an example in the figure), and for all j ≠ i, T ∩ G_j is a graph not containing e such that (T ∩ G_j) ∪ {e} is a spanning tree of G_j. In the former case both edges between the bag of o and Â_{n−1}^i are in T and T ∩ Â_{n−1}^i is a spanning tree of Â_{n−1}^i, so this is a_{n−1} weighted options, with d − 1 choices for i. For the other j's we need to count the weighted number of spanning trees of G_j containing the edge e (without the weight of e), which is exactly what we did in Lemma 2.3, so we have 2a_{n−1} + (1/w) a'_{n−1} weighted possibilities for each, from which the proof is complete.
Lemma 2.5. t(T̂_n) = 2 a_n a_{n−1} + (1/w)(a'_n a_{n−1} + a_n a'_{n−1}).
Proof. T_n can be constructed by taking an A_n graph and an A_{n−1} graph and connecting their roots. In each spanning tree T of T̂_n, either T ∩ Â_n or T ∩ Â_{n−1} is a spanning tree, or both (see an example in the figures). If both, then we have 2 options to connect them, so this is 2 a_n a_{n−1} weighted options. If Â_n ∩ T is a spanning tree but Â_{n−1} ∩ T is disconnected, then we have to put both edges between Â_n and Â_{n−1} into T, and as in Lemma 2.3, we can think of Â_{n−1} ∩ T as a spanning tree containing the edge between the 2 vertices in the bag of the root of Â_{n−1}, minus this edge; so this is (1/w) a_n a'_{n−1} weighted possibilities. In the same way we get (1/w) a'_n a_{n−1} for the third case. Summing these we get the desired result.
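The recursions above are easy to sanity-check numerically. The following sketch does this for d = 3 and n = 1: it enumerates all weighted spanning trees of Â_1 and T̂_1 by brute force and compares the totals with the formulas of Lemmas 2.3-2.5 (the graph encodings and the helper name `tree_weight_sum` are ours).

```python
from itertools import combinations

def tree_weight_sum(vertices, edges, forced=()):
    """Weighted number of spanning trees, optionally restricted to trees
    containing all edges in `forced` (brute force; small graphs only)."""
    n, total = len(vertices), 0.0
    for subset in combinations(edges, n - 1):
        if any(f not in subset for f in forced):
            continue
        parent = {v: v for v in vertices}
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        merged = 0
        for u, v, _ in subset:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                merged += 1
        if merged == n - 1:
            p = 1.0
            for _, _, wt in subset:
                p *= wt
            total += p
    return total

d, w = 3, 1.7

# Recursions: a'_n = w(2a_{n-1} + a'_{n-1}/w)^{d-1},
#             a_n  = a'_n + (d-1) a_{n-1} (2a_{n-1} + a'_{n-1}/w)^{d-2},
# with base case (a hat A_0 is a single bag): a_0 = a'_0 = w.
a0 = ap0 = w
ap1 = w * (2 * a0 + ap0 / w) ** (d - 1)
a1 = ap1 + (d - 1) * a0 * (2 * a0 + ap0 / w) ** (d - 2)

# hat A_1 for d = 3: root bag (r0, r1) plus two child bags.
e = ("r0", "r1", w)
A1_edges = [e, ("x0", "x1", w), ("y0", "y1", w),
            ("r0", "x0", 1.0), ("r1", "x1", 1.0),
            ("r0", "y0", 1.0), ("r1", "y1", 1.0)]
A1_vertices = ["r0", "r1", "x0", "x1", "y0", "y1"]
assert abs(tree_weight_sum(A1_vertices, A1_edges) - a1) < 1e-9
assert abs(tree_weight_sum(A1_vertices, A1_edges, forced=(e,)) - ap1) < 1e-9

# hat T_1 for d = 3: central bag plus three leaf bags; Lemma 2.5 predicts
# t(hat T_1) = 2 a_1 a_0 + (a'_1 a_0 + a_1 a'_0)/w.
T1_edges = [("u0", "u1", w)]
T1_vertices = ["u0", "u1"]
for b in "xyz":
    T1_vertices += [b + "0", b + "1"]
    T1_edges += [(b + "0", b + "1", w), ("u0", b + "0", 1.0), ("u1", b + "1", 1.0)]
predicted = 2 * a1 * a0 + (ap1 * a0 + a1 * ap0) / w
assert abs(tree_weight_sum(T1_vertices, T1_edges) - predicted) < 1e-9
print("recursions match brute force")
```

For w = 1 one can even check by hand: Â_1 is a theta graph with 15 spanning trees, 9 of which contain e, matching a_1 = 15 and a'_1 = 9.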
Let t_m(T̂_n) be the weighted number of spanning trees of T̂_n in which the unique path from (u, 0) to (u, 1) touches exactly m bags. Note that as a bag only contains 2 vertices, there are not many options for a path between (u, 0) and (u, 1). The only way for a path to touch m bags is to take m − 1 moves along tree edges, going into the m-th bag, then in the m-th step to move within the bag, and then to take m − 1 steps back up along tree edges.
Lemma 2.6. For m ≥ 2 and n > m,
t_m(T̂_n) = d(d − 1)^{m−2} w (2a_{n−1} + a'_{n−1}/w)(2a_{n−m} + a'_{n−m}/w) ∏_{i=1}^{m} (2a_{n−i} + a'_{n−i}/w)^{d−2},
and in the m = 1 case, for n > 1, the following is true:
t_1(T̂_n) = w (2a_{n−1} + a'_{n−1}/w)^{d}.
Proof. There are d(d − 1)^{m−2} possible tree-paths touching exactly m bags. Fixing such a path, we count, as in Lemma 2.3, the weighted number of ways to choose the part of the spanning tree in each Â_{n−i} hanging off the path so that the whole subgraph is a spanning tree. Multiplying these we get the number of spanning trees with this path.

Distribution of the distances in FSF_w(T_d □ K_2)
Let A be the infinite tree in which every vertex has degree d except one vertex, which has degree d − 1; call this special vertex o. Let e be the edge in the bag of o in Â. Define c := P(e ∈ FSF_w(A □ K_2)). The sequence Â_n is an exhaustion of Â, so by the definition of FSF_w we have c = lim_{n→∞} a'_n/a_n.
Let c_n = a'_n/a_n and s_n = a_{n−1}^{d−1}/a_n, and let s := lim_{n→∞} s_n. By Lemma 2.3, a'_n = w a_{n−1}^{d−1} (2 + c_{n−1}/w)^{d−1}, so c_n = w s_n (2 + c_{n−1}/w)^{d−1}; in the limit this is the identity c = w s (2 + c/w)^{d−1} of Lemma 2.7. By Lemma 2.4,
a_n = a'_n + (d − 1) a_{n−1} (2a_{n−1} + a'_{n−1}/w)^{d−2} = c_n a_n + (d − 1) a_n s_n (2 + c_{n−1}/w)^{d−2}.
Dividing this by a_n, taking n → ∞ and substituting the identity from Lemma 2.7, we get
1 = c + (d − 1)c/(2w + c).
After rearranging we get a quadratic equation for c:
c^2 + (2w + d − 2)c − 2w = 0.    (2.1)
The constant term is negative, so we have two real roots, a negative and a positive one, and c ≥ 0, so we get the following theorem.
Theorem 2.8. c = (−(2w + d − 2) + sqrt((2w + d − 2)^2 + 8w))/2.
Proof. From (2.1) with the quadratic formula.
Definition 2.9. Let q_m := lim_{n→∞} t_m(T̂_n)/t(T̂_n). This number has another meaning: it is the probability that, for a u ∈ T_d, the vertices (u, 0) and (u, 1) belong to the same component of FSF_w in T_d □ K_2 and their distance in the tree is 2m − 1; in other words, the path between them touches m bags.
Lemma 2.10. For m ≥ 2, q_m = d(d − 1)^{m−2} c^m (2w + c)^{2−m}/(2w + 2c).
Proof. We call two positive sequences (u_n), (v_n) equivalent (u_n ∼ v_n) if lim_{n→∞} u_n/v_n = 1.
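Both descriptions of c can be checked numerically. Dividing the recursions of Lemmas 2.3 and 2.4 by a_{n−1}^{d−1} gives the normalized recursion c_n = (2w + c_{n−1})/(2w + c_{n−1} + d − 1) with c_0 = 1, and iterating it converges to the positive root of (2.1). A sketch, with our own function names:

```python
import math

def c_closed_form(d, w):
    """Positive root of c^2 + (2w + d - 2)c - 2w = 0 (equation (2.1))."""
    b = 2 * w + d - 2
    return (-b + math.sqrt(b * b + 8 * w)) / 2

def c_by_iteration(d, w, steps=200):
    """Iterate c_n = (2w + c_{n-1}) / (2w + c_{n-1} + d - 1), the
    normalized form of the recursions for a_n and a'_n, from c_0 = 1."""
    c = 1.0
    for _ in range(steps):
        c = (2 * w + c) / (2 * w + c + d - 1)
    return c

# The iteration is a contraction, so both computations agree.
for d in (3, 4, 10):
    for w in (0.1, 1.0, 25.0):
        assert abs(c_closed_form(d, w) - c_by_iteration(d, w)) < 1e-10

print(round(c_closed_form(3, 1.0), 4))   # -> 0.5616
```

For d = 3 and w = 1, for instance, (2.1) reads c^2 + 3c − 2 = 0, giving c = (−3 + sqrt(17))/2 ≈ 0.5616.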
We are going to prove that t_m(T̂_n) and t(T̂_n) are each asymptotically a constant times a_n a_{n−1}, and use
lim_{n→∞} t_m(T̂_n)/t(T̂_n) = lim_{n→∞} (a_n a_{n−1}/t(T̂_n)) · lim_{n→∞} (t_m(T̂_n)/(a_n a_{n−1})).
Recall the following constants: lim_{n→∞} a'_n/a_n = c and lim_{n→∞} a_{n−1}^{d−1}/a_n = s.
Using these and Lemma 2.5,
lim_{n→∞} a_n a_{n−1}/t(T̂_n) = lim_{n→∞} a_n a_{n−1}/(2 a_n a_{n−1} + (1/w)(a'_n a_{n−1} + a_n a'_{n−1})) = lim_{n→∞} 1/(2 + (1/w)(a'_{n−1}/a_{n−1} + a'_n/a_n)) = 1/(2 + 2c/w).
From Lemma 2.6,
lim_{n→∞} t_m(T̂_n)/(a_n a_{n−1}) = lim_{n→∞} d(d − 1)^{m−2} w (2a_{n−1} + a'_{n−1}/w)(2a_{n−m} + a'_{n−m}/w) ∏_{i=1}^{m} (2a_{n−i} + a'_{n−i}/w)^{d−2} / (a_n a_{n−1}).
Here 2a_{n−i} + a'_{n−i}/w ∼ a_{n−i}(2 + c/w). Hence
t_m(T̂_n) ∼ d(d − 1)^{m−2} w (2 + c/w)^{(d−2)m+2} a_{n−1} a_{n−m} ∏_{i=1}^{m} a_{n−i}^{d−2}.
Using the fact that a_{n−i}^{d−1} ∼ s a_{n−i+1}, one can prove easily by induction that a_{n−1} a_{n−m} ∏_{i=1}^{m} a_{n−i}^{d−2} ∼ a_n a_{n−1} s^m. Thus, combining the two calculations,
lim_{n→∞} t_m(T̂_n)/(a_n a_{n−1}) = d(d − 1)^{m−2} w s^m (2 + c/w)^{(d−2)m+2},
which, together with the first limit above and the identity of Lemma 2.7, gives the claimed formula for q_m.
Lemma 2.11. q_1 = c(2w + c)/(2w + 2c).
Proof. From the m = 1 case of Lemma 2.6, we have
lim_{n→∞} t_1(T̂_n)/(a_n a_{n−1}) = lim_{n→∞} w (2a_{n−1} + a'_{n−1}/w)^d / (a_n a_{n−1}) = lim_{n→∞} w s_n (2 + c_{n−1}/w)^d = w s (2 + c/w)^d.
In the proof of Lemma 2.10, we calculated lim_{n→∞} a_n a_{n−1}/t(T̂_n) = 1/(2 + 2c/w).
Combining these and using Lemma 2.7, we have
q_1 = lim_{n→∞} (a_n a_{n−1}/t(T̂_n)) · lim_{n→∞} (t_1(T̂_n)/(a_n a_{n−1})) = w s (2 + c/w)^d/(2 + 2c/w) = c(2 + c/w)/(2 + 2c/w) = c(2w + c)/(2w + 2c).
From Lemma 2.10 and Lemma 2.11 we know that the distribution of the distance of two vertices in the same bag in the FSF_w is "almost" geometric: the sequence q_2, q_3, . . . is a geometric progression, but q_1 does not fit in.
Lemma 2.12. For every w > 0 and u ∈ T_d, the vertices (u, 0) and (u, 1) belong to the same component of FSF_w(T_d □ K_2) almost surely.
Proof. We want to prove that the path between (u, 0) and (u, 1) is almost surely finite, that is, Σ_{m=1}^{∞} q_m = 1. By Lemma 2.10, the terms q_2, q_3, . . . form a geometric progression with ratio (d − 1)c/(2w + c), which is smaller than 1 by (2.1), so
Σ_{m=2}^{∞} q_m = (d c^2/(2w + 2c)) · (2w + c)/(2w + c − (d − 1)c) = dc/(2w + 2c),
where we used that 2w − (d − 2)c = c(2w + c), which follows from (2.1). Combining this with Lemma 2.11, we have
Σ_{m=1}^{∞} q_m = (c(2w + c) + dc)/(2w + 2c) = (c^2 + (2w + d − 2)c + 2c)/(2w + 2c) = (2w + 2c)/(2w + 2c) = 1,
using (2.1) again in the last step.
Lemma 2.13. Let H be an arbitrary finite connected graph and consider the weighted graph G = T_d □ wH. If every pair a, b of vertices in the same bag belongs to the same component of FSF_w almost surely, then FSF_w is almost surely connected.
Proof. Take two adjacent bags, and call the set of edges between them E'. The event that at least one edge of E' is in FSF_w is a cylinder event, and for every graph G_n of an exhausting finite sequence for G this event holds with probability one for the UST, since E' is an edge cut. Hence in the FSF_w there is an edge from E' with probability one. Thus there are always two connected vertices in the two adjacent bags. By assumption, all vertices within a bag are in the same component, therefore all vertices in these two adjacent bags are in the same component. This is true for any two adjacent bags, thus for all edges of T_d.
Using a countable intersection, we conclude that the FSF_w of the graph is connected with probability one.
Now we have everything we need to prove the main result of this section.
Proof of Theorem 1.2. The statement follows from Lemma 2.12 and Lemma 2.13.
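As a numerical sanity check of Lemmas 2.10-2.12: with c taken from the quadratic (2.1), the q_m sum to 1, the terms q_2, q_3, . . . form a geometric progression, and q_1 falls out of it. A sketch, with our own function names and the formulas as stated above taken as assumptions:

```python
import math

def q_distribution(d, w, m_max=80):
    """q_1 = c(2w+c)/(2w+2c) and, for m >= 2,
    q_m = d (d-1)^{m-2} c^m (2w+c)^{2-m} / (2w+2c),
    where c is the positive root of c^2 + (2w+d-2)c - 2w = 0."""
    b = 2 * w + d - 2
    c = (-b + math.sqrt(b * b + 8 * w)) / 2
    q = [c * (2 * w + c) / (2 * w + 2 * c)]                 # q_1
    for m in range(2, m_max + 1):
        q.append(d * (d - 1) ** (m - 2) * c ** m
                 * (2 * w + c) ** (2 - m) / (2 * w + 2 * c))
    return q

for d in (3, 5):
    for w in (0.5, 1.0, 10.0):
        q = q_distribution(d, w)
        assert abs(sum(q) - 1.0) < 1e-6       # the path is a.s. finite
        r = q[2] / q[1]                        # common ratio of q_2, q_3, ...
        assert all(abs(q[i + 1] / q[i] - r) < 1e-9 for i in range(1, 40))
        # q_1 is the odd one out: q_2/q_1 = r * d/(d-1), not r.
        assert abs(q[1] / q[0] - r * d / (d - 1)) < 1e-9
print("q_m is a probability distribution")
```

The truncation at m_max = 80 is harmless here since the common ratio (d − 1)c/(2w + c) stays well below 1 for these parameters.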
Remark 2.14. It is a natural question to ask whether this method can be generalized to graphs other than K_2. Unfortunately, we strongly relied on the fact that in T̂_n a path between (u, 0) and (u, 1) looks quite simple, while if we change K_2 to some larger graph then plenty of other options arise which we cannot handle with this enumerative method.

The general case, large and small weights
In this section we are going to prove Theorem 1.3. The first part will follow from the next theorem.
Theorem 3.1. Given an arbitrary d > 2 and a finite, regular, connected graph H, there is a W > 0 such that the FSF_w of the graph G = T_d □ H is almost surely connected for all w > W.
As before, denote by T_n the ball of radius n in T_d. Let U be the central bag of T_n □ wH: the bag that corresponds to the center of this ball.

Definition 3.2.
A trip is a walk (X_1, X_2, . . . , X_T) such that X_1 ∈ U, X_T ∈ U, and X_i ∉ U whenever i ∈ {2, . . . , T − 1}.
Definition 3.3. Let X' = (X_k, X_{k+1}, . . . , X_{k'}) be a trip intersecting a bag D ≠ U, and let τ be the last time with X_τ ∈ D. We say that D is memorable for X' if there is no bag D' separating U and D such that every vertex of D' is visited by (X_{τ+1}, . . . , X_{k'}).
Proposition 3.4. Let X = (X_1, . . . , X_T) be a walk in T_n □ wH, with X_1 in the central bag U of T_n □ wH. Suppose that X' = (X_k, X_{k+1}, . . . , X_{k'}) is some subwalk which is a trip and intersects a bag D. Assume that D is not memorable for X'. Then the loop-erasure of (X_1, . . . , X_{k'}) does not intersect D.
Proof. By our assumptions there exists a largest number τ with k < τ < k' and X_τ ∈ D. Since D is not memorable, there exists a bag D' that separates U and D with the property that V(D') = V(D') ∩ {X_{τ+1}, . . . , X_{k'}}. Let t be the first time after τ that we enter D'. Such a t exists, because X_τ ∈ D and X_{k'} ∈ U. Let L be LE(X_1, . . . , X_t). If L ∩ D = ∅, then the claim is proved, because we do not visit D after τ. Otherwise the first time that L enters D' is strictly before t (since L has to enter D' before entering D, and we re-enter D' at X_t). Let this first vertex of entrance be v. By assumption on X', (X_t, . . . , X_{k'}) visits every vertex of D'. Let t' ≥ t be the first time that X_{t'} = v. Then the loop-erasure of (X_1, . . . , X_{t'}) erases everything that happened after the first entrance to D' at v. In particular, it erases every step in D before t', so LE(X_1, . . . , X_{t'}) ∩ D = ∅. Since k' ≥ t', and no step after τ is in D, the claim is proved.
Lemma 3.5. There exists a W > 0 such that for any α > 0 there is an m such that the following holds for every w > W. Let u ∈ U, n > m and X = (X_1, . . . , X_T) be a trip in T_n □ wH with X_1 = u. Then we have P(X has a memorable bag outside T_m □ wH) < α.
Proof. Choose ε := 1/(2(d − 1)). Let w > W, where we specify W at the end of this paragraph. If (Z_1, Z_2, . . .) is a random walk in T_d □ wH started from a bag B, then let λ be the first time when it exits B. If w is large enough, we have
P({Z_1, . . . , Z_{λ−1}} = V(B) | Z_1 = x, Z_λ = y) ≥ 1 − ε    (3.1)
for any starting vertex x = Z_1 and last vertex y = Z_λ, because the minimum over x and y of the probability on the left tends to 1 as w goes to infinity. Fix W so that the above inequality holds.
Fix a bag D; we will use the notation of Definition 3.3. Let t be the first time after τ that we enter D'. Let A_x be the event that the random walk started from a point x ∈ D' visits every vertex of D' before leaving D'. Denote by P_x the distribution of a random walk Y = (Y_1, . . . , Y_R) started from x = Y_1 and stopped at the first entrance to U. Let B_x be the event that Y does not visit D. Then
P((X_t, . . . , X_T) ∈ A_{X_t} | X_t = x) = P_x(A_x | B_x) = P_x(A_x) ≥ 1 − ε,    (3.2)
where the inequality is from (3.1), and the last equality follows from the fact that B_x is independent of A_x, because B_x only depends on the steps taken in the tree-coordinate and hence is independent of the steps in the H-coordinates between two tree-coordinate steps. (To see this, note that the random walk path (Y_1, . . . , Y_R) governed by P_x could be generated by first generating a suitable random walk path T in T_d, and then adding a suitably chosen random number of random H-steps in between every consecutive pair of steps of T, independently from each other and from T.) Since x was arbitrary, from (3.2) we obtain
P((X_t, . . . , X_T) ∈ A_{X_t}) ≥ 1 − ε.    (3.3)
Let U = D_1, D_2, . . . , D_ℓ = D be the ray of bags between U and D. Denote by t_i the first time that X enters D_i after τ, and let r_i be the first time exiting D_i after t_i. Finally, let A_i be the event that (X_{t_i}, . . . , X_T) visits every vertex of D_i before leaving D_i; in other words, {X_{t_i}, . . . , X_{r_i}} = V(D_i). Note that, conditionally on {X_{t_i}} and {X_{r_i}}, the events {A_j} are independent, hence from the uniform lower bound (3.3) we have
P(A_i fails for every i ∈ {2, . . . , ℓ − 1}) ≤ ε^{ℓ−2}.
Using the law of total probability, we have just shown that D is memorable for X with probability less than ε^{ℓ−2}. There are d(d − 1)^{s−1} vertices of T_n with distance s from the root for all 1 ≤ s ≤ n, and a bag at distance s has ℓ = s + 1, so
P(X has a memorable bag outside T_m □ wH) ≤ Σ_{s=m+1}^{n} d(d − 1)^{s−1} ε^{s−1}.
By definition ε(d − 1) = 1/2 < 1, so the number on the right-hand side is at most d Σ_{s>m} 2^{−(s−1)} = d 2^{1−m}, which does not depend on n; thus we can choose m big enough so that it is less than α.
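In the special case H = K_2 the bag-covering probability behind (3.1) is explicit (ignoring the conditioning on the exit vertex, which we omit in this sketch): a two-vertex bag is covered before being left exactly when the first step uses the bag edge. A minimal illustration with our own function name:

```python
def cover_prob_K2(d, w):
    """For H = K_2, a walk started at a vertex of a bag B in T_d x wK_2
    covers B before leaving it exactly when its first step uses the bag
    edge: that edge has weight w out of total incident weight w + d."""
    return w / (w + d)

d = 3
eps = 1 / (2 * (d - 1))     # the epsilon chosen in the proof of Lemma 3.5
# The covering probability is monotone in w and exceeds 1 - eps as soon
# as w >= d(1 - eps)/eps (= 9 for d = 3).
threshold = d * (1 - eps) / eps
for w in (1, 10, 100, 1000):
    print(w, round(cover_prob_K2(d, w), 4))
assert cover_prob_K2(d, threshold) >= 1 - eps
```

For general H the covering event involves more than one step, which is where the full strength of large w is used; the K_2 computation only illustrates why the probability in (3.1) tends to 1.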
Proof of Theorem 3.1. Let W be as in Lemma 3.5. Let α_0 > 0 be arbitrary. We want to prove that if w > W then for all vertices a, b of G there exists an m with
P(LERW_{T_n □ wH}(a → b) leaves T_m □ wH) < α_0 for every n > m.
This is equivalent to the connectedness of the FSF_w in T_d □ H, because by Wilson's algorithm LERW_{T_n □ wH}(a → b) has the same distribution as the path between a and b in UST(T_n □ wH). By Lemma 2.13, one may assume that a and b are in the same bag, U. Define h as the minimum over all pairs x ≠ y ∈ U of the probability that the random walk in T_n □ wH started from x hits y before leaving U. Let k be a positive integer, chosen to satisfy P(Geom(h) ≥ k) < α_0/2, where Geom(h) denotes a geometric random variable of parameter h.
Choose m as in Lemma 3.5, with α := α_0/(2k). Denote by X a random walk started from a in T_n □ wH and stopped when first hitting b. One can construct X as follows. Start the random walk from a. If we hit b before leaving U, we are finished; otherwise let X_{s_1} be the last step of this walk in U before first leaving U. Then consider the trip (X_{s_1}, . . . , X_{t_1}). From the last vertex X_{t_1} ∈ U of this trip, continue the random walk until either hitting b or exiting U. The probability of the former is at least h; otherwise let X_{s_2} be the last vertex in U before leaving U, and starting from this vertex generate the trip (X_{s_2}, . . . , X_{t_2}). Continue similarly, until at some point we hit b, and at that point the construction of X is finished. We see that after the end of every trip we had probability at least h to hit b, hence the total number of trips needed is stochastically dominated by a geometric random variable of parameter h. Let J be such a random variable. If LE(X) intersects a bag D, then D is memorable for one of the sub-trips of X by Proposition 3.4. The probability that a trip has a memorable bag outside of T_m □ wH is less than α by Lemma 3.5. A union bound gives us
P(LE(X) leaves T_m □ wH) ≤ P(J > k) + kα ≤ α_0,
completing the proof.
Proof of Theorem 1.3. The first part of the theorem is essentially Theorem 3.1.
For the second part, the case of small w, we have the same conditions on the graph as in the unweighted case of Theorem 1.1 in [4], and one could repeat the arguments therein, with minor modifications which we sketch next. The result of Section 2 of [4] about random walk on the tree is obviously unchanged, while Lemma 3.1 also remains valid, with a different constant b, for the following reason. Replace 2k^6 in (3.2) by 2k^6/w. Then the entire paragraph containing (3.2) remains valid if we change every occurrence of d to wd, and that of 2k^6 to 2k^6/w. The rest of the proof of Lemma 3.1 in [4] goes through without any change. The "second ingredient", as explained after the proof of Lemma 3.1, is based on the fact that the random walk does not spend much time in a bag. The