Random laminations and multitype branching processes

We consider multitype branching processes arising in the study of random laminations of the disk. We classify these processes according to their subcritical or supercritical behavior and provide Kolmogorov-type estimates in the critical case, which corresponds to the random recursive lamination process of [1]. The proofs use infinite-dimensional Perron-Frobenius theory and quasi-stationary distributions.


Introduction
In this note we are interested in multitype branching processes that arise in the study of random recursive laminations. In order to introduce and motivate our results, let us briefly recall the basic construction of [1]. Consider a sequence U_1, V_1, U_2, V_2, ... of independent random variables, uniformly distributed over the unit circle S^1. We then construct inductively a sequence L_1, L_2, ... of random closed subsets of the closed unit disk D. To start with, L_1 is the (Euclidean) chord [U_1 V_1] with endpoints U_1 and V_1. Then at step n+1 we consider two cases. Either the chord [U_{n+1} V_{n+1}] intersects L_n, and we put L_{n+1} = L_n; or the chord [U_{n+1} V_{n+1}] does not intersect L_n, and we put L_{n+1} = L_n ∪ [U_{n+1} V_{n+1}]. Thus, for every integer n ≥ 1, L_n is a disjoint union of random chords. See Fig. 1. A fragment of L_n is a connected component of D \ L_n. These fragments have a natural genealogy, which we now describe. The first fragment, D, is represented by ∅. Then the first chord [U_1 V_1] splits D into two fragments, which are viewed as the offspring of ∅. We order these fragments in a random way: with probability 1/2, the first child of ∅, represented by 0, corresponds to the larger fragment and the second child, represented by 1, corresponds to the other fragment; with probability 1/2 we do the contrary. We then iterate this device (see Fig. 2) so that each fragment appearing during the splitting process is labeled by an element of the infinite binary tree T_2 = ∪_{n≥0} {0, 1}^n, where {0, 1}^0 = {∅}.
If F is a fragment, we call an end of F any connected component of F ∩ S^1. For convenience, the full disk D is viewed as a fragment with 0 ends. Consequently, we can associate to any u ∈ T_2 a label ℓ(u) that corresponds to the number of ends of the corresponding fragment in the above process. Lemma 5.5 of [1] then entails that this random labeling of T_2 is described by the following branching mechanism: for any u ∈ T_2 labeled m ≥ 0, choose m_1 ∈ {0, 1, ..., m} uniformly at random and assign the values 1 + m_1 and 1 + m − m_1 to the two children of u. This is the multitype branching process we will be interested in. See Fig. 2.
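The branching mechanism above is straightforward to simulate. The following Python sketch (function names are ours) draws the labels of the binary tree down to a given depth; by construction, the labels of two siblings always sum to their parent's label plus 2.

```python
import random

def children(m, rng):
    """Offspring labels of a fragment with m ends: pick m1 uniformly in
    {0, ..., m} and give the two children 1 + m1 and 1 + (m - m1) ends."""
    m1 = rng.randint(0, m)
    return 1 + m1, 1 + (m - m1)

def sample_labels(depth, root_label, rng):
    """Labels of the binary tree T_2 up to `depth`, as a dict mapping
    words over {0, 1} (stored as tuples) to labels."""
    labels = {(): root_label}
    frontier = [()]
    for _ in range(depth):
        nxt = []
        for u in frontier:
            l0, l1 = children(labels[u], rng)
            labels[u + (0,)] = l0
            labels[u + (1,)] = l1
            nxt += [u + (0,), u + (1,)]
        frontier = nxt
    return labels

rng = random.Random(0)
labels = sample_labels(6, 4, rng)
```

The dictionary keys are exactly the vertices of the truncated tree, so properties of the mechanism (all labels ≥ 1, sibling labels summing to the parent's label plus 2) can be read off directly.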
We can also define a random labeling by using the above branching mechanism but starting with a value a ≥ 0 at the root ∅ of T_2; the probability distribution of this process will be denoted by P_a and the corresponding expectation by E_a. A ray is an infinite geodesic path u = (u_1, u_2, ...) ∈ {0, 1}^N starting from the root ∅ in T_2. For any ray u = (u_1, ..., u_n, ...) or any word of finite length u = (u_1, ..., u_n), we denote by [u]_i the word (u_1, ..., u_i) for 1 ≤ i ≤ n, and [u]_0 = ∅.
Theorem ([1, Lemma 5.5]). Almost surely, there exists no ray u along which all the labels, starting from 4 at the root, are greater than or equal to 4. The starting label 4 does not play any special role here and can be replaced by any larger value. This theorem was proved and used in [1] to study certain properties of the random closed subset L_∞ = ∪ L_n, and in particular to prove that L_∞ is almost surely a maximal lamination (roughly speaking, that the complement of L_∞ is made of disjoint triangles); see [1, Proposition 5.4]. One of the purposes of this note is to provide quantitative estimates related to this theorem. Specifically, let G_n be the set of paths in T_2 joining the root to level n along which the labels are greater than or equal to 4.
Theorem 1.1. The expected number of paths starting from the root and reaching level n along which the labels, starting from 4, are greater than or equal to 4 satisfies

E_4[#G_n] → 4/(e^2 − 1) as n → ∞. (1)

Furthermore, there exist two constants 0 < c_1 < c_2 < ∞ such that the probability that G_n ≠ ∅ satisfies

c_1/n ≤ P_4(G_n ≠ ∅) ≤ c_2/n. (2)

Remark 1.2. These estimates are reminiscent of the critical case for Galton-Watson processes with finite variance σ^2 < ∞. Indeed, if H_n denotes the number of vertices at height n in such a critical process, then E[H_n] = 1, and Kolmogorov's estimate [2] implies that P(H_n > 0) ~ 2/(σ^2 n) as n → ∞. The proof of Theorem 1.1 relies on identifying the quasi-stationary distribution of the labels along a fixed ray conditioned to stay greater than or equal to 4. This is done in Section 2. In Section 3, we also study analogues of this branching random walk on the k-ary tree, for k ≥ 3, coming from a natural generalization of the process (L_n)_{n≥0} where we replace chords by triangles, squares, ...; see Fig. 3. We prove that in these cases there is no critical value playing the role of 4 in the binary case.
Consider the process (X_n)_{n≥0} of the labels along a fixed ray of T_2, starting from x_0 ≥ 4 at the root. Then (X_n)_{n≥0} is a homogeneous Markov chain with transition probabilities given by

P(x, y) = 1/(x + 1), for 1 ≤ y ≤ x + 1.

We first recall some results derived in [1]. If (F_n) is the canonical filtration of (X_n)_{n≥0}, then a straightforward calculation shows that

M_n = 2^n (X_n − 2), n ≥ 0,

is a martingale starting from x_0 − 2. For i ≥ 1, we let T_i be the stopping time T_i = inf{n ≥ 0 : X_n = i}, and T = T_1 ∧ T_2 ∧ T_3. By the optional stopping theorem applied to the martingale (M_n)_{n≥0}, we obtain for every n ≥ 0,

E_{x_0}[2^{n∧T} (X_{n∧T} − 2)] = x_0 − 2.

One can easily check from the transition kernel of the Markov chain (X_n)_{n≥0} that for every i ≥ 1, P(i, 1) = P(i, 3), so that the contributions of the events {T = T_1} and {T = T_3} to the left-hand side cancel out, while the event {T = T_2} contributes 0. Hence, the equality in the last display becomes

2^n E_{x_0}[(X_n − 2) 1_{T > n}] = x_0 − 2,

or equivalently

E_{x_0}[X_n − 2 | T > n] P_{x_0}(T > n) = (x_0 − 2) 2^{−n}. (3)

Our strategy here is to compute the stationary distribution of X_n conditionally on the non-extinction event {T > n}, in order to prove the convergence of E_4[X_n | T > n] and finally to get asymptotics for P_4(T > n). Before any calculation, we make a couple of simple remarks. Obviously X_n − 2 ≥ 2 on the event {T > n}, and thus we get 2^n P_{x_0}(T > n) ≤ (x_0 − 2)/2. Since there are exactly 2^n paths joining the root ∅ of T_2 to level n, we deduce that the number #G_n of paths joining ∅ to level n along which the labels are greater than or equal to 4 satisfies

E_{x_0}[#G_n] = 2^n P_{x_0}(T > n) ≤ (x_0 − 2)/2. (4)

Notice that a simple argument shows that if 4 ≤ x_0 ≤ x_1, then the chain X_n starting from x_0 and the chain X'_n starting from x_1 can be coupled in such a way that X_n ≤ X'_n for all n ≥ 0.
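The martingale identity above can be checked exactly, since at time n the surviving chain lives on {4, ..., 4 + n}: the following sketch (our verification code, not from [1]) propagates the substochastic law of X_n with rational arithmetic and reads off both E_4[(X_n − 2) 1_{T > n}] and the expected number of good paths 2^n P_4(T > n).

```python
import math
from fractions import Fraction

def surviving_distribution(n, x0=4):
    """Exact law of X_n on the event {T > n}: from state x the chain jumps
    to y uniform on {1, ..., x+1}; mass reaching 1, 2 or 3 is killed."""
    p = {x0: Fraction(1)}
    for _ in range(n):
        q = {}
        for x, mass in p.items():
            step = mass / (x + 1)
            for y in range(4, x + 2):      # only surviving states are kept
                q[y] = q.get(y, Fraction(0)) + step
        p = q
    return p

n = 20
p = surviving_distribution(n)
# Martingale identity: E_4[(X_n - 2) 1_{T > n}] = (4 - 2) * 2^{-n}, exactly.
lhs = sum((x - 2) * mass for x, mass in p.items())
# Expected number of good paths E_4[#G_n] = 2^n P_4(T > n); it should be
# close to the limit 4/(e^2 - 1) ≈ 0.626 identified in Section 2.
expected_good = 2 ** n * sum(p.values())
```

Because the propagation is exact, the martingale identity holds to the last digit, with no truncation needed.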

The quasi-stationary distribution
We consider the substochastic matrix of the Markov chain X_n killed when it reaches 1, 2 or 3: this is the matrix (P̃_2(x, y))_{x,y≥4} given by

P̃_2(x, y) = 1/(x + 1), for 4 ≤ y ≤ x + 1.

We will show that P̃_2 is a 2-recurrent positive matrix, in the sense of [3, Lemma 1]. For that purpose we seek left and right non-negative eigenvectors of P̃_2 for the eigenvalue 1/2. In other words, we look for two sequences (g(x))_{x≥4} and (f(x))_{x≥4} of non-negative real numbers such that f(4) = g(4) = 1 (normalization) and, for every x ≥ 4,

Σ_{y≥4} g(y) P̃_2(y, x) = (1/2) g(x),   Σ_{y≥4} P̃_2(x, y) f(y) = (1/2) f(x). (5)

We start with the left eigenvector g. From (5), we get g(5) = g(4) and, for every y ≥ 5, g(y + 1) = g(y) − 2 g(y − 1)/y. In terms of the generating function G(z) = Σ_{x≥4} g(x) z^{x+1}/(x + 1), these observations lead to the differential equation

(1 − z) G'(z) + 2z G(z) = z^4,

with the condition G(z) = z^5/5 + o(z^5) as z → 0. A simple computation yields G(z) = (3/4) e^{2z} (z − 1)^2 + (z^3/2 + 3z^2/4 − 3/4). After normalization, the generating function G_{1/2}(z) = Σ_{i≥4} g_{1/2}(i) z^i of the unique probability distribution g_{1/2} which is a left eigenvector for the eigenvalue 1/2 is given by

G_{1/2}(z) = G'(z)/G'(1) = (z/2)((z − 1) e^{2z} + z + 1). (6)

This left eigenvector is called the quasi-stationary distribution of X_n conditioned on non-extinction. For the right eigenvector f, a similar approach using generating functions is possible, but it is also easy to check by induction that f(x) = (x − 2)/2 for every x ≥ 4. Hence the condition (iii) of Lemma 1 in [3] is fulfilled and the substochastic matrix P̃_2 is 2-recurrent positive. For every x ≥ 4, set

q_n(x) = (v P̃_2^n)(x) / Σ_{y≥4} (v P̃_2^n)(y) = P_4(X_n = x | T > n),

where v stands for the "vector" (v_i)_{i≥4} with v_4 = 1 and v_i = 0 if i ≥ 5. Theorem 3.1 of [3] then implies that

q_n(x) → g_{1/2}(x) as n → ∞, for every x ≥ 4. (7)

Unfortunately this convergence does not immediately imply the convergence of E_4[X_n | T > n] towards E[X], where X is distributed according to g_{1/2}. But this will follow from the next proposition.
Proposition 2.1. For every n ≥ 0, the ratio x ↦ q_n(x)/g_{1/2}(x) is non-increasing over {4, 5, 6, ...}.

Proof. By induction on n ≥ 0. For n = 0 the statement is true. Suppose it holds for n ≥ 0. By the definition of q_{n+1}, for x ≥ 4 we have

q_{n+1}(x) = (Σ_{z ≥ (x−1)∨4} q_n(z)/(z + 1)) / P_4(T > n + 1 | T > n). (8)

We need to verify that, for every x ≥ 4, we have q_{n+1}(x) g_{1/2}(x + 1) ≥ q_{n+1}(x + 1) g_{1/2}(x), or equivalently, using (8) and (5) with g = g_{1/2}, that

(Σ_{z ≥ (x−1)∨4} q_n(z)/(z + 1)) (Σ_{z ≥ x} g_{1/2}(z)/(z + 1)) ≥ (Σ_{z ≥ x} q_n(z)/(z + 1)) (Σ_{z ≥ (x−1)∨4} g_{1/2}(z)/(z + 1)).

For x = 4 this inequality holds, since both sides are then equal. Otherwise, if x > 4, we have to prove that

q_n(x − 1) Σ_{z ≥ x} g_{1/2}(z)/(z + 1) ≥ g_{1/2}(x − 1) Σ_{z ≥ x} q_n(z)/(z + 1). (9)

Set A_x = q_n(x − 1)/g_{1/2}(x − 1) to simplify notation. The induction hypothesis guarantees that q_n(z) ≤ A_x g_{1/2}(z) for every z ≥ x, and therefore Σ_{z ≥ x} q_n(z)/(z + 1) ≤ A_x Σ_{z ≥ x} g_{1/2}(z)/(z + 1). This gives the bound (9) and completes the proof of the proposition.
By Proposition 2.1 we have, for every x ≥ 4, q_n(x) ≤ C g_{1/2}(x), where C = sup_{n≥0} q_n(4)/g_{1/2}(4) < ∞ by (7). This allows us to apply dominated convergence to get

E_4[X_n | T > n] = Σ_{x≥4} x q_n(x) → Σ_{x≥4} x g_{1/2}(x) = (e^2 + 3)/2 as n → ∞. (10)

Using (3) we then conclude that

E_4[#G_n] = 2^n P_4(T > n) = 2/E_4[X_n − 2 | T > n] → 4/(e^2 − 1) as n → ∞.
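The quasi-stationary distribution can be checked numerically. In the sketch below (our own verification code; the closed form g_{1/2}(x) = (1/2)(2^{x−2}/(x−2)! − 2^{x−1}/(x−1)!), read off from our reconstruction of G_{1/2}, is taken as an assumption), we verify that these weights form a probability distribution, satisfy the left-eigenvector equation for the eigenvalue 1/2, and have mean (e^2 + 3)/2.

```python
import math

def g_half(x):
    """Coefficient of z^x in (z/2)((z - 1)e^{2z} + z + 1), i.e.
    (1/2)(2^(x-2)/(x-2)! - 2^(x-1)/(x-1)!), for x >= 4 (our closed form)."""
    return 0.5 * (2 ** (x - 2) / math.factorial(x - 2)
                  - 2 ** (x - 1) / math.factorial(x - 1))

N = 120  # truncation level; g_half decays like 2^x / x!, so the tail is negligible
g = {x: g_half(x) for x in range(4, N + 1)}

# Left-eigenvector equation: sum_x g(x) P2hat(x, y) = (1/2) g(y),
# where P2hat(x, y) = 1/(x + 1) for 4 <= y <= x + 1.
def left_image(y):
    return sum(g[x] / (x + 1) for x in range(max(y - 1, 4), N + 1))

residual = max(abs(left_image(y) - 0.5 * g[y]) for y in range(4, 60))
total = sum(g.values())                      # should be 1
mean = sum(x * gx for x, gx in g.items())    # should be (e^2 + 3)/2
```

The residual of the eigenvector equation is at machine-precision level, which is consistent with the generating-function computation above.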

Proof of Theorem 1.1
We first introduce some notation. We denote by T_2^(n) the tree T_2 truncated at level n. For every u = (u_1, ..., u_n) ∈ {0, 1}^n and every j ∈ {0, 1, ..., n}, recall that [u]_j = (u_1, ..., u_j), and if j ≥ 1, also set [u]*_j = (u_1, ..., u_{j−1}, 1 − u_j). We say that j ∈ {0, 1, ..., n−1} is a left turn (resp. right turn) of u if u_{j+1} = 0 (resp. u_{j+1} = 1). A down step of u is a time j ∈ {0, 1, ..., n−1} such that ℓ([u]_{j+1}) ≤ ℓ([u]_j) − 2. Note that if j is a down step of u, then ℓ([u]*_{j+1}) = 2 + ℓ([u]_j) − ℓ([u]_{j+1}) ≥ 4. The set of all j ∈ {0, 1, ..., n−1} that are left turns, resp. right turns, resp. down steps, of u is denoted by L(u), resp. R(u), resp. D(u). We endow T_2 with the lexicographical order ≼, and say that a path u ∈ {0, 1}^n is on the left (resp. right) of v ∈ {0, 1}^n if u ≼ v (resp. v ≼ u). A vertex of {0, 1}^n will be identified with the path it defines in T_2^(n). If u, v ∈ T_2, we let u ∧ v be the last common ancestor of u and v.
Proof of Theorem 1.1. Lower bound. We use a second moment method. Recall that G_n is the set of all paths in T_2^(n) from the root to level n along which the labels are greater than or equal to 4. A path in G_n is called "good". Using (10), we can compute the expected number of good paths and get

E_4[#G_n] = 2^n P_4(T > n) → 4/(e^2 − 1) as n → ∞,

which proves the convergence (1) in the theorem. For u ∈ G_n and j ∈ {0, 1, ..., n}, we let Right(u, j) be the set of all good paths to the right of u that diverge from u at level j. In particular, if j is a right turn for u, that is, u_{j+1} = 1, then Right(u, j) = ∅. Furthermore Right(u, n) = {u}. Let us fix a path u ∈ {0, 1}^n, and condition on u ∈ G_n and on the labels along u. Let j ∈ {0, 1, ..., n}. Note that the first vertex of a path in Right(u, j) that is not an ancestor of u is [u]*_{j+1}, and its label is 2 + ℓ([u]_j) − ℓ([u]_{j+1}); so if we want Right(u, j) to be non-empty, the time j must be a down step of u. If j is a left turn and a down step for u, the subtree {w ∈ T_2^(n) : [w]_{j+1} = [u]*_{j+1}} is a complete binary tree of height n − j − 1 whose labeling starts at ℓ([u]*_{j+1}). Hence, thanks to (4), we get

E[#Right(u, j) | u ∈ G_n, ℓ([u]_0), ..., ℓ([u]_n)] ≤ (ℓ([u]*_{j+1}) − 2)/2.

Since the labels along the ancestral line of u cannot increase by more than one at each step, the total decrease Σ_{j∈D(u)} (ℓ([u]_j) − ℓ([u]_{j+1})) = Σ_{j∈D(u)} (ℓ([u]*_{j+1}) − 2) is bounded by the total increase, hence by n. Combining these inequalities, we obtain

E[Σ_{j=0}^{n} #Right(u, j) | u ∈ G_n, ℓ([u]_0), ..., ℓ([u]_n)] ≤ n/2 + 1.

We can now bound E_4[#G_n^2] from above:

E_4[#G_n^2] ≤ 2 Σ_{u ∈ {0,1}^n} E_4[1_{u ∈ G_n} Σ_{j=0}^{n} #Right(u, j)] ≤ (n + 2) E_4[#G_n]. (11)

The lower bound of Theorem 1.1 directly follows from the second moment method: using (1) and (11) we get the existence of c_1 > 0 such that

P_4(G_n ≠ ∅) ≥ E_4[#G_n]^2 / E_4[#G_n^2] ≥ c_1/n.

Upper bound. We will first provide estimates on the number of down steps of a fixed path u ∈ {0, 1}^n. Recall that L(u), R(u) and D(u) respectively denote the sets of left turns, right turns, and down steps of u.

Lemma 2.2.
There exists a constant c_3 > 0 such that, for every n ≥ 0 and every u_0 ∈ {0, 1}^n,

P_4(u_0 ∈ G_n and #D(u_0) ≤ c_3 n) ≤ c_3^{−1} 2^{−n} exp(−c_3 n). (12)

Proof. We use the notation of Section 2. For any set A ⊂ {0, 1, ..., n−1} and m ∈ {0, 1, ..., n − #A}, with the notation N_n^A = #{j ∈ {0, 1, ..., n−1} \ A : X_j = 5}, formula (27) of [1] provides an exponential upper bound on the probability that u_0 ∈ G_n while D(u_0) ⊂ A and N_n^A ≥ m. We first obtain crude estimates for N_n^A. Note that N_n^A ≤ N_n^∅ and that sup_{i≥1} P_2(i, 5) = 1/5, so that for any B ⊂ {0, 1, ..., n} we have P_4(X_j = 5 for every j ∈ B) ≤ 5^{−#B}. By summing this bound over all choices of B with #B = m, we get P_4(N_n^∅ ≥ m) ≤ 2^n 5^{−m} for every m ∈ {0, 1, ..., n}. Let κ_1 ∈ (0, 1/2) and κ_2 ∈ (0, 1) be such that κ_1 + κ_2 < 1. Splitting according to whether N_n^∅ exceeds κ_2 n or not, and summing the bound of [1, formula (27)] over the sets A with #A ≤ κ_1 n, we obtain an estimate (13) for the left-hand side of (12) with c_3 replaced by κ_1. Notice that for every A > 1 we can choose κ_1 > 0 small enough so that binom(n, ⌈κ_1 n⌉) ≤ A^n for n large enough. By choosing κ_1 and κ_2 even smaller if necessary, we can ensure that the right-hand side of (13) is bounded by c_3^{−1} 2^{−n} exp(−c_3 n) for some c_3 > 0.

We use the last lemma to deduce that

P_4(∃ u ∈ G_n : #D(u) ≤ c_3 n) ≤ 2^n · c_3^{−1} 2^{−n} exp(−c_3 n) = c_3^{−1} exp(−c_3 n). (14)

We now argue on the event E_L = {∃ u ∈ G_n : #(D(u) ∩ L(u)) ≥ c_3 n/2}. On this event there exists a path u ∈ G_n with at least c_3 n/2 down steps which are also left turns. Conditionally on this event we consider the left-most path P of G_n satisfying these properties. A moment's thought shows that, conditionally on P and on the values of the labels along the ancestral line of P, the subtrees hanging on the right-hand side of P, that is, the offspring of the points [P]*_{j+1} for j ∈ L(P), are independent and distributed as labeled trees started at ℓ([P]*_{j+1}). Hence, conditionally on P and on the labels ℓ([P]_i), 0 ≤ i ≤ n, for any j ∈ L(P) ∩ D(P) the expected number of paths belonging to the set Right(P, j) (defined in the proof of the lower bound) is at least κ_3, where κ_3 is a positive constant independent of n whose existence follows from (10). Thus we have E_4[#G_n] ≥ κ_3 (c_3 n/2) P_4(E_L), and we can use (1) to obtain P_4(E_L) ≤ κ_4/n for some constant κ_4 > 0.
By a symmetry argument, the same bound holds for the event E_R = {∃ u ∈ G_n : #(D(u) ∩ R(u)) ≥ c_3 n/2}. Since {G_n ≠ ∅} is contained in the union of the events E_R, E_L and {∃ u ∈ G_n : #D(u) ≤ c_3 n}, we easily deduce the upper bound of the theorem from the previous considerations and (14).
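The c_1/n ≤ P_4(G_n ≠ ∅) ≤ c_2/n behavior can be observed in simulation. The following Monte Carlo sketch (ours; the sample sizes and levels are arbitrary choices) prunes every vertex whose label drops below 4, so each trial takes bounded expected work per level.

```python
import random

def some_good_path(n, rng, root=4):
    """True iff some path from the root to level n keeps every label >= 4.
    Vertices whose label drops below 4 are pruned immediately; the expected
    number of surviving vertices per level stays bounded."""
    frontier = [root]
    for _ in range(n):
        nxt = []
        for m in frontier:
            m1 = rng.randint(0, m)
            for lab in (1 + m1, 1 + m - m1):
                if lab >= 4:
                    nxt.append(lab)
        if not nxt:
            return False
        frontier = nxt
    return True

rng = random.Random(1)
trials = 20000
p10 = sum(some_good_path(10, rng) for _ in range(trials)) / trials
p40 = sum(some_good_path(40, rng) for _ in range(trials)) / trials
```

The products 10·p10 and 40·p40 should be of the same order, in line with the 1/n decay of the non-extinction probability.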

Extensions
Fix k ≥ 2. We can extend the recursive construction presented in the introduction by throwing polygons instead of chords: this will yield an analogue of the multitype branching process on the full k-ary tree. Formally, if x_1, ..., x_k are k (distinct) points of S^1, we denote by Pol(x_1, ..., x_k) the convex closure of {x_1, ..., x_k} in D. Let (U_{i,j} : 1 ≤ j ≤ k, i ≥ 1) be independent random variables that are uniformly distributed over S^1. We construct inductively a sequence L^k_1, L^k_2, ... of random closed subsets of the closed unit disk D. To start with, L^k_1 = Pol(U_{1,1}, ..., U_{1,k}). Then at step n + 1 we consider two cases. Either the polygon P_{n+1} := Pol(U_{n+1,1}, ..., U_{n+1,k}) intersects L^k_n, and we put L^k_{n+1} = L^k_n; or the polygon P_{n+1} does not intersect L^k_n, and we put L^k_{n+1} = L^k_n ∪ P_{n+1}. Thus, for every integer n ≥ 1, L^k_n is a disjoint union of random k-gons. In a way very similar to what we did in the introduction, we can identify the genealogy of the fragments appearing during this process with the complete k-ary tree T_k. Then the number of ends of the fragments created during this process gives a labeling ℓ_k of T_k whose distribution can be described inductively by the following branching mechanism (this is an easy extension of [1, Lemma 5.5]): for u ∈ T_k with label m ≥ 0, we choose a decomposition m = m_1 + m_2 + ... + m_k with m_1, m_2, ..., m_k ∈ {0, 1, ..., m}, uniformly at random among all binom(m+k−1, k−1) possible choices, and we assign the labels m_1 + 1, m_2 + 1, ..., m_k + 1 to the k children of u. Again, the distribution of the labeling ℓ_k of T_k obtained if we use the above branching mechanism but started from a ≥ 0 at the root will be denoted by P_a and its expectation by E_a. We use the same notation as in the binary case and are interested in a similar question: for which a ≥ 0 does there exist, with positive probability, a ray u such that ℓ_k([u]_i) ≥ a for every i ≥ 0?
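The k-ary branching mechanism can be simulated with a standard stars-and-bars draw; here is a minimal sketch (function name ours):

```python
import random

def split_label(m, k, rng):
    """Uniform decomposition m = m_1 + ... + m_k over all C(m+k-1, k-1)
    possibilities, by placing k - 1 bars among m + k - 1 slots
    (stars and bars).  Returns the children labels m_i + 1."""
    bars = sorted(rng.sample(range(m + k - 1), k - 1))
    parts, prev = [], -1
    for b in bars:
        parts.append(b - prev - 1)
        prev = b
    parts.append(m + k - 2 - prev)
    return [p + 1 for p in parts]

rng = random.Random(0)
labels = split_label(7, 3, rng)   # the three children of a fragment with 7 ends
```

Since each set of bar positions corresponds to exactly one decomposition, the draw is uniform over the binom(m+k−1, k−1) choices, and the children labels always sum to m + k.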
Specifically, the value a is called subcritical for the process (ℓ_k(u), u ∈ T_k) when there exists a constant c > 0 such that

P_a(there exists a path from ∅ to level n along which all labels are ≥ a) ≤ c^{−1} e^{−cn}.

It is called supercritical when there exists a constant c > 0 such that we have both

E_a[#{paths from ∅ to level n along which all labels are ≥ a}] ≥ c e^{cn}

and, with positive probability under P_a, there exists a ray along which all labels are greater than or equal to a. Note that a deterministic argument shows that if k ≥ 2 and a = 2, there always exists a ray with labels greater than or equal to 2; likewise, when k = 2 and a = 3, there always exists a ray with labels greater than or equal to 3. The case k = 2 and a = 4 has been treated in our main theorem. We have the following classification of all remaining cases:

Theorem 3.1. We have the following properties for the process ℓ_k:
• for k = 2 and a ≥ 5 the process is subcritical,
• for k = 3 the process is subcritical for a ≥ 4, and supercritical for a = 3,
• for k ≥ 4 and a ≥ 3 the process is subcritical.
Proof. Supercritical case k = 3 and a = 3. We will prove that for k = 3 and a = 3 the process is supercritical. Similarly as in Section 2, we consider the tree-indexed process ℓ_3 on a fixed ray of T_3, say (0, 0, 0, ...). Then the process Y_n given by the n-th value of ℓ_3 started from 3 along this ray is a homogeneous Markov chain with transition probabilities given by

P_3(m, j + 1) = 2(m − j + 1)/((m + 1)(m + 2)), for 0 ≤ j ≤ m.

We introduce the stopping times T_i = inf{n ≥ 0 : Y_n = i} for i = 1, 2, and set T = T_1 ∧ T_2. We consider a modification of the process Y_n, denoted by Ỹ_n, which has the same transition probabilities as Y_n on {1, 2, 3, 4}, except that the transition from 4 to 5 for Y_n is replaced by a transition from 4 to 4 for Ỹ_n. Thus Ỹ_n ≤ 4, and an easy coupling argument shows that we can construct Y_n and Ỹ_n simultaneously in such a way that Ỹ_n ≤ Y_n for all n ≥ 0. Hence we have the stochastic inequality P_3(T > n) ≥ P_3(T̃ > n), where T̃ is the analogous hitting time for Ỹ_n. Killing Ỹ_n when it hits 1 or 2 leaves the substochastic matrix [[1/5, 1/10], [1/5, 1/5]] on the states {3, 4}. The largest eigenvalue λ_max = (2 + √2)/10 of this matrix is greater than 0.34, which implies that P_3(T > n) ≥ κ_5 (0.34)^n for some constant κ_5 > 0 independent of n. It follows that the expected number of paths starting at the root ∅ of T_3 that have labels greater than or equal to 3 up to level n, which is 3^n P_3(T > n), eventually becomes strictly greater than 1: there exists n_0 ≥ 1 such that P_3(T > n) > 3^{−n} for n ≥ n_0. A simple coupling argument shows that the process ℓ_3 started from a ≥ 3 stochastically dominates the process ℓ_3 started from 3. Consequently, if we restrict our attention to the levels that are multiples of n_0 and declare that v is a descendant of u if along the geodesic between u and v the labels of ℓ_3 are greater than or equal to 3, then this restriction stochastically dominates a supercritical Galton-Watson process. Hence the value 3 is supercritical for ℓ_3.

Subcritical case k = 3 and a = 4. As in the binary case, we let P̃_3 = (P̃_3(x, y))_{x,y≥4} be the substochastic matrix of the process Y_n started at 4 and killed when it hits 1, 2 or 3. We will construct a positive vector (h(x))_{x≥4} such that Σ_x h(x) < ∞ and

h ∘ P̃_3(y) ≤ λ h(y) for every y ≥ 4, (17)

for some positive λ < 1/3, where we use the notation h ∘ P̃_3(y) = Σ_x h(x) P̃_3(x, y). This will imply that

P_4(T > n) ≤ λ^n (Σ_x h(x))/h(4),

where T is the first hitting time of {1, 2, 3} by the process Y_n started at 4.
The subcriticality of the case k = 3 and a = 4 follows from the preceding bound, since there are 3^n paths up to level n and λ < 1/3. To show the existence of a positive vector h satisfying (17), we begin by studying the largest eigenvalue of a finite approximation of the infinite matrix P̃_3. To be precise, let P̃_3^(30) = (P̃_3(i, j))_{4≤i,j≤30}. A numerical computation with Maple gives

λ_max := max Eigenvalues(P̃_3^(30)) ≈ 0.248376642883065 < 1/3.
The vector (h(x))_{x≥4} is then constructed as follows. Let (h(x))_{4≤x≤30} be an eigenvector associated with the largest eigenvalue λ_max of P̃_3^(30), normalized so that min_{4≤x≤30} h(x) = h(30) = 1. Note that the vector h can be chosen to have positive coordinates by the Perron-Frobenius theorem, and it is easy to verify that x ↦ h(x) is decreasing. For x ≥ 31 we then let h(x) = 13^{x−30} · 30!/x!.
Now, if y ≥ 31, a direct computation shows that the inequality (17) holds as well for some λ < 1/3.

Other critical cases. The other critical cases are treated in the same way. We only provide the reader with the numerical values of the maximal eigenvalues of the truncated substochastic matrices, which are very good approximations of the maximal eigenvalues of the infinite matrices:

max Eigenvalues((P̃_2(i, j))_{5≤i,j≤30}) ≈ 0.433040861268365 < 1/2,
max Eigenvalues((P̃_4(i, j))_{3≤i,j≤30}) ≈ 0.231280689028977 < 1/4.
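These numerical values are easy to reproduce. The sketch below (our code; the transition formula is our derivation of the marginal law of the first part of a uniform decomposition) rebuilds the truncated killed matrices, estimates their spectral radii by power iteration, and also rechecks the 2×2 matrix used in the supercritical case k = 3, a = 3.

```python
import math

def P_k(m, j, k):
    """Probability that the first part of a uniform decomposition of m into
    k non-negative parts equals j: C(m-j+k-2, k-2)/C(m+k-1, k-1)."""
    if not 0 <= j <= m:
        return 0.0
    return math.comb(m - j + k - 2, k - 2) / math.comb(m + k - 1, k - 1)

def killed_matrix(k, a, top):
    """Label chain along a ray of T_k, truncated to states a..top and
    killed when the label drops below a."""
    states = range(a, top + 1)
    return [[P_k(x, y - 1, k) for y in states] for x in states]

def spectral_radius(M, iters=2000):
    """Power iteration with sup-norm normalization for a non-negative matrix."""
    v = [1.0] * len(M)
    lam = 0.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[i] for i in range(len(M))) for j in range(len(M))]
        lam = max(w)
        v = [x / lam for x in w]
    return lam

lam2 = spectral_radius(killed_matrix(2, 5, 30))   # k = 2, labels >= 5
lam3 = spectral_radius(killed_matrix(3, 4, 30))   # k = 3, labels >= 4
lam4 = spectral_radius(killed_matrix(4, 3, 30))   # k = 4, labels >= 3

# 2x2 matrix of the modified chain on {3, 4} for the supercritical case
# (transition 4 -> 5 redirected to 4 -> 4, then killed at 1 and 2):
M34 = [[P_k(3, 2, 3), P_k(3, 3, 3)],
       [P_k(4, 2, 3), P_k(4, 3, 3) + P_k(4, 4, 3)]]
lam34 = spectral_radius(M34)
```

The three truncated eigenvalues land below 1/2, 1/3 and 1/4 respectively, while the 2×2 matrix has largest eigenvalue 1/5 + √(1/50) ≈ 0.3414 > 1/3.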

Figure 1: An illustration of the process creating the sequence (L_n)_{n≥1}. We use hyperbolic chords rather than Euclidean chords for aesthetic reasons.

Figure 2: On the left-hand side, the first 7 chords of the splitting process. On the right-hand side, the associated branching process corresponding to the number of ends of the fragments at their creation. Notice that we split the fragments according to the order of appearance of the chords; thus the binary tree on the right-hand side seems stretched.

Figure 3: Extension of the process (L_n)_{n≥1} where we throw triangles or squares instead of chords.
