Total number of births on the negative half-line of the binary branching Brownian motion in the boundary case

The binary branching Brownian motion in the boundary case is a particle system on the real line behaving as follows. It starts with a unique particle positioned at the origin at time $0$. This particle moves according to a Brownian motion with drift $\mu = 2$ and diffusion coefficient $\sigma^2 = 2$, until an independent exponential time of parameter $1$. At that time, the particle dies, giving birth to two children, which then start independent copies of the same process from their birth place. It is well known that in this system the cloud of particles eventually drifts to $\infty$. The aim of this note is to provide a precise estimate for the total number of particles born on the negative half-line, investigating in particular the tail decay of this random variable.


Introduction
A branching Brownian motion is a continuous-time particle system on the real line in which particles move according to independent Brownian motions and split at independent exponential times into children. These children then start independent copies of the branching Brownian motion from their birth place. In this article, we are interested in a binary branching Brownian motion, meaning that at each branching event every particle splits into exactly two daughter particles. We also assume the branching Brownian motion to be in the so-called boundary case (following [8]), i.e. that the Brownian motions driving the motion of the particles have drift $\mu = 2$ and diffusion coefficient $\sigma^2 = 2$.
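For intuition, the dynamics described above are easy to simulate. The sketch below is a minimal Monte Carlo illustration of our own (the function name, the barrier `K` and the cap `max_particles` are our choices, not part of the paper): it explores the binary tree particle by particle, samples each lifetime as an Exp(1) variable, samples each death position as a Gaussian with mean $x + 2t$ and variance $2t$, and counts death events on the negative half-line. Since the full genealogical tree is infinite, subtrees rooted above a truncation barrier are pruned, and the exploration is capped as a safeguard.

```python
import random

def births_below_zero(K=5.0, max_particles=20000, rng=None):
    """Count death events at non-positive positions in one (truncated)
    realisation of the boundary-case binary branching Brownian motion.

    Each particle born at position x lives an Exp(1) time t; with drift 2
    and diffusion coefficient 2, its death position is Gaussian with mean
    x + 2t and variance 2t.  Subtrees rooted above the barrier K are
    pruned and the exploration is capped, so the count is a truncation."""
    rng = rng or random.Random()
    count, explored = 0, 0
    stack = [0.0]  # birth positions of particles still to explore
    while stack and explored < max_particles:
        explored += 1
        x = stack.pop()
        t = rng.expovariate(1.0)                      # lifetime
        y = rng.gauss(x + 2.0 * t, (2.0 * t) ** 0.5)  # death position
        if y <= 0.0:
            count += 1                # a birth event on the half-line
        if y < K:                     # prune subtrees started above K
            stack.append(y)           # first child
            stack.append(y)           # second child
    return count

rng = random.Random(2024)
samples = [births_below_zero(rng=rng) for _ in range(50)]
print("truncated counts:", samples[:10])
```

Since the tail studied in this note is of order $1/n$, the count $N$ has infinite mean, so empirical averages of such simulations keep growing as the truncation is relaxed.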
The branching Brownian motion can be constructed as a process decorating the infinite binary tree $\mathcal{U} := \bigcup_{n \in \mathbb{Z}_+} \{1,2\}^n$, following the classical Ulam-Harris notation, with the convention $\{1,2\}^0 = \{\emptyset\}$. For each $u \in \mathcal{U}$, we write $b_u$ and $d_u$ for the birth and death times of the particle $u$, and for all $s \le d_u$ we denote by $X_s(u)$ the position at time $s$ of the particle $u$, or of its ancestor alive at that time.
For all $t \ge 0$, let $\mathcal{N}_t = \{u \in \mathcal{U} : b_u \le t < d_u\}$ be the set of particles alive at time $t$. It is well known that a branching Brownian motion in the boundary case satisfies local extinction and global survival properties. In other words, while $\mathcal{N}_t$ is almost surely non-empty for all $t \ge 0$, we have $\lim_{t \to \infty} \#\{u \in \mathcal{N}_t : X_t(u) \in K\} = 0$ a.s. for every compact set $K$. More precisely, Bramson [11] obtained the asymptotic behaviour of the minimal position $M_t = \min_{u \in \mathcal{N}_t} X_t(u)$ occupied by a particle at time $t$, showing that
$M_t = \frac{3}{2} \log t + O_{\mathbf{P}}(1)$,  (1.1)
with $O_{\mathbf{P}}(1)$ representing a tight family of random variables. Hence, for all $x \in \mathbb{R}$, after some finite time there is no particle left in the interval $(-\infty, x)$. The aim of this article is to study the law of the number $N$ of birth (or death) events occurring on the negative half-line, defined as
$N = \sum_{u \in \mathcal{U}} \mathbf{1}_{\{X_{d_u}(u) \le 0\}}$.  (1.2)
Precisely, we take interest in the right tail of the distribution of $N$, and we show that $\mathbf{P}(N \ge n) \sim \frac{1}{n}$ as $n \to \infty$. More generally, for all $x \in \mathbb{R}$, we denote by $N_x$ the total number of birth events occurring below the level $x$ (with $N = N_0$), which can be written as
$N_x = \sum_{u \in \mathcal{U}} \mathbf{1}_{\{X_{d_u}(u) \le x\}}$.  (1.3)
Remark that the random variable $N_x$ is related to, but different from, the number $\overline{N}_x$ of births occurring in the branching Brownian motion with absorption at level $x$, defined as
$\overline{N}_x = \sum_{u \in \mathcal{U}} \mathbf{1}_{\{X_{d_u}(u) \le x,\ \forall s \le d_u : X_s(u) \le x\}}$.  (1.4)
The quantity $\overline{N}_x$ was introduced and studied by Kesten [18], who proved it to be a.s. finite if and only if the drift $\mu$ of the underlying Brownian motion is greater than or equal to $2$. Increasingly tight estimates on $\overline{N}_x$ were obtained both in the boundary and the non-boundary cases [1,5,20,2,7].
The process $(\overline{N}_x, x \ge 0)$ of the absorbed counts defined in (1.4) is a Markovian branching process, at least as long as the number of children created at each branching event is non-random. In that case $\overline{N}_x$ is in one-to-one correspondence with the number $Z_x$ of individuals that hit level $x$ for the first time¹. Conversely, the process $(N_x, x \ge 0)$ does not satisfy the Markov property, as particles that went above level $x$ for some time, then came back below that level and gave birth, are taken into account. However, it is possible to link the law of $N$ with that of $\overline{N}_x$ in such a way that the known tail of $\overline{N}_x$ helps us compute the tail of $N$; see Lemma 3.1 below. The main result of the article is the following.
It entails in particular that $\mathbf{P}(N \ge n) \sim 1/n$ as $n \to \infty$.

¹ And $(Z_x, x \ge 0)$ is Markovian, as can be seen by applying the branching property along a stopping line; see the next section.
As a comparison, the estimate of Maillard [20, Theorem 1.1] on the absorbed count $\overline{N}_x$ can be written in this context: for all $x > 0$,
$\mathbf{P}(\overline{N}_x \ge n) \sim c\, x\, \mathrm{e}^{x}\, \frac{1}{n (\log n)^2}$ as $n \to \infty$, for some constant $c > 0$.
Therefore, the tail of $N$ is slightly heavier than the tail of $\overline{N}_x$, which indicates that a non-trivial contribution to the tail of $N$ comes from particles that cross $0$ at least once before giving birth to descendants on the negative half-line.

Remark 1.2. This theorem is equivalent to
$1 - \mathbf{E}\big[\mathrm{e}^{-\lambda N}\big] \sim \lambda \log \tfrac{1}{\lambda}$ as $\lambda \to 0$.
In other words, the asymptotic behaviour of the Laplace transform of $N$ as $\lambda \to 0$ is linked to the asymptotic behaviour of $\mathbf{P}(N \ge n)$ as $n \to \infty$. We refer to [12, Lemma 8.3] for a proof of this equivalence.

Remark 1.3. One could obtain an estimate similar to Theorem 1.1 for a branching Brownian motion with drift $\mu > 2$. In this situation, $N$ becomes integrable, but using the decomposition in Lemma 3.1 and a straightforward adaptation of our arguments, one can obtain an analogous tail estimate for $N$.

In the next section, we recall some useful estimates related to the branching Brownian motion. We then prove Theorem 1.1 in Section 3, by comparing the asymptotic behaviours as $x \to \infty$ of $N_x$ and $\overline{N}_x$.
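The equivalence mentioned in Remark 1.2 can be made plausible by a short computation (a heuristic sketch on our part, not the proof given in [12, Lemma 8.3]): summation by parts turns the Laplace transform into a weighted sum of tail probabilities, and a tail of order $1/n$ produces the logarithmic factor.

```latex
\begin{aligned}
1 - \mathbf{E}\!\left[\mathrm{e}^{-\lambda N}\right]
  &= \sum_{n \ge 1} \mathbf{P}(N \ge n)\left(\mathrm{e}^{-\lambda(n-1)} - \mathrm{e}^{-\lambda n}\right)
   = \big(\mathrm{e}^{\lambda} - 1\big) \sum_{n \ge 1} \mathbf{P}(N \ge n)\,\mathrm{e}^{-\lambda n} \\
  &\approx \lambda \sum_{1 \le n \le 1/\lambda} \frac{1}{n}
   \;\sim\; \lambda \log \tfrac{1}{\lambda}
   \qquad \text{as } \lambda \to 0.
\end{aligned}
```

The first equality follows by writing $1 - \mathrm{e}^{-\lambda m}$ as a telescoping sum over $n \le m$ and exchanging the order of summation.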

Stopping lines, branching random walk and the many-to-one lemma
We begin by introducing the derivative martingale of the branching Brownian motion, defined as
$D_t := \sum_{u \in \mathcal{N}_t} X_t(u)\, \mathrm{e}^{-X_t(u)}$.  (2.1)
Lalley and Sellke [19] proved that the derivative martingale converges a.s. towards a non-degenerate limit $D_\infty$, which is a.s. positive. We then introduce optional stopping lines. Stopping line techniques were pioneered in [14,16]. Informally speaking, a stopping line is a generalization of a stopping time in the context of branching processes, in which different particles are stopped at different times. We take particular interest in the following family of very simple cutting stopping lines: for $x > 0$,
$\mathcal{L}_x := \{(u, t) \in \mathcal{U} \times \mathbb{R}_+ : b_u \le t \le d_u,\ X_t(u) = x,\ X_s(u) < x \text{ for all } s < t\}$,
the set of particles stopped upon hitting level $x$ for the first time in their history. Jagers [16] proved that the branching process stopped at $\mathcal{L}_x$ satisfies the branching property, i.e. that each particle in $\mathcal{L}_x$ starts, from its stopping time and position, an independent copy of the branching Brownian motion, which is independent of $\sigma\big((X_s(u), s \le t), (u, t) \in \mathcal{L}_x\big)$.
We now associate to the branching Brownian motion the branching random walk of the birth places of particles, defined for all $u \in \mathcal{U}$ by $V(u) = X_{d_u}(u)$. For $n \ge 0$, a sum over $|u| = n$ is understood as a sum over $u \in \{1,2\}^n$, the set of particles in the $n$th generation. From the construction of the branching random walk, it is apparent that $(V(u) - V(\pi_u),\ u \in \mathcal{U} \setminus \{\emptyset\})$ is a family of i.i.d. random variables with the same law as $\sqrt{2} B_T + 2T$, where $\pi_u$ is the parent of $u$, $B$ is a standard Brownian motion and $T$ an independent exponential random variable with parameter $1$.
As a result, we deduce that $(V(u), u \in \mathcal{U})$ is a branching random walk, a discrete-time particle system on the real line starting from $V(\emptyset)$, in which each parent particle gives birth to two daughter particles positioned around their parent according to i.i.d. copies of $V(\emptyset)$. Observe that for all $\lambda$ close enough to $0$, we have
$\mathbf{E}_0\big[\mathrm{e}^{\lambda V(\emptyset)}\big] = \mathbf{E}\big[\mathrm{e}^{\lambda(\sqrt{2} B_T + 2T)}\big] = \frac{1}{1 - 2\lambda - \lambda^2}$.
Therefore the law of the displacement of the branching random walk $V$ has the density
$f(y) = \frac{1}{2\sqrt{2}}\, \mathrm{e}^{y - \sqrt{2}|y|}$, $y \in \mathbb{R}$.
For all $a \in \mathbb{R}$, we write $\mathbf{P}_a$ for the law of $V$ conditionally on $V(\emptyset) = a$, and $\mathbf{E}_a$ for the corresponding expectation.

We next introduce the many-to-one lemma. This result has a long history, going back to the work of Peyrière [21] and Kahane and Peyrière [17]. We refer to [22, Theorem 1.1] for a proof of this result.

Lemma 2.1 (Many-to-one lemma). For any $a \in \mathbb{R}$, $n \ge 1$ and measurable function $f : \mathbb{R}^n \to \mathbb{R}_+$, we have
$\mathbf{E}_a\Big[\sum_{|u|=n} f\big(V(u_1), \ldots, V(u_n)\big)\Big] = \mathbf{E}_a\big[\mathrm{e}^{S_n - a} f(S_1, \ldots, S_n)\big]$,
where $(u_1, \ldots, u_n)$ is the ancestral line of $u$ and $(S_n)_{n \ge 0}$ is a random walk such that $\mathbf{P}_a(S_0 = a) = 1$, whose step distribution has density
$g(y) = \frac{1}{\sqrt{2}}\, \mathrm{e}^{-\sqrt{2}|y|}$, $y \in \mathbb{R}$.
As an immediate consequence of the above lemma, we obtain that
$\mathbf{E}_0\Big[\sum_{|u|=1} \mathrm{e}^{-V(u)}\Big] = 1$ and $\mathbf{E}_0\Big[\sum_{|u|=1} V(u)\, \mathrm{e}^{-V(u)}\Big] = 0$.
Therefore, $V$ is a branching random walk in the boundary case, according to the terminology of [9]. We also note that
$\mathbf{E}_0\Big[\sum_{|u|=1} V(u)^2\, \mathrm{e}^{-V(u)}\Big] = 1$,
i.e. the step distribution of the random walk $(S_n)$ has unit variance.
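The boundary-case identities are easy to check numerically. The following sketch is our own illustration (variable and function names are ours): it samples the displacement $X = \sqrt{2} B_T + 2T$ as a Gaussian of mean $2T$ and variance $2T$ with $T$ exponential, and verifies by Monte Carlo that $\mathbf{E}[\mathrm{e}^{-X}] = 1/2$ and $\mathbf{E}[X \mathrm{e}^{-X}] = 0$; multiplying by the two children created at each branching event yields $\mathbf{E}_0[\sum_{|u|=1} \mathrm{e}^{-V(u)}] = 1$ and $\mathbf{E}_0[\sum_{|u|=1} V(u) \mathrm{e}^{-V(u)}] = 0$.

```python
import math
import random

def displacement(rng):
    """One displacement of the branching random walk: sqrt(2)*B_T + 2T
    with T ~ Exp(1), i.e. Gaussian with mean 2T and variance 2T given T."""
    t = rng.expovariate(1.0)
    return rng.gauss(2.0 * t, math.sqrt(2.0 * t))

rng = random.Random(12345)
n = 200_000
xs = [displacement(rng) for _ in range(n)]

mean_exp = sum(math.exp(-x) for x in xs) / n        # expect ~ 1/2
mean_xexp = sum(x * math.exp(-x) for x in xs) / n   # expect ~ 0

print(round(mean_exp, 3), round(mean_xexp, 3))
```

Both estimators have finite variance (the moment generating function $\lambda \mapsto (1 - 2\lambda - \lambda^2)^{-1}$ is finite at $\lambda = -2$), so the Monte Carlo averages concentrate at the usual $n^{-1/2}$ rate.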

Proof of Theorem 1.1
The proof of Theorem 1.1 is based on the following decomposition of the number $N_x$ of birth events below level $x$ along the stopping line $\mathcal{L}_x$.

Lemma 3.1. Let $x \in \mathbb{R}$, and write $Z_x = \#\mathcal{L}_x$ for the total number of particles that hit level $x$ for the first time in their history. We have
$N_x \overset{(d)}{=} \overline{N}_x + \sum_{k=1}^{Z_x} N^{(k)}$,
where $(N^{(k)}, k \ge 1)$ are i.i.d. copies of $N$, independent of $(\overline{N}_x, Z_x)$.

This equality in distribution allows us to link together the law of $N$ with the laws of $\overline{N}_x$ and $Z_x$. We then determine the asymptotic behaviour as $x \to \infty$ of $N_x$ and $Z_x$, and use those to obtain estimates on the law of $N$.

Remark 3.2.
As the branching Brownian motion creates exactly $2$ children at every branching event, and no particle stays forever below the level $x > 0$, the total number of particles hitting level $x$ for the first time in their history satisfies $Z_x = \overline{N}_x + 1 < \infty$ a.s., with $\overline{N}_x$ the total number of births given by particles before their absorption at level $x$, defined in (1.4). Indeed, in a finite binary tree the number of leaves (here, absorbed particles) exceeds the number of internal nodes (here, branching events below $x$) by exactly one.
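Remark 3.2 reduces to the fact that in a finite binary tree, the number of leaves (absorbed particles) exceeds the number of internal nodes (branching events below the barrier) by exactly one. The sketch below, our own illustration and not part of the proof, simulates the branching Brownian motion killed upon hitting level $x$: whether a particle touches the barrier during its lifetime is decided with the standard Brownian-bridge maximum formula, and the exploration is capped as a safeguard, since the number of absorbed particles has infinite mean in the boundary case.

```python
import math
import random

def absorbed_counts(x=1.5, cap=50_000, rng=None):
    """Explore the binary BBM (drift 2, diffusion coefficient 2) killed
    upon first hitting level x; return (leaves, branch_events), i.e. the
    number of absorbed particles and of branching events strictly below
    x, or None if the exploration exceeds the cap."""
    rng = rng or random.Random()
    leaves = branches = 0
    stack = [0.0]  # birth positions of particles still to explore
    while stack:
        if leaves + branches > cap:
            return None
        v = stack.pop()
        t = rng.expovariate(1.0)                       # lifetime
        y = rng.gauss(v + 2.0 * t, math.sqrt(2.0 * t))  # endpoint
        # Given endpoints v and y below x, a Brownian bridge with
        # variance parameter 2 touches x with prob exp(-(x-v)(x-y)/t).
        hit = y >= x or rng.random() < math.exp(-(x - v) * (x - y) / t)
        if hit:
            leaves += 1       # particle absorbed at level x
        else:
            branches += 1     # branching event strictly below x
            stack.append(y)   # first child
            stack.append(y)   # second child
    return leaves, branches

rng = random.Random(7)
results = [absorbed_counts(rng=rng) for _ in range(20)]
done = [r for r in results if r is not None]
print(done[:3])
```

For every completed exploration, the identity leaves = branch events + 1 holds deterministically, which is exactly $Z_x = \overline{N}_x + 1$.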
Proof. The above equality is an immediate consequence of the branching property applied at the stopping line $\mathcal{L}_x$. Each of the $Z_x$ particles of the stopping line starts an independent branching Brownian motion from level $x$, independently of the branching Brownian motion absorbed at level $x$. As a consequence, the total number of births below level $x$ is equal to the number of births below level $x$ occurring before hitting $x$ for the first time (which is equal to $0$ if $x < 0$, or to $\overline{N}_x = Z_x - 1$ if $x > 0$), plus the total number of births below level $x$ in the branching Brownian motions started from $\mathcal{L}_x$, which is equal in distribution to the sum of $Z_x$ independent copies of $N_0$.
As mentioned in Remark 1.2, the proof of Theorem 1.1 relies on a sharp computation of the asymptotic behaviour of the Laplace transform of $N$. For all $x \in \mathbb{R}$ and $\lambda > 0$, we set $\varphi(\lambda, x) = \log \mathbf{E}\big[\mathrm{e}^{-\lambda N_x}\big]$. By Lemma 3.1, for all $x > 0$ we have
$\mathbf{E}\big[\mathrm{e}^{-\lambda N_x}\big] = \mathrm{e}^{\lambda}\, \mathbf{E}\Big[\mathrm{e}^{-(\lambda - \varphi(\lambda, 0)) Z_x}\Big]$.
In other words, the Laplace transform of $N$ can be related to the Laplace transform of $Z_x$, the number of particles first hitting level $x$, or equivalently, by Remark 3.2, to the Laplace transform of $\overline{N}_x$ when $x > 0$.
To study the asymptotic behaviour of φ(λ, 0) as λ → 0, we show that normalized versions of Z x and N x both converge, as x → ∞, to multiples of the limit of the derivative martingale of the branching Brownian motion, defined in (2.1).
First, using stopping line techniques, we obtain an almost sure estimate for the growth rate of $Z_x$ as $x \to \infty$.

Lemma 3.3. We have $\lim_{x \to \infty} x\, \mathrm{e}^{-x} Z_x = D_\infty$ almost surely.
We now turn to the asymptotic behaviour, as $x \to \infty$, of $N_x$. Using estimates developed by Chen [15], we are able to obtain the following convergence.

Proposition 3.4. There exists a constant $c_{0,\infty} \in (0, \infty)$ such that $\lim_{x \to \infty} \mathrm{e}^{-x} N_x = c_{0,\infty} D_\infty$ in probability.

This convergence can be thought of as a Seneta-Heyde type result for the additive martingale of the branching random walk $V$. A similar convergence was obtained in [15, Eq. (5.5)], using methods similar to the ones pioneered by Boutaud and Maillard [10].
Precisely, for all $0 < a < b$ and $\Lambda > 1$, the estimates below involve the functions $\phi(z) = z\, \mathrm{e}^{-z^2/2}$ and $g(a, b) = \mathbf{P}\big(\sup_{s \in [0,1]} R_s \le a \,\big|\, R_1 = b\big)$, with $R$ a Bessel process of dimension $3$. Here we used [6, Lemma 2.1] to identify the constant $\sqrt{2/\pi}$. To complete the proof of Proposition 3.4, we use the following lemma, whose proof is postponed to the end of the article.

Let $\varepsilon > 0$. By (1.1), there exists $\alpha > 0$ such that $\mathbf{P}_0\big(\inf_{u \in \mathcal{U}} V(u) \le -\alpha\big) < \varepsilon$. Additionally, using (3.4), (3.5) and the Markov inequality, we can fix $A_1, \delta > 0$ such that the corresponding bounds hold for all $x \ge A_1$. Up to decreasing $\delta$, we assume as well that $0 \le c_{0,\infty} - c_{\delta,\delta^{-1}} \le \varepsilon$. Similarly, using (3.3), we may fix $A_2 \ge A_1$ and $\Lambda > 1$ such that the associated estimate holds for all $x \ge A_2$, and up to enlarging $\Lambda$ again, we may reinforce it. Finally, using the convergence of the derivative martingale, we may fix $A_3 \ge A_2$ accordingly. As a result, for all $x \ge A_3$, chaining these estimates proves that $\lim_{x \to \infty} \mathrm{e}^{-x} N_x = c_{0,\infty} D_\infty$ in $\mathbf{P}_0$-probability, as $D_\infty$ is a.s. finite. Next, using that $V(\emptyset)$ is independent of $(V(u) - V(\emptyset), u \in \mathcal{U})$, which has law $\mathbf{P}_0$, and that for all $a \in \mathbb{R}$ the law of $D_\infty$ under $\mathbf{P}_a$ is the same as the law of $\mathrm{e}^{-a} D_\infty$ under $\mathbf{P}_0$, we also obtain that $\lim_{x \to \infty} \mathrm{e}^{-x} N_x = c_{0,\infty} D_\infty$ in $\mathbf{P}$-probability.
Next, using Lemma 3.3 and Proposition 3.4, we are now able to prove Theorem 1.1.
We end this article with a proof of Lemma 3.5, which is based on the many-to-one lemma and random walk estimates.
Proof of Lemma 3.5. We prove each of the three limits in turn, using the ballot-type random walk estimates introduced in Section 2.

Proof of (3.3). For all $n \in \mathbb{N}$, we set $\overline{S}_n = \max_{k \le n} S_k$. Let $0 < a < b$. Using the many-to-one lemma, we compute the quantity of interest for all $\alpha, x > 0$ and $\Lambda > 1$.

We then bound $\mathbf{P}_0\big(\min_{k \le n} S_k \ge -\alpha,\ \overline{S}_n \ge \Lambda (n/b)^{1/2},\ S_n \in [h-1, h]\big)$ for large values of $\Lambda$, uniformly in $h \le (n/a)^{1/2}$. Writing $T(n) = \inf\{k \in \mathbb{N} : S_k \ge \Lambda (n/b)^{1/2}\}$ and setting $p = \lfloor n/2 \rfloor$, we split this probability into the two events $\{T(n) \le p\}$ and $\{T(n) \in\, ]p, n]\}$, and we bound the two resulting probabilities in turn. The first one is handled by applying the Markov property at time $p$, then using (2.5) and (2.6); therefore, there exists $C > 0$ such that the corresponding bound holds for all $n \in \mathbb{N}$. Next, observing that $(S_n - S_{n-k}, k \le n) \overset{(d)}{=} (S_k, k \le n)$ by reversing time, for all $0 \le h \le (n/a)^{1/2}$ the probability $\mathbf{P}_0\big(\min_{k \le n} S_k \ge -\alpha,\ \exists k \in\, ]p, n] : S_k \ge \Lambda (n/b)^{1/2},\ S_n \in [h-1, h]\big)$ is bounded by applying the Markov property at time $n - p$. Then, applying (2.7) to the random walk $-S$, there exists $C > 0$ such that a similar bound holds, and we conclude by Donsker's invariance principle.

Proof of (3.5). Using the many-to-one lemma, we obtain an upper bound which is $o_\delta(1)(1 + \alpha)$ as $\delta \downarrow 0$, uniformly in $x \ge 1$.