A local limit theorem for the critical random graph

We consider the limit distribution of the orders of the k largest components in the Erdős–Rényi random graph inside the "critical window", for arbitrary k. We prove a local limit theorem for this joint distribution and derive an exact expression for the joint probability density function.


Introduction
The Erdős–Rényi random graph $G(n, p)$ is a random graph on the vertex set $[n] := \{1, \dots, n\}$, constructed by including each of the $\binom{n}{2}$ possible edges with probability $p$, independently of all other edges. We shall be interested in the Erdős–Rényi random graph in the so-called critical window. That is, we fix $\lambda \in \mathbb{R}$ and for $p$ we take
$$p = p_\lambda(n) := \frac{1}{n} + \frac{\lambda}{n^{4/3}}. \qquad (1.1)$$
For $v \in [n]$ we let $\mathcal{C}(v)$ denote the connected component containing the vertex $v$, and we let $|\mathcal{C}(v)|$ denote the number of vertices in $\mathcal{C}(v)$, also called the order of $\mathcal{C}(v)$. For $i \ge 1$ we shall use $\mathcal{C}_i$ to denote the component of $i$th largest order (where ties are broken in an arbitrary way), and we will sometimes also denote $\mathcal{C}_1$ by $\mathcal{C}_{\max}$.
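As a numerical illustration (ours, not part of the paper), the critical-window scaling can be observed directly by sampling $G(n, p_\lambda(n))$ and rescaling the largest component orders by $n^{-2/3}$. The sketch below uses a union-find structure; all function names are our own.

```python
import random
from itertools import combinations

def largest_components(n, lam, k, seed=0):
    """Sample G(n, p_lambda(n)) in the critical window and return the
    orders of the k largest components, rescaled by n^(-2/3)."""
    rng = random.Random(seed)
    p = 1.0 / n + lam / n ** (4 / 3)      # critical-window edge probability (1.1)
    parent = list(range(n))

    def find(x):                          # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v in combinations(range(n), 2):  # each pair is an edge w.p. p
        if rng.random() < p:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv

    sizes = {}
    for v in range(n):
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    top = sorted(sizes.values(), reverse=True)[:k]
    return [s / n ** (2 / 3) for s in top]

print(largest_components(1000, 0.0, 3))   # three largest rescaled orders
```

For moderate $n$ the rescaled orders already fluctuate on the scale of constants, in line with the distributional limit discussed below.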
It is well known that, for $p$ in the critical window (1.1),
$$\big(|\mathcal{C}_1|\, n^{-2/3}, \dots, |\mathcal{C}_k|\, n^{-2/3}\big) \xrightarrow{\ d\ } \big(C^\lambda_1, \dots, C^\lambda_k\big) \quad \text{as } n \to \infty, \qquad (1.2)$$
where $C^\lambda_1, \dots, C^\lambda_k$ are positive, absolutely continuous random variables whose (joint) distribution depends on $\lambda$. See [1,3,4,5] and the references therein for the detailed history of the problem. In particular, in [5], an exact formula was found for the distribution function of the limiting variable $C^\lambda_1$, and in [1], it was shown that the limit in (1.2) can be described in terms of a certain multiplicative coalescent. The aim of this paper is to prove a local limit theorem for the joint probability distribution of the orders of the $k$ largest connected components ($k$ arbitrary) and to investigate the joint limit distribution. While some of the ideas used in this paper have also appeared in earlier work, in particular in [3,4,5], the results proved here have not been explicitly stated before.

Before we can state our results, we need to introduce some notation. For $n \in \mathbb{N}$ and $0 \le p \le 1$, $\mathbb{P}_{n,p}$ will denote the probability measure of the Erdős–Rényi graph of size $n$ and with edge probability $p$. For $k \in \mathbb{N}$ and $x_1, \dots, x_k, \lambda \in \mathbb{R}$, we shall denote
$$F_k(x_1, \dots, x_k; \lambda) := \lim_{n\to\infty} \mathbb{P}_{n, p_\lambda(n)}\big(|\mathcal{C}_i| \le x_i n^{2/3} \text{ for } i = 1, \dots, k\big). \qquad (1.3)$$
It has already been shown implicitly in the work of Łuczak, Pittel and Wierman [4] that this limit exists and that $F_k$ is continuous in all of its parameters. In our proof of the local limit theorem below we will use that $F_1(x; \lambda)$ is continuous in both parameters, which can also easily be seen from the explicit formula (3.25) in [5].
We will denote by $C(m, r)$ the number of (labeled) connected graphs with $m$ vertices and $r$ edges, and for $l \ge -1$ we let $\gamma_l$ denote Wright's constants. That is, $\gamma_l$ satisfies
$$C(m, m+l) = (1 + o(1))\, \gamma_l\, m^{m + (3l-1)/2} \quad \text{as } m \to \infty. \qquad (1.4)$$
Here $l$ is even allowed to vary with $m$, provided it grows sufficiently slowly. Moreover, the constants $\gamma_l$ satisfy the sharp asymptotics (1.5) as $l \to \infty$ (see [7,8,9]); all we shall need below is that, by (1.5), $\gamma_l^{2/l}\, l \to e/12$, so in particular $\gamma_l \to 0$ faster than any exponential.

By $G$ we will denote the Laurent series
$$G(s) := \sum_{l=-1}^{\infty} \gamma_l\, s^l. \qquad (1.6)$$
Note that by (1.5) the sum on the right-hand side is convergent for all $s \ne 0$. By a striking result of Spencer [6], $G(s)$ equals $s^{-1}$ times the moment generating function of the scaled Brownian excursion area. For $x > 0$ and $\lambda \in \mathbb{R}$, we further define
$$\Phi(x; \lambda) := \frac{G(x^{3/2})}{x\sqrt{2\pi}}\; e^{-\lambda^3/6 + (\lambda - x)^3/6}. \qquad (1.7)$$

The main result of this paper is the following local limit theorem for the joint distribution of the vector $(|\mathcal{C}_1|, \dots, |\mathcal{C}_k|)$.

Theorem 1.1 (Local limit theorem for largest clusters). For all $x_1 \ge \dots \ge x_k > 0$ and $\lambda \in \mathbb{R}$,
$$\lim_{n\to\infty} n^{2k/3}\, \mathbb{P}_{n, p_\lambda(n)}\big(|\mathcal{C}_i| = \lfloor x_i n^{2/3} \rfloor \text{ for } i = 1, \dots, k\big) = \Psi_k(x_1, \dots, x_k; \lambda), \qquad (1.8)$$
the convergence being uniform for $x_1, \dots, x_k$ in compact subsets of $(0,\infty)$, where
$$\Psi_k(x_1, \dots, x_k; \lambda) := \frac{1}{r_1! \cdots r_m!}\; F_1\Big(x_k;\ \lambda - \sum_{j=1}^k x_j\Big)\ \prod_{i=1}^{k} \Phi\Big(x_i;\ \lambda - \sum_{j<i} x_j\Big), \qquad (1.9)$$
and where $1 \le m \le k$ is the number of distinct values the $x_i$ take, $r_1$ is the number of repetitions of the largest value, $r_2$ the number of repetitions of the second largest, and so on.

Theorem 1.1 gives rise to a set of explicit expressions for the probability densities $f_k$ of the limit vectors $(C^\lambda_1, \dots, C^\lambda_k)$ with respect to $k$-dimensional Lebesgue measure. These densities are given in terms of the distribution function $F_1$ by the following corollary of Theorem 1.1:

Corollary 1.2 (Joint limiting density for largest clusters). For any $k \ge 1$, $\lambda \in \mathbb{R}$ and $x_1 > \dots > x_k > 0$,
$$f_k(x_1, \dots, x_k; \lambda) = F_1\Big(x_k;\ \lambda - \sum_{j=1}^k x_j\Big)\ \prod_{i=1}^{k} \Phi\Big(x_i;\ \lambda - \sum_{j<i} x_j\Big). \qquad (1.10)$$

Implicit in Corollary 1.2 is a system of differential equations that the joint limiting distributions must satisfy. For instance, the case $k = 1$ yields
$$\frac{\partial}{\partial x} F_1(x; \lambda) = \Phi(x; \lambda)\, F_1(x; \lambda - x).$$
In general this differential equation has many solutions, but we will show that there is only one solution for which $x \mapsto F_1(x; \lambda)$ is a probability distribution function for all $\lambda$. This leads to the following theorem:

Theorem 1.3 (Uniqueness of solution of the differential equation). The set of relations (1.10) determines the limit distributions $F_k$ uniquely.
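Returning to the combinatorial ingredient above: for small $m$ and $r$, the counts $C(m, r)$ can be checked by brute force (an illustration of ours, not from the paper) by enumerating all $r$-subsets of the edges of the complete graph $K_m$ and testing connectivity.

```python
from itertools import combinations

def count_connected(m, r):
    """Number C(m, r) of labeled connected graphs with m vertices and r
    edges, by exhaustive enumeration (feasible only for small m)."""
    all_edges = list(combinations(range(m), 2))
    count = 0
    for edges in combinations(all_edges, r):
        parent = list(range(m))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for u, v in edges:                 # union the endpoints of each edge
            parent[find(u)] = find(v)
        if len({find(v) for v in range(m)}) == 1:
            count += 1
    return count

print(count_connected(4, 3))   # → 16, spanning trees of K_4 (Cayley: 4^2)
print(count_connected(4, 4))   # → 15, connected unicyclic graphs on 4 vertices
```

For trees ($l = -1$) this reproduces Cayley's formula $C(m, m-1) = m^{m-2}$, consistent with $\gamma_{-1} = 1$.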

Proof of the local limit theorem
In this section we derive the local limit theorem for the vector $(|\mathcal{C}_1|, \dots, |\mathcal{C}_k|)$ in the Erdős–Rényi random graph. We start by proving a convenient relation between the probability mass function of this vector and that of the component containing a typical vertex.

Lemma 2.1 (Probability mass function of largest clusters). Let $l_1 \ge \dots \ge l_k \ge 1$, let $m$ be the number of distinct values the $l_i$ take, and let $r_1$ be the number of repetitions of the largest value, $r_2$ the number of repetitions of the second largest, and so on up to $r_m$. Then the relations (2.1) and (2.2) hold, expressing the joint probability mass function of $(|\mathcal{C}_1|, \dots, |\mathcal{C}_k|)$ in terms of the probability mass function of $|\mathcal{C}(v)|$ for a fixed vertex $v$.

Proof. For $A$ an event, we denote by $I(A)$ the indicator function of $A$. For the graph $G(n, p)$, let $E_k$ be the event that $|\mathcal{C}_i| = l_i$ for $i = 1, \dots, k$. If $E_k$ and $|\mathcal{C}_{k+1}| < l_k$ both hold, then there are exactly $r_m l_k$ vertices $v$ whose component has order $l_k$. Since the probability that a given vertex lies in such a component is the same for every vertex $v$, it follows by taking expectations on both sides of the resulting counting identity that (2.4) holds. Next we observe, by conditioning on $\mathcal{C}(1)$, that (2.5) holds. Combining (2.4) and (2.5), we thus get a relation between the joint law of the $k$ largest components and that of the $k-1$ largest components of a smaller graph; the relation (2.1) now follows by a straightforward induction argument. To see that (2.2) holds, notice that the event $E_k$ can be split according to whether $|\mathcal{C}_{k+1}| < l_k$ or $|\mathcal{C}_{k+1}| = l_k$; proceeding analogously as before leads to (2.2). □

Lemma 2.2 (Scaling function of the cluster size distribution). Let $\beta > \alpha$ and $b > a > 0$. Then, uniformly for $a \le x \le b$ and $\alpha \le \lambda \le \beta$,
$$n\, \mathbb{P}_{n, p_\lambda(n)}\big(|\mathcal{C}(v)| = \lfloor x n^{2/3} \rfloor\big) = x\, \Phi(x; \lambda) + o(1).$$
Proof. For convenience let us write $k := \lfloor x n^{2/3} \rfloor$ and $p = p_\lambda(n)$, with $a \le x \le b$ and $\alpha \le \lambda \le \beta$ arbitrary. Throughout this proof, $o(1)$ denotes error terms tending to $0$ with $n$, uniformly over all $x, \lambda$ considered. First notice that (2.9) and (2.10) hold. Next we apply the expansion $1 - p = e^{-p - p^2/2 + O(p^3)}$ to each factor $1 - p$ on the left-hand side, to obtain (2.11). Using that $k = \lfloor x n^{2/3} \rfloor$, combining (2.9)–(2.11) and substituting (1.4) leads to (2.12), where $R(n, k)$ denotes the resulting error term. Clearly, Lemma 2.2 follows from (2.12) if we can show that in the limit $n \to \infty$, $R(n, k)$ tends to $0$ uniformly over all $x, \lambda$ considered.
To show this, we recall that by [2, Corollary 5.21] there exists an absolute constant $c > 0$ such that the required bound on $R(n, k)$ holds; here we have used that $k^{3/2} p/(1-p)$ is bounded uniformly by a constant, and the last inequality holds for $n$ sufficiently large. Hence $R(n, k) = o(1)$, which completes the proof. □
Lemma 2.3 (Uniform convergence of the distribution function). Write $g_n(x, \lambda) := \mathbb{P}_{n, p_\lambda(n)}\big(|\mathcal{C}_{\max}| \le x n^{2/3}\big)$. As $n \to \infty$, $g_n(x, \lambda) \to F_1(x; \lambda)$ uniformly for $a \le x \le b$ and $\alpha \le \lambda \le \beta$.

Proof. Fix $\epsilon > 0$. Recall that $F_1$ is continuous in both arguments, as follows for instance from [5, (3.25)]. Therefore, $F_1$ is uniformly continuous on $[a, b] \times [\alpha, \beta]$, and hence we can choose a finite grid of points such that $F_1$ varies by at most $\epsilon$ between neighbouring grid points. Note that $g_n(x, \lambda)$ is nondecreasing in $x$ and non-increasing in $\lambda$. By definition (1.3) of $F_1$, there exists an $n_0 = n_0(\epsilon)$ such that for all $n \ge n_0$, $|g_n - F_1| \le \epsilon$ at every grid point; by monotonicity it then follows that for all $n \ge n_0$, $g_n \le F_1 + 2\epsilon$ on all of $[a, b] \times [\alpha, \beta]$, and likewise $g_n \ge F_1 - 2\epsilon$. □

Proof of Theorem 1.1. We start by introducing some notation. Fix $0 < a < b$ and $\alpha < \beta$, let $x_1 \ge \dots \ge x_k$ lie in $[a, b]$ and $\lambda \in [\alpha, \beta]$, and set $l_i := \lfloor x_i n^{2/3} \rfloor$ and $m_i := n - l_1 - \dots - l_i$ (with $m_0 := n$). Let $\lambda_i$ be defined by $p_\lambda(n) = p_{\lambda_i}(m_i)$, so that $\lambda_i = \lambda - \sum_{j \le i} x_j + o(1)$. Finally, for $i = 1, \dots, k$ let $y_i = y_i(n)$ be chosen such that $\lfloor y_i\, m_{i-1}^{2/3} \rfloor = l_i$, and more directly $y_i = x_i + o(1)$, where the error terms $o(1)$ are uniform over all choices of the $x_i$ in $[a, b]$. Throughout this proof, the notation $o(1)$ will be used in this meaning.
Note that for all sufficiently large $n$, the $y_i$ are all contained in a compact interval of the form $[a - \epsilon, b + \epsilon]$ for some $0 < \epsilon < a$, and the $\lambda_i$ are also contained in a compact interval. Hence, since $l_{i+1} = \lfloor y_{i+1} m_i^{2/3} \rfloor$, it follows from Lemma 2.2 that for $i = 0, \dots, k-1$,
$$m_i\, \mathbb{P}_{m_i, p}\big(|\mathcal{C}(v)| = l_{i+1}\big) = y_{i+1}\, \Phi(y_{i+1}; \lambda_i) + o(1). \qquad (2.20)$$
But because $\Phi(x; \lambda)$ is uniformly continuous on a compact set, the function on the right tends uniformly to $x_{i+1}\, \Phi\big(x_{i+1}; \lambda - \sum_{j \le i} x_j\big)$. We conclude that
$$m_i\, \mathbb{P}_{m_i, p}\big(|\mathcal{C}(v)| = l_{i+1}\big) = x_{i+1}\, \Phi\Big(x_{i+1};\ \lambda - \sum_{j \le i} x_j\Big) + o(1). \qquad (2.21)$$
Similarly, using that $F_1$ is uniformly continuous on a compact set, from Lemma 2.3 we obtain (2.22). By Lemma 2.1, we see that we are interested in the product of the left-hand sides of (2.21) and (2.22). Since the right-hand sides of these equations are bounded uniformly over the $x_i$ considered, it follows immediately that (2.23) holds. To complete the proof, set $l_{k+1} = l_k$, and note that, by Lemma 2.1 and (2.21), (2.24) holds. Since the probability in the statement of Theorem 1.1 is the sum of the left-hand sides of (2.23) and (2.24), this completes the proof of Theorem 1.1. □

Proof of Corollary 1.2. For any $x_1, \dots, x_k > 0$, define
$$g_n(x_1, \dots, x_k) := n^{2k/3}\, \mathbb{P}_{n, p_\lambda(n)}\big(|\mathcal{C}_i| = \lfloor x_i n^{2/3} \rfloor \text{ for } i = 1, \dots, k\big),$$
and notice that $g_n$ is then a probability density with respect to $k$-dimensional Lebesgue measure. Let $X_n = (X^1_n, \dots, X^k_n)$ be a random vector having this density, and define the vector $Y_n$ on the same space by setting $Y_n = \big(\lfloor X^1_n n^{2/3} \rfloor n^{-2/3}, \dots, \lfloor X^k_n n^{2/3} \rfloor n^{-2/3}\big)$. Then $Y_n$ has the same distribution as the vector $\big(|\mathcal{C}_1| n^{-2/3}, \dots, |\mathcal{C}_k| n^{-2/3}\big)$ in $G(n, p_\lambda(n))$. Now recall that by [1, Corollary 2], this vector converges in distribution to a limit which lies a.s. in $(0, \infty)^k$. Let $P_\lambda$ be the law of the limit vector. Since $|X_n - Y_n| \to 0$ almost surely, $P_\lambda$ is also the weak limit law of the $X_n$. By Theorem 1.1, $g_n$ converges pointwise to $\Psi_k(\,\cdot\,; \lambda)$ on $(0, \infty)^k$, and hence $\Psi_k(\,\cdot\,; \lambda)$ is integrable on $(0, \infty)^k$ by Fatou's lemma. Now let $A$ be any compact set in $(0, \infty)^k$. Then $g_n$ converges uniformly to $\Psi_k(\,\cdot\,; \lambda)$ on $A$, so we can apply dominated convergence to see that $P_\lambda(A) = \int_A \Psi_k(x; \lambda)\, dx$. Since this holds for all compact $A \subset (0, \infty)^k$ and $P_\lambda$ is concentrated on $(0, \infty)^k$, it follows that $\Psi_k(\,\cdot\,; \lambda)$ is the joint density of $(C^\lambda_1, \dots, C^\lambda_k)$. □

Unique identification of the limit distributions
In this section we will show that the system of differential equations (1.10) identifies the joint limiting distributions uniquely. Let us first observe that, by the product form of (1.10), it suffices to show that there is only one solution to the differential equation
$$\frac{\partial}{\partial x} F_1(x; \lambda) = \Phi(x; \lambda)\, F_1(x; \lambda - x) \qquad (3.1)$$
such that $x \mapsto F_1(x; \lambda)$ is the distribution function of a probability distribution for all $\lambda \in \mathbb{R}$. In the remainder of this section we will show that if $F_1$ satisfies (3.1) and $x \mapsto F_1(x; \lambda)$ is the distribution function of a probability distribution for all $\lambda \in \mathbb{R}$, then $F_1$ can be written in the explicit form (3.2), where $\varphi(x) = G(x^{3/2})/(x\sqrt{2\pi})$, so that $\Phi(x; \lambda) = \varphi(x)\, e^{-\lambda^3/6 + (\lambda - x)^3/6}$. This will prove Theorem 1.3 by our previous observation. To this end, we first note that it can be seen from Stirling's approximation and (1.5) that $G(s) = \exp\big(s^2/24 + o(s^2)\big)$ as $s \to \infty$, so that the integrals appearing in (3.2) converge for all $\lambda \in \mathbb{R}$. To prove (3.2), we will make use of the following bound:

Lemma 3.1. Let $a > \delta > 0$, $\lambda \in \mathbb{R}$ and $k > \lambda/\delta$, and write $\varphi(x) = G(x^{3/2})/(x\sqrt{2\pi})$. Denote by $d^k x$ integration with respect to $x_1, \dots, x_k$. Then the $k$-fold integral of $\varphi(x_1) \cdots \varphi(x_k)$, over the domain and with the exponential weights arising from the $k$-th step of the iteration of (3.1) below, tends to $0$ as $k \to \infty$.
Proof of Theorem 1.3. Applying (3.1) twice, we see that $F_1(x; \lambda)$ equals an explicit term plus a double integral in which $F_1$ appears with twice-shifted arguments, and repeating this $m - 2$ more times leads to the $m$-fold analogue. From Lemma 3.1 we see that for any $\epsilon > 0$ we can choose $m = m(\epsilon)$ such that the remainder term containing $F_1$ is bounded by $\epsilon$, where we have used that $F_1 \le 1$. Hence (3.2) follows from (3.9) and Lemma 3.1. □
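To make the iteration concrete, here is our rendering of its first two steps, assuming only (3.1) together with the boundary condition $F_1(x;\lambda) \to 1$ as $x \to \infty$ (which holds since $F_1(\,\cdot\,;\lambda)$ is a distribution function); the paper's own displays are more precise.

```latex
% First step: integrate (3.1) over (x,\infty), using F_1(\infty;\lambda)=1:
F_1(x;\lambda) \;=\; 1 \;-\; \int_x^\infty \Phi(x_1;\lambda)\, F_1(x_1;\lambda - x_1)\, dx_1 .
% Second step: substitute the same identity for F_1(x_1;\lambda-x_1):
F_1(x;\lambda) \;=\; 1 \;-\; \int_x^\infty \Phi(x_1;\lambda)\, dx_1
  \;+\; \int_x^\infty\!\!\int_{x_1}^\infty \Phi(x_1;\lambda)\,\Phi(x_2;\lambda-x_1)\,
        F_1(x_2;\lambda-x_1-x_2)\, dx_2\, dx_1 .
```

Each substitution shifts the second argument of $F_1$ down by a further $x_i > x > 0$; after $m$ steps one obtains an alternating sum of nested integrals of products of $\Phi$'s plus a remainder still containing $F_1$, and it is this remainder that Lemma 3.1 (with $F_1 \le 1$) shows to vanish as $m \to \infty$.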

Discussion
We end the paper by mentioning a possibly useful extension of our results. Recall that the surplus of a connected component $\mathcal{C}$ is equal to the number of edges in $\mathcal{C}$ minus the number of vertices plus one, so that the surplus of a tree equals zero. There has been considerable interest in the surplus of the connected components of the Erdős–Rényi random graph (see e.g. [1,3,4] and the references therein). For example, a joint scaling limit for the orders and surpluses of the largest components is established in [1]. With $\sigma_n(k)$ denoting the surplus of $\mathcal{C}_k$, our local limit theorem can be extended to the joint distribution of the orders and surpluses of the $k$ largest components, where the error term $o(1)$ now is uniform in $\sigma_1, \dots, \sigma_k$ and in $x_1, \dots, x_k$ satisfying $b \ge x_1 \ge \dots \ge x_k \ge a$ for some $0 < a < b$, and where the role of $\Phi$ is played by its surplus-refined analogue
$$\Phi^{(\sigma)}(x; \lambda) := \frac{\gamma_{\sigma-1}\, x^{3(\sigma-1)/2}}{x\sqrt{2\pi}}\; e^{-\lambda^3/6 + (\lambda - x)^3/6}, \qquad (4.4)$$
so that $\sum_{\sigma \ge 0} \Phi^{(\sigma)}(x; \lambda) = \Phi(x; \lambda)$, by the definition of $G$.
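To make the notion of surplus concrete, here is a small illustration (ours, not from the paper) that computes the order and surplus of every component of a graph given as an edge list, using surplus = #edges - #vertices + 1.

```python
from collections import defaultdict

def component_surpluses(n, edges):
    """Return sorted (order, surplus) pairs, one per connected component,
    where surplus = #edges - #vertices + 1 (so trees have surplus 0)."""
    parent = list(range(n))

    def find(x):                          # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v in edges:
        parent[find(u)] = find(v)

    verts = defaultdict(int)
    edgecount = defaultdict(int)
    for v in range(n):
        verts[find(v)] += 1               # vertices per component root
    for u, v in edges:
        edgecount[find(u)] += 1           # edges per component root
    return sorted((verts[r], edgecount[r] - verts[r] + 1) for r in verts)

# A triangle (surplus 1) and a path on 3 vertices (a tree: surplus 0):
print(component_surpluses(6, [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5)]))
# → [(3, 0), (3, 1)]
```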