Strong and Weighted Matchings in Inhomogeneous Random Graphs

We equip the edges of a deterministic graph $H$ with independent but not necessarily identically distributed weights and study a generalized version of matchings (i.e., sets of vertex-disjoint edges) in $H$ satisfying the property that the end-vertices of any two distinct edges are at least a minimum distance apart. We call such matchings strong matchings and determine bounds on the expectation and variance of the minimum weight of a maximum strong matching. Next, we consider an inhomogeneous random graph whose edge probabilities are not necessarily the same and determine bounds on the maximum size of a strong matching in terms of the averaged edge probability. We use local vertex neighbourhoods, the martingale difference method and iterative exploration techniques to obtain our desired estimates.


Introduction
Matchings in graphs are an important object of study from both theoretical and applied perspectives. One of the most well-studied aspects of matchings from the probabilistic perspective is that of minimum weight matchings [1]: given a complete bipartite graph on $n+n$ vertices with each edge equipped with an independent exponential weight, the problem is to determine the minimum weight $C(n)$ of a perfect matching. It is well known [10,11] that $\mathbb{E}C(n) = \sum_{k=1}^{n} \frac{1}{k^2}$, and later [12] obtained estimates for the expected minimum weight of a matching of given size. Recently, [7] studied minimum weight matchings in random graphs and used Talagrand's concentration inequalities to obtain the corresponding deviation estimates.
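As a quick numerical illustration of the identity $\mathbb{E}C(n) = \sum_{k=1}^{n} 1/k^2$, the following sketch estimates $\mathbb{E}C(n)$ by brute force on small instances (illustrative code, not part of the paper; the function name is ours):

```python
import itertools
import random

def min_matching_cost(n, rnd):
    """Minimum weight C(n) of a perfect matching in K_{n,n} with
    independent Exp(1) edge weights, by brute force over all n! matchings."""
    w = [[rnd.expovariate(1.0) for _ in range(n)] for _ in range(n)]
    return min(sum(w[i][p[i]] for i in range(n))
               for p in itertools.permutations(range(n)))

n, trials = 4, 10000
rnd = random.Random(0)
estimate = sum(min_matching_cost(n, rnd) for _ in range(trials)) / trials
exact = sum(1.0 / k**2 for k in range(1, n + 1))  # = E C(n)
print(round(estimate, 3), round(exact, 3))
```

With a few thousand trials the Monte Carlo estimate agrees with $\sum_{k=1}^{4} 1/k^2 \approx 1.424$ to within a percent or so.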
In the first part of our paper, we study the minimum weight of a strong matching, where the end-vertices of distinct edges are at a given minimum distance apart. We equip the edges of a deterministic graph $H$ with independent but not necessarily identically distributed weights and use local neighbourhood estimates to obtain bounds on the expectation and variance of the minimum weight of a maximum strong matching. We also determine sufficient conditions under which the minimum weight grows linearly with the strong matching number.

The second main result of our paper concerns the strong matching number of inhomogeneous random graphs, where the edge probabilities need not be the same. Matchings in homogeneous random graphs, where the edge probabilities are the same, have been studied extensively and bounds for the matching number are known for a wide range of the edge probability [4]. Any lower bound on the matching number for homogeneous graphs can be extended to inhomogeneous graphs whose edge probabilities are bounded from below, by the following monotonicity property: if $H_1 \subseteq H_2$ are two graphs, then a matching in $H_1$ is also a matching in $H_2$.

* Institute of Mathematical Sciences, HBNI, Chennai. E-mail: gganesan82@gmail.com
Induced matchings have been studied in [5] (see also the references therein) under the name of strong matchings, where estimates for the expected size of a maximum induced matching in homogeneous graphs with constant edge probability are obtained. Recently, [6] used a combination of the second moment method and concentration inequalities to estimate the largest possible size of induced matchings in homogeneous graphs and obtained deviation bounds for a wide range of edge probabilities. A main bottleneck in directly extending the above results to inhomogeneous graphs is that induced matchings do not satisfy the monotonicity property enjoyed by the ordinary matchings described in the previous paragraph. In this paper, we study generalized strong matchings in inhomogeneous random graphs and use the local neighbourhood bounds described earlier to estimate the strong matching number in terms of the averaged edge probabilities.
The paper is organized as follows: In Section 2, we define strong matchings in graphs and state and prove our first main result, Theorem 2.2, regarding the minimum weight of a strong matching in a graph equipped with inhomogeneous weights. Next, in Section 3, we state and prove Theorem 3.1 regarding the maximum size of a strong matching in inhomogeneous random graphs.

Weighted Strong Matchings
Let $H = (V, E)$ be any graph containing at least one edge. A path $\pi$ in $H$ is a sequence of edges $(e_1, \ldots, e_l)$ such that $e_i = (a_i, b_i)$ and $e_{i+1} = (a_{i+1}, b_{i+1})$ share a common endvertex $b_i = a_{i+1}$ for $1 \le i \le l-1$. The length of $\pi$ is $l$, the number of edges in $\pi$, and the vertices $a_1$ and $b_l$ are said to be connected by the path $\pi$. The distance between two vertices $a$ and $b$ is defined to be the minimum length of a path connecting $a$ and $b$.
A set of vertex-disjoint edges $W = \{e_1, \ldots, e_l\}$ in $H$ is said to be a matching of size $l$.

Definition 2.1. Let $W = \{e_1, \ldots, e_l\}$ be a matching of a graph $H$. For an integer $k \ge 0$, we say that $W$ is a $k$-strong matching if the following property holds for any pair of edges $e_i \neq e_j$: there does not exist a path $\pi$ in $H$ containing $l \le k$ edges that connects an endvertex of $e_i$ with an endvertex of $e_j$. The $k$-strong matching number $\nu_k(H)$ of the graph $H$ is the size of a maximum $k$-strong matching in $H$.
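As a concrete illustration of Definition 2.1, the following sketch checks the $k$-strong property by breadth-first search (illustrative code with our own naming conventions, not from the paper):

```python
from collections import deque

def within_distance(adj, src, k):
    """Set of vertices at graph distance at most k from src (BFS)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if dist[u] == k:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return set(dist)

def is_k_strong(adj, matching, k):
    """True if the vertex-disjoint edge set `matching` is k-strong: no path
    of at most k edges connects endvertices of two distinct matched edges."""
    for i, (a, b) in enumerate(matching):
        reach = within_distance(adj, a, k) | within_distance(adj, b, k)
        for j, (c, d) in enumerate(matching):
            if j != i and (c in reach or d in reach):
                return False
    return True

# Path graph 0-1-2-3-4-5: {(0,1), (4,5)} is 2-strong (the closest pair of
# endvertices, 1 and 4, is at distance 3) but not 3-strong.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
print(is_k_strong(path, [(0, 1), (4, 5)], 2),
      is_k_strong(path, [(0, 1), (4, 5)], 3))
```

Note that for $k = 0$ the check is vacuous once the edges are vertex-disjoint, matching the remark that $0$-strong matchings are ordinary matchings.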
For $k = 0$ the above definition reduces to the usual definition of a matching as described prior to Definition 2.1, and for $k = 1$ it coincides with the concept of induced/strong matchings described in the introduction. We retain the terminology strong matchings and introduce the parameter $k$ as a further generalization.
We now equip each edge of $H$ with a random weight and estimate the minimum weight of a strong matching. Let $w(e) \ge 0$ be the random weight assigned to the edge $e \in H$. The edge weights are independent but not necessarily identically distributed, and we define $M_k(H)$ to be the minimum weight of a maximum $k$-strong matching of $H$. We have the following properties regarding the mean and variance of the minimum weight $M_k(H)$. We say that an edge $e = (u, v)$ of a graph $H$ is isolated if no other edge in $H$ shares an endvertex with $e$.
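For small graphs, $\nu_k(H)$ and $M_k(H)$ can be computed by exhaustive search, which may help fix ideas (a brute-force sketch under our own notation; it rechecks the $k$-strong property via BFS distances):

```python
import itertools
from collections import deque

def graph_distance(adj, s, t):
    """BFS distance between s and t (infinity if disconnected)."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return dist[u]
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return float("inf")

def nu_and_min_weight(adj, edges, weight, k):
    """(nu_k(H), M_k(H)): size of a maximum k-strong matching and the
    minimum total weight over all maximum k-strong matchings."""
    for size in range(len(edges), 0, -1):
        weights = []
        for sub in itertools.combinations(edges, size):
            verts = [v for e in sub for v in e]
            if len(set(verts)) < 2 * size:      # not vertex disjoint
                continue
            if all(graph_distance(adj, x, y) > k
                   for e, f in itertools.combinations(sub, 2)
                   for x in e for y in f):
                weights.append(sum(weight[e] for e in sub))
        if weights:
            return size, min(weights)
    return 0, 0.0

# Path 0-1-2-3-4-5 with unit weights: nu_0 = 3 but nu_1 = 2
# (e.g. the induced matching {(0,1), (3,4)}).
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
weight = {e: 1.0 for e in edges}
print(nu_and_min_weight(adj, edges, weight, 0),
      nu_and_min_weight(adj, edges, weight, 1))
```

The exponential enumeration is of course only feasible on toy examples; it is meant to make the quantity $M_k(H)$ concrete.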
$$F_1(x) \;\le\; \mathbb{P}\left(w(e) \le x\right) \;\le\; F_2(x) \quad \text{for each } x > 0 \text{ and each edge } e, \qquad (2.3)$$
and where $C > 0$ is a positive constant that depends only on $\Delta$, $k$, $F_1$ and $F_2$.
We have the following remarks.
Remark 1: From (2.2) we see that the minimum weight of a strong matching is concentrated around its mean.
Remark 2: The bounded degree condition is important since, in the proof of (2.4) below, the constant $C$ is such that $C \longrightarrow 0$ as the maximum vertex degree $\Delta \longrightarrow \infty$.

Proof of Theorem 2.2
Proof of Theorem 2.2 (a): Let $W$ be any deterministic maximum $k$-strong matching of $H$ containing $\nu_k(H)$ edges and let $M_k(W) = \sum_{e \in W} w(e)$ be the weight of $W$. We then have that $\mathbb{E}M_k(H) \le \sum_{e \in W} \mathbb{E}w(e) \le \mu \cdot \nu_k(H)$, where $\mu$ is as in the statement of the Theorem. This proves the first bound in (2.2).
For the variance bound, we use the martingale difference method analogous to [9].
Let $e_1, \ldots, e_q$ be the edges of the graph $H$ and for $1 \le j \le q$, let $\mathcal{F}_j = \sigma\left(\{w(e_k)\}_{1 \le k \le j}\right)$ denote the sigma field generated by the weights of the edges $\{e_k\}_{1 \le k \le j}$. We define the martingale difference
$$\zeta_j := \mathbb{E}\left(M_k(H) \mid \mathcal{F}_j\right) - \mathbb{E}\left(M_k(H) \mid \mathcal{F}_{j-1}\right) = \mathbb{E}\left(Q(e_j) \mid \mathcal{F}_j\right), \qquad Q(e_j) := M_k(w(e_j)) - M_k(t(e_j)),$$
with the following terminology: the random variable $t(e_j)$ is an independent copy of $w(e_j)$ which is also independent of $\{w(e_l)\}_{1 \le l \neq j \le q}$, and $M_k(w(e_j))$ and $M_k(t(e_j))$ are the weights of the minimum weight maximum $k$-strong matchings $\mathcal{W}$ and $\mathcal{T}$ obtained respectively with edge weights $\{w(e_k)\}_{1 \le k \le q}$ and $\{w(e_k)\}_{1 \le k \neq j \le q} \cup \{t(e_j)\}$.
From the above paragraph we get that
$$\text{var}(M_k(H)) = \sum_{j=1}^{q} \mathbb{E}\zeta_j^2 \qquad (2.5)$$
and in what follows we estimate $|Q(e_j)|$. If the weight of the edge $e_j$ is decreased from $w(e_j)$ to $t(e_j) < w(e_j)$, then the corresponding minimum weight satisfies $M_k(t(e_j)) \le M_k(w(e_j))$ and the difference $M_k(w(e_j)) - M_k(t(e_j)) \le w(e_j) - t(e_j) \le w(e_j)$. Also, we have that $M_k(w(e_j)) - M_k(t(e_j))$ is non-zero if and only if $e_j \in \mathcal{T}$ or $e_j \in \mathcal{W}$. Arguing similarly for the case $t(e_j) > w(e_j)$, we get that
$$|Q(e_j)| \le w(e_j)\,\mathbb{1}(e_j \in \mathcal{T}) + t(e_j)\,\mathbb{1}(e_j \in \mathcal{W}), \qquad (2.7)$$
and taking conditional expectations with respect to the sigma field $\mathcal{F}_j$ we get
$$\mathbb{E}\left(w(e_j)\,\mathbb{1}(e_j \in \mathcal{T}) \mid \mathcal{F}_j\right) = w(e_j)\,\mathbb{E}\left(\mathbb{1}(e_j \in \mathcal{T}) \mid \mathcal{F}_j\right) = w(e_j)\,\mathbb{P}\left(e_j \in \mathcal{T} \mid \mathcal{F}_{j-1}\right) \qquad (2.9)$$
and
$$\mathbb{E}\left(t(e_j)\,\mathbb{1}(e_j \in \mathcal{W}) \mid \mathcal{F}_j\right) = \mathbb{1}(e_j \in \mathcal{W})\,\mathbb{E}t(e_j). \qquad (2.10)$$
The final equality in (2.9) is true since the random variable $w(e_j)$ is independent of the event $\{e_j \in \mathcal{T}\}$, which is determined by the edge weights $\{w(e_k)\}_{k \neq j} \cup \{t(e_j)\}$. Similarly the final relation in (2.10) follows from the fact that the event $\{e_j \in \mathcal{W}\} \in \mathcal{F}_j$ is independent of $t(e_j)$.
Using (2.7) together with (2.9) and (2.10), taking conditional expectations with respect to $\mathcal{F}_{j-1}$ and using the fact that $w(e_j)$ and $t(e_j)$ have the same distribution, we get that
$$\mathbb{E}\left(\zeta_j^2 \mid \mathcal{F}_{j-1}\right) \le 4\,\mathbb{E}w^2(e_j)\,\mathbb{P}\left(e_j \in \mathcal{T} \mid \mathcal{F}_{j-1}\right),$$
and taking expectations we get $\mathbb{E}(\zeta_j^2) \le 4\,\mathbb{E}w^2(e_j)\,\mathbb{P}(e_j \in \mathcal{T}) \le 4\mu_2\,\mathbb{P}(e_j \in \mathcal{T})$, where $\mu_2$ is as in the statement of the Theorem. Summing over $j$ and using (2.5) we then get
$$\text{var}(M_k(H)) \le 4\mu_2 \sum_j \mathbb{P}(e_j \in \mathcal{T}) = 4\mu_2\,\nu_k(H), \qquad (2.11)$$
since the expected number of edges in any maximum $k$-strong matching is $\nu_k(H)$.
Proof of Theorem 2.2 (b): We first show that there are positive constants $\gamma_1$ and $\gamma_2$ depending only on $\Delta$, $k$, $\mu$, $F_1$ and $F_2$ such that (2.12) holds; since $m \ge 1$, this obtains the desired lower bound in (2.4).
To prove (2.12), we use a combinatorial result (Lemma 3.2 in the Appendix) that obtains bounds for the strong matching number in terms of the size of local neighbourhoods. Say that $e \in H$ is a bad edge if its weight $w(e) \le \gamma$ for some constant $\gamma > 0$ to be determined later. Letting $T_L = \sum_e \mathbb{1}(w(e) \le \gamma)$ be the total number of bad edges, we have that (2.14) holds. Suppose now that the event $\left\{T_L \le \frac{3}{2}\, m \cdot F_2(\gamma)\right\}$ occurs and let $U$ be a maximum $k$-strong matching of minimum weight. There are at most $T_L$ bad edges in $U$ and so

Inhomogeneous Random Graphs
In this section, we study the strong matching number of random graphs obtained as follows. Let $K_n$ be the complete graph on $n$ vertices and let $\{X_e\}_{e \in K_n}$ be independent random variables indexed by the edge set of $K_n$ with distribution $\mathbb{P}(X_e = 1) = p(e) = 1 - \mathbb{P}(X_e = 0)$ for the edge $e$. Let $G$ be the random graph formed by the set of all edges $e$ satisfying $X_e = 1$. Because the edge probabilities $p(e)$ need not all be the same, we call $G$ an inhomogeneous random graph. If $p(e) = p$ for all $e$, then $G$ is said to be a homogeneous random graph with edge probability $p$.
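A minimal sampler for this model reads as follows (illustrative code; the interface `p(u, v)` for the edge probabilities is our own):

```python
import random

def sample_inhomogeneous_graph(n, p, rnd):
    """Sample G on vertices {0,...,n-1}: each edge (u, v) of K_n is kept
    independently with probability p(u, v)."""
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rnd.random() < p(u, v):
                adj[u].add(v)
                adj[v].add(u)
    return adj

# A homogeneous graph is the special case of a constant p; with p = 1
# every edge is kept and we recover the complete graph K_8.
G = sample_inhomogeneous_graph(8, lambda u, v: 1.0, random.Random(1))
print(all(len(G[v]) == 7 for v in G))
```

Any function of the pair $(u, v)$ can be plugged in for `p`, e.g. probabilities of the form $h(u,v)/n^{\beta}$ with bounded weights $h$, mimicking the parametrization in (3.2) (our reading of the notation).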
In words, we get that with high probability, i.e. with probability converging to one as $n \to \infty$, the maximum size of a $k$-strong matching grows as a power of $n$. Moreover, our bounds for the strong matching number are in terms of the averaged edge probability parameters $\delta_{av}$ and $\delta_{low}$. In particular, if $\delta_{low} = \delta_{av} = \delta$, then $\nu_k(G)$ is at least $D_1 \cdot n$ raised to the corresponding power with high probability, where we recall that $k_1$ is roughly equal to $k/2$. Extrapolating the results of [6] for induced matchings, where $k = 1$, we conjecture that $\nu_k(G)$ is in fact of the order of $\frac{n (\log n)^a}{n^{k(1-\beta+\delta)}}$ with high probability, for some constant $a > 0$.
The proof outline for Theorem 3.1 is as follows. We use local vertex neighbourhood bounds for the strong matching number obtained in Lemma 3.2 of Appendix together with variance estimates for weighted random graphs described in Theorem 2.2, to obtain our lower bound in Theorem 3.1. For the upper bound on the strong matching number in Theorem 3.1, we use iterative exploration techniques analogous to [3] to first estimate the size of local neighbourhoods of vertices. We then employ an upper bound for the strong matching number again based on local neighbourhoods, obtained in Lemma 3.2 of Appendix, to complete the proof of Theorem 3.1.

Proof of the lower bound in (3.4): We use the estimate (A.1) of Lemma 3.2, which states that
provided $G$ contains no isolated edge. Here $d_j(u)$ is the number of vertices at a distance at most $j$ from $u$. The expected number of neighbours of any vertex in $G$ is at least $\frac{\gamma_1 (n-1)}{n^{\beta - \delta_{low}}}$ and so, from the deviation estimate (A.4) of Lemma 3.3 in the Appendix, we see that each vertex has degree at least $\frac{\gamma_1}{2} \cdot n^{1-\beta+\delta_{low}}$ with probability at least $1 - e^{-2C n^{1-\beta+\delta_{low}}}$ for some constant $C > 0$. Therefore, if $E_{iso}$ denotes the event that $G$ contains no isolated edge, then by the union bound we have that $\mathbb{P}(E_{iso}) \ge 1 - n \cdot e^{-2C n^{1-\beta+\delta_{low}}} \ge 1 - e^{-C n^{1-\beta+\delta_{low}}}$ for some constant $C > 0$.
In what follows we find an upper bound for the denominator term $\sum_{u \in V} d_1(u) d_{k+1}(u)$ in (3.5). Letting $u \sim v$ denote that vertex $u$ is adjacent to vertex $v$, we get that the degree of $u$ equals $d_1(u) = \sum_{v \neq u} \mathbb{1}(u \sim v)$, where $\mathbb{1}(\cdot)$ is the indicator function. To compute $d_{k+1}(u)$, we use the fact that if a vertex $z$ is at a distance $l$ from $u$, then there is a path of length $l$ containing $u$ as an endvertex. Therefore
$$d_1(u)\,d_{k+1}(u) \;\le\; \sum_{j=1}^{k+1} \sum_{v, w_1, \ldots, w_j} f(v, u, w_1, \ldots, w_j), \qquad (3.8)$$
where $f(v, u, w_1, \ldots, w_j) = \mathbb{1}(v \sim u) \cdot \mathbb{1}(u \sim w_1) \cdots \mathbb{1}(w_{j-1} \sim w_j)$ is the product of the edge indicators along the walk $(v, u, w_1, \ldots, w_j)$.
Define a walk to be a sequence of edges $(e_1, \ldots, e_t)$ where $e_i$ and $e_{i+1}$ share a common endvertex for $1 \le i \le t-1$. The term $f_j = f(v, u, w_1, \ldots, w_j)$ is a product of exactly $j+1$ distinct terms if $v \neq w_1$, since in this case the walk $\mathcal{W} = (v, u, w_1, \ldots, w_j)$ contains the path $(u, w_1, \ldots, w_j)$ formed by $j$ edges and the additional edge $(v, u)$. The total number of vertices in the walk $\mathcal{W}$ is either $j+2$ or $j+1$, depending on whether $v \in \{w_2, \ldots, w_j\}$ or not. In any case, the expectation $\mathbb{E}f_j$ satisfies (3.11), where the first inequality in (3.11) is true by the AM-GM inequality. Substituting (3.11) into (3.8), we then get (3.12). The number of walks containing any edge $(u, v)$ and having $l \le j+2$ vertices is at most $C_1 \cdot n^j$ for some constant $C_1 > 0$, and so each edge of the complete graph $K_n$ is counted at most $C_1 \cdot n^j$ times in the inner double summation in (3.12). Thus (3.13) holds and, because $j+1 \le k+2$ and the edge probability "weights" satisfy $h(e) \ge \gamma_1$ (see (3.2)), we also get a corresponding bound with some positive constants $C_2, C_3$, by the condition (3.2). Therefore (3.14) holds for some constants $C_4, C_5 > 0$.
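The AM-GM step invoked for (3.11) can be spelled out as follows (a sketch of the standard inequality; the precise display (3.11) is not reproduced in this excerpt):

```latex
% For nonnegative reals a_1, ..., a_m, AM-GM applied to a_1^m, ..., a_m^m
% bounds a product of m terms by a power sum:
\[
\prod_{i=1}^{m} a_i
  \;=\; \Bigl(\prod_{i=1}^{m} a_i^{\,m}\Bigr)^{1/m}
  \;\le\; \frac{1}{m}\sum_{i=1}^{m} a_i^{\,m}.
\]
% Taking m = j+1 and a_1, ..., a_m to be the edge probabilities p(e) along
% the walk bounds \prod_{e} p(e) by \frac{1}{j+1}\sum_{e} p^{j+1}(e).
```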
Following an analogous analysis, we get that $V_k(n)$ satisfies (3.14) as well, and so
$$J_k(n) \le C_6 \cdot n^{(k+2)(1-\beta+\delta_{av})+1} \qquad (3.15)$$
for some constant $C_6 > 0$. By the Markov inequality, $\mathbb{P}\left(\sum_u d_1(u) d_{k+1}(u) \ge 2 J_k(n)\right) \le \frac{1}{2}$ and so, by (3.15), the sum $\sum_u d_1(u) d_{k+1}(u) \le 2 J_k(n) \le 2 C_6 \cdot n^{(k+2)(1-\beta+\delta_{av})+1}$ with probability at least $\frac{1}{2}$. Together with (3.6) and (3.7), we get from (3.5) that $\text{var}(\nu_k(G)) \le 4\,\mathbb{E}\nu_k(G)$.

We use an exploration technique analogous to [3]. Let $S_0 = \{1\}$ and for $i \ge 1$ let $S_i$ be the set of vertices that are at a distance $i$ from the vertex $1$. Given $S_0, S_1, \ldots, S_{i-1} = \{v_1, \ldots, v_L\}$, we would like to estimate the size of $S_i$. Let $T_0 = \bigcup_{j=0}^{i-1} S_j$ and define the sets $T_l$, $1 \le l \le L$, iteratively as follows. Let $N_1$ be the set of neighbours of $v_1$ in $\{1, 2, \ldots, n\} \setminus T_0$ and set $T_1 = T_0 \cup N_1$. Iteratively, let $N_j$ be the set of neighbours of $v_j$ in the set $\{1, 2, \ldots, n\} \setminus T_{j-1}$ and set $T_j = T_{j-1} \cup N_j$. Define
$$p_{low} := \frac{\gamma_1}{n^{\beta - \delta_{low}}} \quad \text{and} \quad p_{up} := \frac{\gamma_2}{n^{\beta - \delta_{up}}}, \qquad (3.19)$$
where the constants $\gamma_i$, $i = 1, 2$, and $\delta_{low}, \delta_{up}$ are as in (3.2). Each edge probability lies between $p_{low}$ and $p_{up}$ and so, if $E_j$ denotes the event in (3.20), then using the standard Binomial deviation estimate (A.4) in Lemma 3.3 of the Appendix with $\epsilon = \frac{1}{2}$, we get that $\mathbb{P}(E_j \mid T_{j-1}) \ge 1 - e^{-C(n - \#T_{j-1}) p_{low}}$ for some absolute constant $C > 0$ not depending on the choice of $T_{j-1}$; this yields (3.21). If $\bigcap_{l=1}^{j-1} E_l$ occurs, then $T_{j-1}$ has size at most $\#T_0 + 2 n p_{up} (j-1) \le \#T_0 + 2 n p_{up} L$, and so from (3.21), taking expectations and setting $j = L$, we obtain the corresponding estimate. If $\bigcap_{l=1}^{L} E_l$ occurs, then from (3.20) we see that the number of vertices $S_i$ at a distance $i$ from the vertex $1$ satisfies the stated bound for all $n$ large, since $i \le k_1$.
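The layer-by-layer exploration described above can be sketched as follows (an illustrative implementation; the running set `revealed` plays the role of the sets $T_j = T_{j-1} \cup N_j$ in the proof):

```python
def explore_layers(adj, root, depth):
    """Return [S_0, S_1, ..., S_depth], where S_i is the set of vertices
    at graph distance exactly i from `root`."""
    layers = [{root}]
    revealed = {root}                 # T_0: everything revealed so far
    for _ in range(depth):
        nxt = set()
        for v in layers[-1]:          # reveal neighbours of v_j outside T_{j-1}
            nxt |= set(adj[v]) - revealed
        revealed |= nxt
        layers.append(nxt)
    return layers

# Cycle C_6: from vertex 0 the layers are {0}, {1, 5}, {2, 4}, {3}.
n = 6
cycle = {v: [(v - 1) % n, (v + 1) % n] for v in range(n)}
print(explore_layers(cycle, 0, 3))
```

Only vertices outside the already-revealed set are added at each step, so each layer is determined by edges not examined before, which is what makes the conditional Binomial estimates in the proof applicable.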
Using the fact that $(n p_{up})^{k_1} \le C \cdot n^{k_1(1-\beta+\delta_{up})}$ for some constant $C > 0$ by (3.19), and that $k_1(1-\beta+\delta_{up}) < 1$ strictly (see (3.3)), we get that the final term in (3.25) is bounded above for all $n$ large, and so from (3.21) we obtain a lower bound on $\mathbb{P}\left(\bigcap_{l=1}^{L} E_l \mid T_0\right)$. Using the bound (3.26) iteratively, and because $L \le n$, we also obtain the corresponding estimate; in other words, the event $J_i$ occurs. From the estimate (3.28) we therefore get a bound with some constant $D > 0$ not dependent on the choice of $i$, and proceeding iteratively we obtain the desired estimate.

We prove the upper and lower bounds for $\nu_k(H)$ in Lemma 3.2 in that order.

Proof of (A.2) in Lemma 3.2: Let $F = \{(u_1, v_1), \ldots, (u_t, v_t)\}$, $t = \nu_k(H)$, be a maximum $k$-strong matching in $H$. Let $N_j(u)$ be the set of all vertices at a distance at most $j$ from $u$ in the graph $H$, and let $k_1$ be as defined in the statement of the Theorem. For any two vertices $u_i$ and $u_j$ belonging to distinct edges of the matching $F$, we must have that $N_{k_1}(u_i) \cap N_{k_1}(u_j) = \emptyset$; because otherwise, there would be a path of length at most $2k_1 \le k-1$ connecting $u_i$ and $u_j$. Therefore $t \cdot \min_u \#N_{k_1}(u) \le \#V(H) = n$ and this obtains (A.2).
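The key step in the proof of (A.2), that the balls $N_{k_1}(u_i)$ around one endvertex of each matched edge are pairwise disjoint, can be checked numerically on a small example (an illustrative sketch using our own example: the cycle $C_{15}$ with $k = 3$ and $k_1 = 1$):

```python
from collections import deque

def ball(adj, u, r):
    """N_r(u): the set of vertices at distance at most r from u (BFS)."""
    dist = {u: 0}
    queue = deque([u])
    while queue:
        x = queue.popleft()
        if dist[x] == r:
            continue
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                queue.append(y)
    return set(dist)

n = 15
cycle = {v: [(v - 1) % n, (v + 1) % n] for v in range(n)}
# {(0,1), (5,6), (10,11)} is a 3-strong matching of C_15: endvertices of
# distinct edges are at distance >= 4 > k = 3.
matching = [(0, 1), (5, 6), (10, 11)]
balls = [ball(cycle, u, 1) for (u, _) in matching]   # N_{k_1}(u_i), k_1 = 1
pairwise_disjoint = all(balls[i].isdisjoint(balls[j])
                        for i in range(len(balls))
                        for j in range(i + 1, len(balls)))
print(pairwise_disjoint,
      len(matching) * min(len(b) for b in balls) <= n)  # t * min #N_{k1}(u) <= n
```

Here each ball has 3 vertices and the three balls are disjoint, so $t \cdot \min_u \#N_{k_1}(u) = 9 \le 15 = n$, in line with (A.2).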
Proof of (A.1) in Lemma 3.2: Let $H_L$ be the line graph (pp. 71, [8]) of $H$. Also, throughout we use the following standard deviation estimate.