Abstract
Very sparse random graphs are known to typically be singular (i.e., have singular adjacency matrix) due to the presence of “low-degree dependencies” such as isolated vertices and pairs of degree-1 vertices with the same neighborhood. We prove that these kinds of dependencies are in some sense the only causes of singularity: for constants k ≥ 3 and λ > 0, an Erdős–Rényi random graph with n vertices and edge probability λ/n typically has the property that its k-core (its maximal subgraph with minimum degree at least k) is nonsingular. This resolves a conjecture of Vu from the 2014 International Congress of Mathematicians, and adds to a short list of known nonsingularity theorems for “extremely sparse” random matrices with density O(1/n). A key aspect of our proof is a technique to extract high-degree vertices and use them to “boost” the rank, starting from approximate rank bounds obtainable from (nonquantitative) spectral convergence machinery due to Bordenave, Lelarge, and Salez.
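The objects in the abstract are easy to experiment with: the k-core is computed by repeatedly peeling vertices of degree below k, and singularity of the adjacency matrix can be decided exactly over the rationals. The sketch below (a minimal illustration, not the paper's method; the sample sizes n = 60, λ = 8, k = 3 and all helper names are illustrative choices) samples an Erdős–Rényi graph, peels to its k-core, and checks whether the core's adjacency matrix has full rank. Since the theorem is asymptotic ("typically" as n → ∞), a single small sample only illustrates the statement.

```python
import random
from fractions import Fraction

def erdos_renyi(n, p, rng):
    """Sample G(n, p) as an adjacency-set dictionary."""
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def k_core(adj, k):
    """Repeatedly peel vertices with fewer than k surviving neighbors;
    the survivors form the k-core (possibly empty)."""
    alive = set(adj)
    changed = True
    while changed:
        changed = False
        for v in list(alive):
            if sum(1 for u in adj[v] if u in alive) < k:
                alive.discard(v)
                changed = True
    return alive

def exact_rank(matrix):
    """Rank over the rationals via Gaussian elimination with exact
    Fraction arithmetic (no floating-point rank tolerance issues)."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rank = 0
    rows = len(m)
    cols = len(m[0]) if m else 0
    for col in range(cols):
        pivot = next((r for r in range(rank, rows) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for r in range(rows):
            if r != rank and m[r][col] != 0:
                f = m[r][col] / m[rank][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# Sample G(n, lam/n), extract the k-core, and test nonsingularity.
rng = random.Random(0)
n, lam, k = 60, 8.0, 3
adj = erdos_renyi(n, lam / n, rng)
core = sorted(k_core(adj, k))
A = [[1 if u in adj[v] else 0 for u in core] for v in core]
print("core size:", len(core), "nonsingular:", exact_rank(A) == len(core))
```

The peeling order does not matter: the k-core is the unique maximal subgraph of minimum degree at least k, so any greedy removal of low-degree vertices reaches the same vertex set.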
Citation
Asaf Ferber, Matthew Kwan, Ashwin Sah, Mehtaab Sawhney. "Singularity of the k-core of a random graph." Duke Math. J. 172 (7), 1293–1332, 15 May 2023. https://doi.org/10.1215/00127094-2022-0060