Deformed semicircle law and concentration of nonlinear random matrices for ultra-wide neural networks
Zhichao Wang, Yizhe Zhu
Ann. Appl. Probab. 34(2): 1896-1947 (April 2024). DOI: 10.1214/23-AAP2010

Abstract

In this paper, we investigate a two-layer fully connected neural network of the form $f(X)=\frac{1}{\sqrt{d_1}}\,a^\top \sigma(WX)$, where $X\in\mathbb{R}^{d_0\times n}$ is a deterministic data matrix, $W\in\mathbb{R}^{d_1\times d_0}$ and $a\in\mathbb{R}^{d_1}$ are random Gaussian weights, and $\sigma$ is a nonlinear activation function. We study the limiting spectral distributions of two empirical kernel matrices associated with $f(X)$: the empirical conjugate kernel (CK) and neural tangent kernel (NTK), beyond the linear-width regime ($d_1\asymp n$). We focus on the ultra-wide regime, where the width $d_1$ of the first layer is much larger than the sample size $n$. Under appropriate assumptions on $X$ and $\sigma$, a deformed semicircle law emerges as $d_1/n\to\infty$ and $n\to\infty$. We first prove this limiting law for generalized sample covariance matrices with some dependency. To specify it for our neural network model, we provide a nonlinear Hanson–Wright inequality suitable for neural networks with random weights and Lipschitz activation functions. We also demonstrate nonasymptotic concentrations of the empirical CK and NTK around their limiting kernels in the spectral norm, along with lower bounds on their smallest eigenvalues. As an application, we show that random feature regression induced by the empirical kernel achieves the same asymptotic performance as its limiting kernel regression under the ultra-wide regime. This allows us to calculate the asymptotic training and test errors for random feature regression using the corresponding kernel regression.
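The sketch below is a minimal numerical illustration of the objects in the abstract, not code from the paper. It assumes the standard definitions of the empirical CK, $\frac{1}{d_1}\sigma(WX)^\top\sigma(WX)$, and of the empirical NTK of this two-layer model (a Hadamard product term from the gradient in $W$ plus the CK from the gradient in $a$); the dimensions, random seed, and centered-ReLU activation are illustrative choices.

```python
import numpy as np

# Illustrative simulation of the empirical CK and NTK described above.
# All dimensions and the activation are assumptions, not taken from the paper.
rng = np.random.default_rng(0)
d0, d1, n = 200, 4000, 100            # ultra-wide: width d1 >> sample size n

X = rng.standard_normal((d0, n)) / np.sqrt(d0)   # data with roughly unit-norm columns
W = rng.standard_normal((d1, d0))                # first-layer Gaussian weights
a = rng.standard_normal(d1)                      # second-layer Gaussian weights

# Centered ReLU: E[sigma(xi)] = 0 for xi ~ N(0,1), since E[max(xi,0)] = 1/sqrt(2*pi).
sigma = lambda z: np.maximum(z, 0.0) - 1.0 / np.sqrt(2.0 * np.pi)
dsigma = lambda z: (z > 0).astype(float)         # a.e. derivative of ReLU

def empirical_ck(Wmat):
    """Empirical conjugate kernel (1/width) * sigma(W X)^T sigma(W X)."""
    Z = sigma(Wmat @ X)
    return Z.T @ Z / Wmat.shape[0]

CK = empirical_ck(W)                              # n x n empirical CK

# Empirical NTK of f(X) = a^T sigma(WX)/sqrt(d1): the gradient in W contributes
# a Hadamard product with X^T X; the gradient in a contributes the CK itself.
D = dsigma(W @ X)
NTK = CK + (D.T @ (a[:, None] ** 2 * D) / d1) * (X.T @ X)

# Crude Monte Carlo estimate of the expected kernel E[CK] from a wider fresh
# draw, then the centered, rescaled matrix sqrt(d1/n) * (CK - E[CK]), whose
# eigenvalue distribution should resemble a (deformed) semicircle law
# when d1/n is large.
Phi_hat = empirical_ck(rng.standard_normal((10 * d1, d0)))
eigs = np.linalg.eigvalsh(np.sqrt(d1 / n) * (CK - Phi_hat))
print("rescaled centered CK spectrum:", eigs.min(), eigs.max())
print("smallest eigenvalues: CK %.4f, NTK %.4f" %
      (np.linalg.eigvalsh(CK)[0], np.linalg.eigvalsh(NTK)[0]))
```

In the same spirit, ridge regression on the random features $\frac{1}{\sqrt{d_1}}\sigma(WX)$ could be compared numerically against kernel ridge regression with the estimated limiting kernel, mirroring the regression application stated in the abstract.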

Funding Statement

Z.W. is partially supported by NSF DMS-2055340 and NSF DMS-2154099. This material is based upon work supported by the National Science Foundation under Grant No. DMS-1928930 while Y.Z. was in residence at the Mathematical Sciences Research Institute in Berkeley, California, during the Fall 2021 semester for the program “Universality and Integrability in Random Matrix Theory and Interacting Particle Systems”. Y.Z. is partially supported by NSF-Simons Research Collaborations on the Mathematical and Scientific Foundations of Deep Learning.

Acknowledgments

Z.W. would like to thank Denny Wu for his valuable suggestions and comments. Both authors would like to thank Lucas Benigni, Ioana Dumitriu, and Kameron Decker Harris for their helpful discussion.

Citation

Zhichao Wang, Yizhe Zhu. "Deformed semicircle law and concentration of nonlinear random matrices for ultra-wide neural networks." Ann. Appl. Probab. 34(2): 1896–1947, April 2024. https://doi.org/10.1214/23-AAP2010

Information

Received: 1 October 2021; Revised: 1 April 2023; Published: April 2024
First available in Project Euclid: 3 April 2024

MathSciNet: MR4728160
Digital Object Identifier: 10.1214/23-AAP2010

Subjects:
Primary: 60B20, 68T07
Secondary: 62J07

Keywords: neural networks, neural tangent kernel, random feature regression, random matrix theory

Rights: Copyright © 2024 Institute of Mathematical Statistics


