Convergence rates for shallow neural networks learned by gradient descent
Alina Braun, Michael Kohler, Sophie Langer, Harro Walk
Bernoulli 30(1): 475-502 (February 2024). DOI: 10.3150/23-BEJ1605

Abstract

In this paper we analyze the L₂ error of neural network regression estimates with one hidden layer. Under the assumption that the Fourier transform of the regression function decays suitably fast, we show that an estimate, where all initial weights are chosen according to proper uniform distributions and where the weights are learned by gradient descent, achieves a rate of convergence of 1/√n (up to a logarithmic factor). Our statistical analysis implies that the key aspect behind this result is the proper choice of the initial inner weights and the adjustment of the outer weights via gradient descent. This indicates that we can also simply use linear least squares to choose the outer weights. We prove a corresponding theoretical result and compare our new linear least squares neural network estimate with standard neural network estimates via simulated data. Our simulations show that our theoretical considerations lead to an estimate with improved performance in many cases.
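To illustrate the idea behind the linear least squares variant described above, the following is a minimal sketch, not the authors' exact estimator: the inner weights and biases of a one-hidden-layer network are drawn from uniform distributions and held fixed, and only the outer weights are fit by ordinary least squares. The width K, the uniform ranges, the sigmoid activation, and the function names are illustrative assumptions, not the choices prescribed in the paper.

```python
import numpy as np

def fit_lls_network(X, y, K=50, inner_range=10.0, seed=None):
    """Sketch of a shallow network f(x) = sum_k c_k * sigmoid(a_k . x + b_k)
    with random (fixed) inner weights and least-squares outer weights.
    K and inner_range are illustrative, not the paper's prescriptions."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    A = rng.uniform(-inner_range, inner_range, size=(K, d))  # inner weights
    b = rng.uniform(-inner_range, inner_range, size=K)       # inner biases
    H = 1.0 / (1.0 + np.exp(-(X @ A.T + b)))                 # hidden-layer features
    c, *_ = np.linalg.lstsq(H, y, rcond=None)                # outer weights via OLS
    return A, b, c

def predict(A, b, c, X):
    H = 1.0 / (1.0 + np.exp(-(X @ A.T + b)))
    return H @ c

# Hypothetical usage: fit noisy samples of a smooth target function.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)
A, b, c = fit_lls_network(X, y, K=50, seed=0)
y_hat = predict(A, b, c, X)
```

Because the inner weights are fixed after initialization, fitting the outer weights is a convex least squares problem, which reflects the abstract's observation that gradient descent on the outer weights can be replaced by a direct linear least squares solve.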

Acknowledgements

The authors are grateful to the two anonymous referees and the Associate Editor Mark Podolskij for their constructive comments that improved the quality of this paper.

Citation

Alina Braun, Michael Kohler, Sophie Langer, Harro Walk. "Convergence rates for shallow neural networks learned by gradient descent." Bernoulli 30(1): 475–502, February 2024. https://doi.org/10.3150/23-BEJ1605

Information

Received: 1 December 2021; Published: February 2024
First available in Project Euclid: 8 November 2023

zbMATH: 07788892
MathSciNet: MR4665586
Digital Object Identifier: 10.3150/23-BEJ1605

Keywords: deep learning, gradient descent, neural networks, rate of convergence

JOURNAL ARTICLE
28 PAGES

