How do noise tails impact on deep ReLU networks?
Jianqing Fan, Yihong Gu, Wen-Xin Zhou
Ann. Statist. 52(4): 1845-1871 (August 2024). DOI: 10.1214/24-AOS2428

Abstract

This paper investigates the stability of deep ReLU neural networks for nonparametric regression under the assumption that the noise has only a finite pth moment. We unveil how the optimal rate of convergence depends on p, the degree of smoothness, and the intrinsic dimension in a class of nonparametric regression functions with hierarchical composition structure when both the adaptive Huber loss and deep ReLU neural networks are used. This optimal rate of convergence cannot be obtained by ordinary least squares, but can be achieved by the Huber loss with a properly chosen parameter that adapts to the sample size, smoothness, and moment parameters. A concentration inequality for the adaptive Huber ReLU neural network estimators with allowable optimization errors is also derived. To establish a matching lower bound within the class of neural network estimators using the Huber loss, we employ a strategy different from the traditional route: we construct a deep ReLU network estimator that has a better empirical loss than the true function, and the difference between these two functions furnishes a lower bound. This step is related to the Huberization bias, but more critically to the approximability of deep ReLU networks. As a result, we also contribute some new results on the approximation theory of deep ReLU neural networks.
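
As a minimal illustrative sketch (not the authors' estimator), the snippet below implements the Huber loss and evaluates the empirical Huber risk under heavy-tailed Student's t noise. The tau schedule shown is a placeholder assumption: the theoretically optimal choice, derived in the paper, depends jointly on the sample size, the smoothness, and the moment parameter p.

```python
import numpy as np

def huber_loss(residuals, tau):
    """Elementwise Huber loss: quadratic for |r| <= tau, linear beyond.

    tau is the robustification parameter; tau -> infinity recovers the
    squared loss, while small tau behaves like a scaled absolute loss.
    """
    abs_r = np.abs(residuals)
    return np.where(abs_r <= tau,
                    0.5 * abs_r ** 2,
                    tau * abs_r - 0.5 * tau ** 2)

def empirical_huber_risk(y, y_hat, tau):
    """Average Huber loss of predictions y_hat against targets y."""
    return huber_loss(y - y_hat, tau).mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1000
    # Heavy-tailed noise with only low-order finite moments (Student's t).
    y = rng.standard_t(df=2.5, size=n)
    y_hat = np.zeros(n)  # stand-in for the output of a fitted ReLU network
    # Placeholder schedule: tau growing with n. The paper derives the
    # optimal rate in terms of n, the smoothness, and p; this exponent
    # is purely for illustration.
    tau = n ** (1.0 / 2.5)
    print(empirical_huber_risk(y, y_hat, tau))
```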

Funding Statement

J. Fan’s research was supported by ONR Grants N00014-19-1-2120 and N00014-22-1-2340 and NSF Grants DMS-2052926, DMS-2053832 and DMS-2210833.
W.-X. Zhou was supported by the NSF Grant DMS-2401268.

Acknowledgments

The authors would like to thank three anonymous referees, an Associate Editor and the Editor for their constructive comments that improved the quality of this paper.

Citation

Jianqing Fan, Yihong Gu, Wen-Xin Zhou. "How do noise tails impact on deep ReLU networks?" Ann. Statist. 52(4): 1845–1871, August 2024. https://doi.org/10.1214/24-AOS2428

Information

Received: 1 December 2022; Revised: 1 November 2023; Published: August 2024
First available in Project Euclid: 3 October 2024

Digital Object Identifier: 10.1214/24-AOS2428

Subjects:
Primary: 62G08
Secondary: 62G35

Keywords: approximability of ReLU networks, composition of functions, heavy tails, optimal rates, robustness, truncation

Rights: Copyright © 2024 Institute of Mathematical Statistics
