On the rate of convergence of a deep recurrent neural network estimate in a regression problem with dependent data
Michael Kohler, Adam Krzyżak
Bernoulli 29(2): 1663-1685 (May 2023). DOI: 10.3150/22-BEJ1516

Abstract

A regression problem with dependent data is considered. Regularity assumptions on the dependency of the data are introduced, and it is shown that under suitable structural assumptions on the regression function a deep recurrent neural network estimate is able to circumvent the curse of dimensionality.
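To make the setting concrete, the following is a minimal, purely illustrative sketch (not the authors' construction, and not their assumptions): covariates generated by a dependent AR(1) process, responses given by a smooth regression function plus noise, and a small Elman-style recurrent network fitted by least squares. The hidden width, nonlinearity, and the crude numerical-gradient training loop are all assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dependent data: an AR(1) covariate process and a smooth regression
# function m(x) = sin(x). Illustrative only; the paper treats a more
# general dependency structure and regression-function class.
n = 200
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal(scale=0.3)
y = np.sin(x) + rng.normal(scale=0.1, size=n)

H = 4  # hidden width (an arbitrary illustrative choice)

def unpack(theta):
    """Split a flat parameter vector into the network's weight arrays."""
    i = 0
    w_in = theta[i:i + H]; i += H
    W_rec = theta[i:i + H * H].reshape(H, H); i += H * H
    b = theta[i:i + H]; i += H
    w_out = theta[i:i + H]
    return w_in, W_rec, b, w_out

def predict(theta, x):
    """Elman-style recurrence: the hidden state carries past information."""
    w_in, W_rec, b, w_out = unpack(theta)
    h = np.zeros(H)
    out = np.empty(len(x))
    for t, xt in enumerate(x):
        h = np.tanh(w_in * xt + W_rec @ h + b)
        out[t] = w_out @ h
    return out

def loss(theta):
    """Empirical squared error of the recurrent estimate."""
    return np.mean((predict(theta, x) - y) ** 2)

# Crude central-difference gradient descent (chosen for clarity,
# not efficiency; any least-squares fitting procedure would do here).
theta = 0.1 * rng.standard_normal(H + H * H + H + H)
loss0 = loss(theta)
eps, lr = 1e-5, 0.05
for _ in range(300):
    g = np.zeros_like(theta)
    for j in range(len(theta)):
        e = np.zeros_like(theta)
        e[j] = eps
        g[j] = (loss(theta + e) - loss(theta - e)) / (2 * eps)
    theta -= lr * g

print(f"initial MSE: {loss0:.4f}, final MSE: {loss(theta):.4f}")
```

The recursive hidden state is what distinguishes a recurrent estimate from a standard feedforward one: the prediction at time t may depend on the whole past of the covariate process, which is the natural fit for the dependent-data setting the paper studies.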

Acknowledgements

We thank the Editors and the Reviewers for their helpful comments. The second author acknowledges support from the Natural Sciences and Engineering Research Council of Canada under grant RGPIN-2020-06793.

Citation


Michael Kohler, Adam Krzyżak. "On the rate of convergence of a deep recurrent neural network estimate in a regression problem with dependent data." Bernoulli 29(2), 1663–1685, May 2023. https://doi.org/10.3150/22-BEJ1516

Information

Received: 1 June 2021; Published: May 2023
First available in Project Euclid: 19 February 2023

MathSciNet: MR4550240
zbMATH: 07666835
Digital Object Identifier: 10.3150/22-BEJ1516

Keywords: curse of dimensionality, nonparametric regression estimation, rate of convergence, recurrent neural networks

Journal article, 23 pages

