Abstract
A regression problem with dependent data is considered. Regularity assumptions on the dependence of the data are introduced, and it is shown that, under suitable structural assumptions on the regression function, a deep recurrent neural network estimate is able to circumvent the curse of dimensionality.
Acknowledgements
We thank the Editors and the Reviewers for helpful comments. The second author would like to acknowledge the support from the Natural Sciences and Engineering Research Council of Canada under Grant RGPIN-2020-06793.
Citation
Michael Kohler, Adam Krzyżak. "On the rate of convergence of a deep recurrent neural network estimate in a regression problem with dependent data." Bernoulli 29(2), 1663–1685, May 2023. https://doi.org/10.3150/22-BEJ1516