An efficient averaged stochastic Gauss-Newton algorithm for estimating parameters of nonlinear regressions models
Peggy Cénac, Antoine Godichon-Baggioni, Bruno Portier
Bernoulli 31(1): 1-29 (February 2025). DOI: 10.3150/23-BEJ1637

Abstract

Nonlinear regression models are a standard tool for modelling real phenomena, with applications in machine learning, ecology, econometrics, and many other fields. Estimating the parameters of these models has attracted considerable attention over the years. We focus here on recursive methods for estimating the parameters of nonlinear regressions. Such methods, whose best-known instances are probably the stochastic gradient algorithm and its averaged version, make it possible to deal efficiently with massive data arriving sequentially. Nevertheless, in practice they can be non-robust when the eigenvalues of the Hessian of the functional to be minimized are at different scales. To avoid this problem, we first focus on an online Stochastic Gauss-Newton algorithm. In order to improve the behaviour of the estimates in the case of a bad initialization, we then introduce a new Averaged Stochastic Gauss-Newton algorithm and prove its asymptotic efficiency.
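
To make the kind of recursion discussed here concrete, below is a minimal Python sketch, not the paper's exact recursion or tuning: an online stochastic Gauss-Newton update for a least-squares criterion, where the inverse of a recursive Gauss-Newton matrix is maintained by a Sherman-Morrison rank-one update and the iterates are combined by a plain Polyak-Ruppert average. The regression function model, its gradient, the identity initialization of the matrix, and the uniform averaging are illustrative assumptions.

import numpy as np

def model(x, theta):
    # Illustrative regression function (an assumption, not the paper's example):
    # f(x, theta) = theta[0] * (1 - exp(-theta[1] * x)).
    return theta[0] * (1.0 - np.exp(-theta[1] * x))

def model_grad(x, theta):
    # Gradient of f(x, theta) with respect to theta.
    e = np.exp(-theta[1] * x)
    return np.array([1.0 - e, theta[0] * x * e])

def averaged_stochastic_gauss_newton(xs, ys, theta0):
    # Online stochastic Gauss-Newton recursion with Polyak-Ruppert averaging
    # of the iterates (a simplified stand-in for the paper's averaged scheme).
    d = theta0.size
    theta = theta0.astype(float).copy()
    theta_bar = np.zeros(d)   # running average of the iterates
    S_inv = np.eye(d)         # inverse of S_n = I + sum_k g_k g_k^T
    for n, (x, y) in enumerate(zip(xs, ys), start=1):
        g = model_grad(x, theta)
        residual = model(x, theta) - y
        # Sherman-Morrison rank-one update keeps S_inv = (S_{n-1} + g g^T)^{-1}.
        Sg = S_inv @ g
        S_inv -= np.outer(Sg, Sg) / (1.0 + g @ Sg)
        # Gauss-Newton step on the squared-error loss; the ~1/n step-size
        # decay is implicit because S_inv shrinks like 1/n.
        theta = theta - S_inv @ (residual * g)
        # Recursive average: theta_bar_n = theta_bar_{n-1} + (theta_n - theta_bar_{n-1}) / n.
        theta_bar = theta_bar + (theta - theta_bar) / n
    return theta, theta_bar

# Synthetic usage example with a made-up ground truth.
rng = np.random.default_rng(0)
theta_star = np.array([2.0, 0.5])
xs = rng.uniform(0.0, 10.0, size=5000)
ys = model(xs, theta_star) + 0.1 * rng.standard_normal(xs.size)
theta_last, theta_avg = averaged_stochastic_gauss_newton(xs, ys, np.array([1.0, 1.0]))
print("last iterate:", theta_last)
print("averaged iterate:", theta_avg)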

Funding Statement

The IMB receives support from the EIPHI Graduate School (contract ANR-17-EURE-0002).

Citation


Peggy Cénac, Antoine Godichon-Baggioni, Bruno Portier. "An efficient averaged stochastic Gauss-Newton algorithm for estimating parameters of nonlinear regressions models." Bernoulli 31(1): 1-29, February 2025. https://doi.org/10.3150/23-BEJ1637

Information

Received: 1 June 2022; Published: February 2025
First available in Project Euclid: 30 October 2024

Digital Object Identifier: 10.3150/23-BEJ1637

Keywords: Nonlinear regression model, online estimation, stochastic Gauss-Newton algorithm, stochastic Newton algorithm, stochastic optimization
