Abstract
Nonlinear regression models are a standard tool for modelling real phenomena, with applications in machine learning, ecology, econometrics, and other fields. Estimating the parameters of these models has attracted considerable attention over many years. We focus here on a recursive method for estimating the parameters of nonlinear regressions. Indeed, methods of this kind, whose most famous instances are probably the stochastic gradient algorithm and its averaged version, make it possible to deal efficiently with massive data arriving sequentially. Nevertheless, they can be non-robust in practice when the eigenvalues of the Hessian of the functional to be minimized are at different scales. To avoid this problem, we first focus on an online Stochastic Gauss-Newton algorithm. In order to improve the behaviour of the estimates in the case of a poor initialization, we then introduce a new Averaged Stochastic Gauss-Newton algorithm and prove its asymptotic efficiency.
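To illustrate the general idea of a recursive (online) Gauss-Newton-type update with averaging for a nonlinear regression, here is a minimal sketch. The toy exponential-saturation model, the initialization, and the Sherman-Morrison recursion for the inverse Gauss-Newton matrix are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Hypothetical toy model: f(x, theta) = theta[0] * (1 - exp(-theta[1] * x)).
# This is an illustrative nonlinear regression, not the model studied in the paper.
def f(x, theta):
    return theta[0] * (1.0 - np.exp(-theta[1] * x))

def grad_f(x, theta):
    e = np.exp(-theta[1] * x)
    return np.array([1.0 - e, theta[0] * x * e])

rng = np.random.default_rng(0)
theta_star = np.array([2.0, 0.5])   # true parameters (for simulating data)
theta = np.array([1.0, 0.2])        # initial guess
theta_bar = theta.copy()            # averaged estimate, updated recursively
H_inv = 100.0 * np.eye(2)           # inverse of a Gauss-Newton matrix estimate

for n in range(1, 10_001):
    # streaming observation y = f(x, theta_star) + noise
    x = rng.uniform(0.0, 10.0)
    y = f(x, theta_star) + rng.normal(scale=0.1)

    g = grad_f(x, theta)
    # Sherman-Morrison update of the inverse Gauss-Newton matrix:
    # H_n = H_{n-1} + g g^T, updated directly on the inverse
    Hg = H_inv @ g
    H_inv -= np.outer(Hg, Hg) / (1.0 + g @ Hg)

    # Gauss-Newton-type stochastic step, then recursive averaging
    theta = theta + H_inv @ g * (y - f(x, theta))
    theta_bar = theta_bar + (theta - theta_bar) / (n + 1)

print(theta_bar)  # should approach theta_star as observations accumulate
```

Because the Gauss-Newton matrix grows roughly linearly with the number of observations, its inverse provides a naturally decreasing, curvature-aware step size, which is what makes this type of recursion less sensitive than plain stochastic gradient when the Hessian's eigenvalues are at different scales.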
Funding Statement
The IMB receives support from the EIPHI Graduate School (contract ANR-17-EURE-0002).
Citation
Peggy Cénac. Antoine Godichon-Baggioni. Bruno Portier. "An efficient averaged stochastic Gauss-Newton algorithm for estimating parameters of nonlinear regressions models." Bernoulli 31 (1) 1 - 29, February 2025. https://doi.org/10.3150/23-BEJ1637