Open Access
August 2016
Nonparametric stochastic approximation with large step-sizes
Aymeric Dieuleveut, Francis Bach
Ann. Statist. 44(4): 1363-1399 (August 2016). DOI: 10.1214/15-AOS1391


We consider the random-design least-squares regression problem within the reproducing kernel Hilbert space (RKHS) framework. Given a stream of independent and identically distributed input/output data, we aim to learn a regression function within an RKHS $\mathcal{H}$, even if the optimal predictor (i.e., the conditional expectation) is not in $\mathcal{H}$. In a stochastic approximation framework where the estimator is updated after each observation, we show that the averaged unregularized least-mean-square algorithm (a form of stochastic gradient descent), given a sufficiently large step-size, attains optimal rates of convergence for a variety of regimes for the smoothnesses of the optimal prediction function and the functions in $\mathcal{H}$. Our results apply as well in the usual finite-dimensional setting of parametric least-squares regression, showing adaptivity of our estimator to the spectral decay of the covariance matrix of the covariates.
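As a minimal illustration of the algorithm the abstract describes, the following sketch implements the averaged unregularized least-mean-squares recursion (constant step-size stochastic gradient descent with Polyak-Ruppert averaging) in the finite-dimensional parametric setting mentioned at the end of the abstract. The function name, step-size value, and data model below are illustrative choices, not taken from the paper.

```python
import numpy as np

def averaged_lms(X, y, gamma):
    """Averaged unregularized least-mean-squares (averaged SGD).

    One pass over the i.i.d. stream (x_n, y_n) with constant step-size gamma:
        theta_n     = theta_{n-1} - gamma * (x_n' theta_{n-1} - y_n) * x_n
        theta_bar_n = running average of theta_1, ..., theta_n
    Returns the averaged iterate, which is the estimator analyzed in the paper.
    """
    n, d = X.shape
    theta = np.zeros(d)        # current iterate, started at zero
    theta_bar = np.zeros(d)    # Polyak-Ruppert average of the iterates
    for i in range(n):
        x, yi = X[i], y[i]
        # unregularized stochastic gradient step on the squared loss
        theta = theta - gamma * (x @ theta - yi) * x
        # online running average: theta_bar_i = theta_bar_{i-1} + (theta_i - theta_bar_{i-1}) / i
        theta_bar += (theta - theta_bar) / (i + 1)
    return theta_bar

# Illustrative usage on synthetic well-specified linear data
rng = np.random.default_rng(0)
d = 5
theta_star = rng.normal(size=d)
X = rng.normal(size=(20000, d))
y = X @ theta_star + 0.1 * rng.normal(size=20000)
estimate = averaged_lms(X, y, gamma=0.05)
```

Note the step-size is constant ("large") rather than decaying; the averaging of the iterates is what yields the optimal convergence rates in the regimes studied in the paper.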




Received: 1 September 2014; Revised: 1 July 2015; Published: August 2016
First available in Project Euclid: 7 July 2016

zbMATH: 1346.60041
MathSciNet: MR3519927
Digital Object Identifier: 10.1214/15-AOS1391

Primary: 60K35

Keywords: reproducing kernel Hilbert space, stochastic approximation

Rights: Copyright © 2016 Institute of Mathematical Statistics
