Open Access
Asymptotic bias of stochastic gradient search
Vladislav B. Tadić, Arnaud Doucet
Ann. Appl. Probab. 27(6): 3255-3304 (December 2017). DOI: 10.1214/16-AAP1272

Abstract

The asymptotic behavior of the stochastic gradient algorithm with biased gradient estimates is analyzed. Relying on arguments from dynamical systems theory (chain-recurrence) and differential geometry (the Yomdin theorem and Łojasiewicz inequalities), upper bounds on the asymptotic bias of this algorithm are derived. The results hold under mild conditions and cover a broad class of algorithms used in machine learning, signal processing and statistics.
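A minimal numerical sketch of the phenomenon the paper studies (the objective and the form of the bias here are illustrative choices, not taken from the paper): minimizing f(x) = x² by stochastic gradient descent, where the gradient estimate carries a zero-mean noise term plus a constant bias b. The iterates then settle near the perturbed stationary point 2x + b = 0, i.e. x = −b/2, rather than the true minimizer x = 0, so the distance to the true minimizer is controlled by the size of the estimation bias.

```python
import random

def biased_sgd(x0, bias, step0=0.5, n_iters=20000, seed=0):
    """SGD on f(x) = x**2 with a biased, noisy gradient estimate.

    The true gradient is 2*x; the estimate adds a constant `bias`
    plus zero-mean Gaussian noise. Step sizes step0/n decrease, so
    the iterates converge, but to a point shifted by the bias.
    """
    rng = random.Random(seed)
    x = x0
    for n in range(1, n_iters + 1):
        noise = rng.gauss(0.0, 1.0)        # zero-mean stochastic error
        grad_est = 2.0 * x + bias + noise  # biased gradient estimate
        x -= (step0 / n) * grad_est        # decreasing step size
    return x

x_final = biased_sgd(x0=3.0, bias=1.0)
# x_final lies near -bias/2 = -0.5: the asymptotic bias of the
# iterates scales with the bias of the gradient estimator, while
# the unbiased case (bias=0.0) converges near the true minimizer 0.
```

With bias = 0 the same iteration converges to a neighborhood of the true minimizer, which is the classical unbiased setting; the paper's results quantify, in far greater generality, how the limit set degrades as the estimation bias grows.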

Citation


Vladislav B. Tadić, Arnaud Doucet. "Asymptotic bias of stochastic gradient search." Ann. Appl. Probab. 27 (6) 3255 - 3304, December 2017. https://doi.org/10.1214/16-AAP1272

Information

Received: 1 November 2015; Revised: 1 November 2016; Published: December 2017
First available in Project Euclid: 15 December 2017

zbMATH: 06848266
MathSciNet: MR3737925
Digital Object Identifier: 10.1214/16-AAP1272

Subjects:
Primary: 62L20
Secondary: 90C15 , 93E12 , 93E35

Keywords: biased gradient estimation, chain-recurrence, Łojasiewicz inequalities, stochastic gradient search, Yomdin theorem

Rights: Copyright © 2017 Institute of Mathematical Statistics
