Abstract
We propose a general class of randomized gradient estimates to be employed in a recursive search for the minimum of an unknown multivariate regression function. Only two observations per iteration step are used. Special cases include random direction stochastic approximation (Kushner and Clark), simultaneous perturbation stochastic approximation (Spall) and a special kernel-based stochastic approximation method (Polyak and Tsybakov). If the unknown regression function is p-smooth ($p\ge 2$) at the point of the minimum, these methods achieve the optimal rate of convergence $O(n^{-(p-1)/(2p)})$. For both the classical stochastic approximation scheme (Kiefer and Wolfowitz) and the averaging scheme (Ruppert and Polyak), the related asymptotic distributions are computed.
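To illustrate one of the special cases named above (Spall's simultaneous perturbation stochastic approximation), here is a minimal Python sketch of a two-observation recursive scheme. The gain and perturbation sequences, the Rademacher perturbation, and the test function are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def spsa_minimize(f, x0, n_iter=2000, a=0.1, c=0.1, alpha=0.602, gamma=0.101, rng=None):
    """Minimal SPSA-style sketch: two noisy observations of f per iteration.

    Assumed (not from the paper): the gain/perturbation constants a, c and the
    decay exponents alpha, gamma; a Rademacher (+-1) perturbation direction.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for n in range(1, n_iter + 1):
        a_n = a / n**alpha                              # step-size (gain) sequence
        c_n = c / n**gamma                              # perturbation-size sequence
        delta = rng.choice([-1.0, 1.0], size=x.shape)   # random perturbation direction
        y_plus = f(x + c_n * delta)                     # first observation
        y_minus = f(x - c_n * delta)                    # second observation
        g_hat = (y_plus - y_minus) / (2.0 * c_n) / delta  # randomized gradient estimate
        x = x - a_n * g_hat                             # recursive update
    return x

if __name__ == "__main__":
    # Usage: minimize a noisy quadratic (hypothetical example).
    rng = np.random.default_rng(0)
    noisy_quadratic = lambda x: np.sum((x - 1.0) ** 2) + 0.01 * rng.standard_normal()
    print(spsa_minimize(noisy_quadratic, x0=np.zeros(3), rng=rng))
```

Averaging the iterates (in the spirit of Ruppert and Polyak) would be a one-line addition, keeping a running mean of the successive values of x.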
Citation
Jürgen Dippon. "Accelerated randomized stochastic optimization." Ann. Statist. 31 (4), 1260–1281, August 2003. https://doi.org/10.1214/aos/1059655913