Abstract
In this paper, we extend the framework of the convergence of stochastic approximations. Such a procedure is used in many methods such as parameter estimation inside a Metropolis-Hastings algorithm, stochastic gradient descent or the stochastic Expectation-Maximization algorithm. It is given by
$$\theta_{n+1} = \theta_n + \Delta_{n+1} H_{\theta_n}(Z_{n+1}),$$
where $(Z_n)_n$ is a sequence of random variables following a parametric distribution which depends on $(\theta_n)_n$, and $(\Delta_n)_n$ is a step sequence. The convergence of such a stochastic approximation has already been proved under an assumption of geometric ergodicity of the Markov dynamic. However, in many practical situations this hypothesis is not satisfied, for instance for any heavy-tailed target distribution in a Monte Carlo Metropolis-Hastings algorithm. In this paper, we relax this hypothesis and prove the convergence of the stochastic approximation by only assuming a subgeometric ergodicity of the Markov dynamic. This result opens up the possibility of deriving more generic algorithms with proven convergence. As an example, we first study an adaptive Markov Chain Monte Carlo algorithm where the proposal distribution is adapted by learning the variance of a heavy-tailed target distribution. We then apply our work to Independent Component Analysis, where a positive heavy-tailed noise leads to a subgeometric dynamic in an Expectation-Maximization algorithm.
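To make the recursion concrete, the following minimal sketch (not the authors' algorithm; all names and constants are illustrative) runs a random-walk Metropolis-Hastings sampler on a heavy-tailed Student-t target and adapts the log proposal scale $\theta_n$ with a Robbins-Monro step, a simple instance of the stochastic approximation $\theta_{n+1} = \theta_n + \Delta_{n+1} H_{\theta_n}(Z_{n+1})$ in the adaptive MCMC setting described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(z, df=2.0):
    # Log-density (up to a constant) of a Student-t distribution: a heavy-tailed target.
    return -0.5 * (df + 1.0) * np.log1p(z * z / df)

def adaptive_mh(n_iter=20000, target_accept=0.44):
    z, theta = 0.0, 0.0          # current state Z_n and adapted parameter theta_n (log proposal scale)
    samples = np.empty(n_iter)
    for n in range(1, n_iter + 1):
        # Markov move: random-walk proposal with scale exp(theta_n), accepted or rejected.
        proposal = z + np.exp(theta) * rng.standard_normal()
        accept = np.log(rng.uniform()) < log_target(proposal) - log_target(z)
        if accept:
            z = proposal
        # Stochastic approximation step: H(theta_n, Z_{n+1}) = 1{accepted} - target_accept,
        # with step sequence Delta_n = n^{-0.7}.
        theta += n ** (-0.7) * (float(accept) - target_accept)
        samples[n - 1] = z
    return samples, np.exp(theta)

samples, adapted_scale = adaptive_mh()
print("adapted proposal std:", adapted_scale)
```

Because the target is heavy-tailed, the underlying Markov dynamic is only subgeometrically ergodic, which is exactly the regime covered by the convergence result of the paper.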
Funding Statement
This work has been partly funded by the European Research Council with grant 678304. It was also supported by a grant from the Paris Artificial Intelligence Research Institute (PRAIRIE, ANR19-P3IA-0001, 2019).
Citation
Vianney Debavelaere, Stanley Durrleman, Stéphanie Allassonnière. "On the convergence of stochastic approximations under a subgeometric ergodic Markov dynamic." Electron. J. Statist. 15 (1) 1583-1609, 2021. https://doi.org/10.1214/21-EJS1827