Open Access
Higher order Langevin Monte Carlo algorithm
Sotirios Sabanis, Ying Zhang
Electron. J. Statist. 13(2): 3805-3850 (2019). DOI: 10.1214/19-EJS1615

Abstract

A new (unadjusted) Langevin Monte Carlo (LMC) algorithm with improved rates in total variation and in Wasserstein distance is presented. These results are obtained in the context of sampling from a target distribution $\pi$ that has a density $\hat{\pi}$ on $\mathbb{R}^{d}$ known up to a normalizing constant. Moreover, $-\log\hat{\pi}$ is assumed to have a locally Lipschitz gradient, and its third derivative is assumed to be locally Hölder continuous with exponent $\beta \in (0,1]$. Non-asymptotic bounds are obtained for the convergence to stationarity of the new sampling method, with convergence rate $1+\beta/2$ in Wasserstein distance, while the rate is shown to be $1$ in total variation even in the absence of convexity. Finally, in the case where $-\log \hat{\pi}$ is strongly convex and its gradient is Lipschitz continuous, explicit constants are provided.
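For context, the classical first-order unadjusted Langevin algorithm that this work improves upon discretizes the Langevin SDE as $\theta_{n+1} = \theta_n - \lambda \nabla U(\theta_n) + \sqrt{2\lambda}\,\xi_{n+1}$, where $U = -\log\hat{\pi}$, $\lambda$ is the step size, and the $\xi_n$ are i.i.d. standard Gaussian vectors. The sketch below illustrates only this first-order baseline, not the paper's higher order scheme (which exploits the extra smoothness of $U$ through higher order terms of the underlying stochastic Taylor expansion to attain the improved rates); the toy Gaussian target, step size, and iteration count are illustrative assumptions, not taken from the paper.

import numpy as np

def ula(grad_U, theta0, step, n_steps, rng=None):
    # First-order unadjusted Langevin algorithm (baseline sketch):
    #   theta <- theta - step * grad_U(theta) + sqrt(2 * step) * xi,
    # with xi ~ N(0, I) and U = -log(pi_hat) the potential.
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    samples = np.empty((n_steps, theta.size))
    for n in range(n_steps):
        xi = rng.standard_normal(theta.size)
        theta = theta - step * grad_U(theta) + np.sqrt(2.0 * step) * xi
        samples[n] = theta
    return samples

# Toy target pi_hat proportional to exp(-|theta|^2 / 2), so grad_U(theta) = theta;
# step size 1e-2 and 10_000 iterations are illustrative choices.
draws = ula(grad_U=lambda t: t, theta0=np.zeros(2), step=1e-2, n_steps=10_000)
print(draws[2000:].mean(axis=0), draws[2000:].var(axis=0))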

Citation


Sotirios Sabanis, Ying Zhang. "Higher order Langevin Monte Carlo algorithm." Electron. J. Statist. 13(2): 3805-3850 (2019). https://doi.org/10.1214/19-EJS1615

Information

Received: 1 November 2018; Published: 2019
First available in Project Euclid: 3 October 2019

zbMATH: 07113731
MathSciNet: MR4015336
Digital Object Identifier: 10.1214/19-EJS1615

Subjects:
Primary: 62L10, 65C05

Keywords: higher order algorithm, machine learning, Markov chain Monte Carlo, rate of convergence, sampling problem, super-linear coefficients

Vol. 13 • No. 2 • 2019