Unadjusted Langevin algorithm for sampling a mixture of weakly smooth potentials
Dao Nguyen
Braz. J. Probab. Stat. 36(3): 504-539 (September 2022). DOI: 10.1214/22-BJPS538

Abstract

Discretization of continuous-time diffusion processes is a widely recognized method for sampling. However, requiring the potential to be smooth (gradient Lipschitz) is a considerable restriction. This paper studies sampling via Euler discretization when the potential function is a mixture of weakly smooth distributions and satisfies a weakly dissipative condition. We establish convergence in Kullback–Leibler (KL) divergence, with the number of iterations needed to reach an ϵ-neighborhood of the target distribution depending only polynomially on the dimension. We relax the degenerate convexity at infinity condition of Erdogdu and Hosseinzadeh (2020) and prove convergence guarantees under a Poincaré inequality or non-strong convexity outside a ball. In addition, we provide convergence in the Lβ-Wasserstein metric for the smoothed potential.
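For orientation, the sampler analyzed here is the standard unadjusted Langevin algorithm, i.e. the Euler discretization of the Langevin diffusion dX_t = −∇U(X_t) dt + √2 dB_t. Below is a minimal sketch of that iteration; the function name `ula_sample`, the step size, and the Gaussian example potential are illustrative assumptions, not the paper's specific construction or tuned parameters.

```python
import numpy as np

def ula_sample(grad_U, x0, eta=1e-2, n_iters=10_000, rng=None):
    """Unadjusted Langevin algorithm (ULA): Euler discretization of the
    Langevin diffusion dX_t = -grad U(X_t) dt + sqrt(2) dB_t.

    Each step: x_{k+1} = x_k - eta * grad_U(x_k) + sqrt(2*eta) * xi_k,
    where xi_k is standard Gaussian noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for k in range(n_iters):
        noise = rng.standard_normal(x.size)
        x = x - eta * grad_U(x) + np.sqrt(2.0 * eta) * noise
        samples[k] = x
    return samples

# Example: sample a standard Gaussian, U(x) = ||x||^2 / 2, so grad U(x) = x.
if __name__ == "__main__":
    draws = ula_sample(grad_U=lambda x: x, x0=np.zeros(2), eta=1e-2, n_iters=5000)
    print(draws.mean(axis=0), draws.var(axis=0))  # roughly zero mean, unit variance
```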

Citation


Dao Nguyen. "Unadjusted Langevin algorithm for sampling a mixture of weakly smooth potentials." Braz. J. Probab. Stat. 36(3), 504–539, September 2022. https://doi.org/10.1214/22-BJPS538

Information

Received: 1 November 2021; Accepted: 1 June 2022; Published: September 2022
First available in Project Euclid: 26 September 2022

MathSciNet: MR4489179
zbMATH: 07599102
Digital Object Identifier: 10.1214/22-BJPS538

Keywords: Kullback–Leibler divergence, Langevin Monte Carlo, mixture weakly smooth, non-convex sampling

Rights: Copyright © 2022 Brazilian Statistical Association
