2020 A stochastic version of Stein variational gradient descent for efficient sampling
Lei Li, Yingzhou Li, Jian-Guo Liu, Zibu Liu, Jianfeng Lu
Commun. Appl. Math. Comput. Sci. 15(1): 37-63 (2020). DOI: 10.2140/camcos.2020.15.37

Abstract

We propose in this work RBM-SVGD, a stochastic version of the Stein variational gradient descent (SVGD) method for efficiently sampling from a given probability measure, which is thus useful for Bayesian inference. The method applies the random batch method (RBM) of Jin et al. for interacting particle systems to the interacting particle system in SVGD. While preserving the behavior of SVGD, it reduces the computational cost, especially when the interaction kernel is long-range. We prove that the one-marginal distribution of the particles generated by this method converges in Wasserstein-2 distance to the one-marginal distribution of the interacting particle system on a fixed time interval [0, T]. Numerical examples verify the efficiency of this new version of SVGD.
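The idea described in the abstract can be sketched in a few lines: at each step the particles are shuffled into small random batches, and the usual SVGD update (kernel-weighted score plus kernel repulsion) is computed within each batch only, so the per-step cost drops from O(N²) to O(Np). The following is a minimal illustrative sketch, not the authors' implementation: the RBF kernel with median-heuristic bandwidth, the standard-Gaussian target in the usage note, and all names (`rbf_kernel`, `rbm_svgd_step`) are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(x):
    """RBF kernel matrix and its gradients within one batch (illustrative).

    x has shape (p, d). Returns K with K[j, i] = k(x_j, x_i) and
    gradK with gradK[j, i, :] = grad_{x_j} k(x_j, x_i).
    """
    diff = x[:, None, :] - x[None, :, :]          # (p, p, d)
    sq = np.sum(diff ** 2, axis=-1)               # pairwise squared distances
    h = np.median(sq) / np.log(x.shape[0] + 1) + 1e-12  # median heuristic
    K = np.exp(-sq / h)
    gradK = -2.0 / h * diff * K[:, :, None]
    return K, gradK

def rbm_svgd_step(x, grad_logp, batch_size, eps, rng):
    """One RBM-SVGD step: random batches, SVGD update inside each batch."""
    n = x.shape[0]
    perm = rng.permutation(n)
    for idx in np.array_split(perm, n // batch_size):
        xb = x[idx]
        K, gradK = rbf_kernel(xb)
        # phi_i = (1/p) sum_j [ k(x_j, x_i) grad log p(x_j)
        #                       + grad_{x_j} k(x_j, x_i) ]
        phi = (K.T @ grad_logp(xb) + gradK.sum(axis=0)) / len(idx)
        x[idx] = xb + eps * phi
    return x
```

As a usage example, sampling a standard Gaussian target amounts to `grad_logp = lambda z: -z` and iterating `rbm_svgd_step` over a particle cloud; the batches are resampled at every step, which is what makes the scheme a stochastic approximation of full SVGD rather than a fixed domain decomposition.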

Citation


Lei Li, Yingzhou Li, Jian-Guo Liu, Zibu Liu, Jianfeng Lu. "A stochastic version of Stein variational gradient descent for efficient sampling." Commun. Appl. Math. Comput. Sci. 15(1): 37-63, 2020. https://doi.org/10.2140/camcos.2020.15.37

Information

Received: 10 April 2019; Revised: 12 November 2019; Accepted: 14 December 2019; Published: 2020
First available in Project Euclid: 25 June 2020

zbMATH: 07224509
MathSciNet: MR4113783
Digital Object Identifier: 10.2140/camcos.2020.15.37

Subjects:
Primary: 62D05, 65C35

Keywords: KL divergence, MCMC, nonparametric variational inference, random batch method, RBM-SVGD, reproducing kernel Hilbert space

Rights: Copyright © 2020 Mathematical Sciences Publishers

27 PAGES

