Variational Hamiltonian Monte Carlo via Score Matching
Cheng Zhang, Babak Shahbaba, Hongkai Zhao
Bayesian Anal. 13(2): 485-506 (June 2018). DOI: 10.1214/17-BA1060


Traditionally, the field of computational Bayesian statistics has been divided into two main subfields: variational methods and Markov chain Monte Carlo (MCMC). In recent years, however, several methods have been proposed that combine variational Bayesian inference with MCMC simulation to improve overall accuracy and computational efficiency. This marriage of fast evaluation and flexible approximation provides a promising means of designing scalable Bayesian inference methods. In this paper, we explore the possibility of incorporating variational approximation into a state-of-the-art MCMC method, Hamiltonian Monte Carlo (HMC), to reduce the expensive computation required in the sampling procedure, which is the bottleneck for many applications of HMC to big data problems. To this end, we exploit the regularity in parameter space to construct a free-form approximation of the target distribution by a fast and flexible surrogate function: an optimized additive model over a suitable random basis, which can also be viewed as a single-hidden-layer feedforward neural network. The surrogate function provides sufficiently accurate approximation while allowing for fast computation in the sampling procedure, resulting in an efficient approximate Bayesian inference algorithm. We demonstrate the advantages of our proposed method using both synthetic and real data problems.
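The idea described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's implementation: it assumes a toy 2-D Gaussian target standing in for an expensive posterior, random tanh features as the single-hidden-layer basis, a linear least-squares fit of the output weights to observed gradients (a simple stand-in for the score-matching objective), and a plain leapfrog HMC step that uses the cheap surrogate gradient while correcting with the true density. All names (`grad_surrogate`, `hmc_step`, the feature sizes and step sizes) are made up for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: correlated 2-D Gaussian (stand-in for an expensive posterior).
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
P = np.linalg.inv(Sigma)

def log_p(x):        # log density up to an additive constant
    return -0.5 * x @ P @ x

def grad_log_p(x):   # the "expensive" score we want to avoid calling in leapfrog
    return -P @ x

# Surrogate: single-hidden-layer network with a fixed random basis.
D, J = 2, 200
A = rng.normal(size=(J, D))   # random hidden weights, kept fixed
b = rng.normal(size=J)        # random biases, kept fixed
dtanh = lambda z: 1.0 - np.tanh(z) ** 2

# Training set: points where the true gradient has already been evaluated
# (e.g. collected during an initial exploration phase).
X = rng.normal(size=(300, D)) @ np.linalg.cholesky(Sigma).T
G = np.array([grad_log_p(x) for x in X])          # (n, D) true scores

# Matching surrogate gradients to true scores is linear least squares in
# the output weights w:  min_w sum_i || sum_j w_j ∇φ_j(x_i) - ∇log p(x_i) ||².
Z = X @ A.T + b                                   # (n, J) pre-activations
Phi = (dtanh(Z)[:, None, :] * A.T[None, :, :])    # (n, D, J) basis gradients
w, *_ = np.linalg.lstsq(Phi.reshape(-1, J), G.reshape(-1), rcond=None)

def grad_surrogate(x):
    z = A @ x + b
    return (A * (dtanh(z) * w)[:, None]).sum(axis=0)

def hmc_step(x, eps=0.1, L=20):
    """One HMC transition: leapfrog with the cheap surrogate gradient,
    Metropolis accept/reject with the true log density."""
    p = rng.normal(size=D)
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * eps * grad_surrogate(x_new)
    for _ in range(L - 1):
        x_new += eps * p_new
        p_new += eps * grad_surrogate(x_new)
    x_new += eps * p_new
    p_new += 0.5 * eps * grad_surrogate(x_new)
    log_alpha = (log_p(x_new) - 0.5 * p_new @ p_new) - (log_p(x) - 0.5 * p @ p)
    return x_new if np.log(rng.uniform()) < log_alpha else x

x = np.zeros(D)
samples = np.array([x := hmc_step(x) for _ in range(2000)])
```

Note the design choice this sketch reflects: because the hidden-layer parameters are random and fixed, fitting the surrogate reduces to a single linear solve, so constructing the approximation is cheap, and each leapfrog step needs only the surrogate gradient rather than a pass over the full data.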




Published: June 2018
First available in Project Euclid: 25 July 2017

zbMATH: 06989957
MathSciNet: MR3780432
Digital Object Identifier: 10.1214/17-BA1060

Primary: 65C60
Secondary: 65C05

