Open Access
Efficient MCMC for Gibbs random fields using pre-computation
Aidan Boland, Nial Friel, Florian Maire
Electron. J. Statist. 12(2): 4138-4179 (2018). DOI: 10.1214/18-EJS1504

Abstract

Bayesian inference for Gibbs random fields (GRFs) is often referred to as a doubly intractable problem, since the normalizing constants of both the likelihood function and the posterior distribution are not available in closed form. The exploration of the posterior distribution of such models is typically carried out with a sophisticated Markov chain Monte Carlo (MCMC) method, the exchange algorithm [28], which requires simulation from the likelihood function at each iteration. The purpose of this paper is to consider an approach that dramatically reduces this computational overhead. To this end we introduce a novel class of algorithms which use realizations of the GRF model, simulated offline, at locations specified by a grid that spans the parameter space. This strategy dramatically speeds up posterior inference, as illustrated on several examples. However, using the pre-computed graphs introduces noise into the MCMC algorithm, which is no longer exact. We study the theoretical behaviour of the resulting approximate MCMC algorithm and derive convergence bounds using a recent theoretical development on approximate MCMC methods.
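For context, the posterior in this setting has the standard doubly intractable form for an exponential-family GRF (standard notation, not quoted from the paper):

$$
\pi(\theta \mid y) \;\propto\; \frac{\exp\{\theta^{\top} s(y)\}}{Z(\theta)}\, p(\theta),
\qquad
Z(\theta) = \sum_{x \in \mathcal{X}} \exp\{\theta^{\top} s(x)\},
$$

where $s(\cdot)$ is the vector of sufficient statistics and the sum defining $Z(\theta)$ runs over all possible configurations, so both $Z(\theta)$ and the marginal likelihood are intractable.

The sketch below illustrates one reading of the pre-computation idea described in the abstract: GRF realizations are simulated offline at grid points spanning the parameter space, and an exchange-type chain then recycles a pre-simulated statistic from the nearest grid point instead of simulating a fresh graph at every iteration. Everything here is illustrative (a scalar sufficient statistic, a placeholder sampler `simulate_grf`, a nearest-grid-point lookup); it is a sketch under these assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)


# --- Offline phase (illustrative): pre-simulate GRF statistics on a grid ---
def simulate_grf(theta):
    """Placeholder for a GRF sampler; returns the sufficient statistic of one draw."""
    # Stand-in draw: a scalar statistic whose mean increases with theta.
    return rng.normal(loc=5.0 * theta, scale=1.0)


theta_grid = np.linspace(-1.0, 1.0, 41)   # grid spanning the parameter space
n_per_point = 200                         # pre-simulated draws per grid point
bank = {t: np.array([simulate_grf(t) for _ in range(n_per_point)])
        for t in theta_grid}


# --- Online phase: exchange-style chain that reuses the pre-computed draws ---
def log_prior(theta):
    return -0.5 * theta ** 2              # N(0, 1) prior (assumed)


def noisy_exchange(s_obs, n_iter=5000, step=0.2):
    theta = 0.0
    chain = np.empty(n_iter)
    for i in range(n_iter):
        theta_prop = theta + step * rng.normal()
        # The exchange algorithm would simulate a fresh graph at theta_prop here;
        # instead, recycle a pre-simulated statistic from the nearest grid point.
        nearest = theta_grid[np.argmin(np.abs(theta_grid - theta_prop))]
        s_aux = rng.choice(bank[nearest])
        # Exponential-family exchange ratio with unnormalized log-likelihood theta * s.
        log_alpha = (log_prior(theta_prop) - log_prior(theta)
                     + (theta_prop - theta) * s_obs
                     + (theta - theta_prop) * s_aux)
        if np.log(rng.uniform()) < log_alpha:
            theta = theta_prop
        chain[i] = theta
    return chain


chain = noisy_exchange(s_obs=2.5)
print("posterior mean estimate:", chain.mean())
```

The approximation error of such a scheme depends on the grid resolution and on how many draws are stored per grid point; this is the kind of perturbation whose effect on the chain the paper's convergence bounds are designed to control.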

Citation


Aidan Boland, Nial Friel, Florian Maire. "Efficient MCMC for Gibbs random fields using pre-computation." Electron. J. Statist. 12(2): 4138-4179, 2018. https://doi.org/10.1214/18-EJS1504

Information

Received: 1 January 2018; Published: 2018
First available in Project Euclid: 13 December 2018

zbMATH: 07003239
MathSciNet: MR3890764
Digital Object Identifier: 10.1214/18-EJS1504

Keywords: exponential random graph models, Gibbs random fields, MCMC
