Abstract
Generalized Bayesian inference replaces the likelihood in the Bayesian posterior with the exponential of a loss function connecting parameter values and observations. As a loss function, it is possible to use Scoring Rules (SRs), which evaluate the match between the observation and the probabilistic model for given parameter values. In this work, we leverage this Scoring Rule posterior for Bayesian Likelihood-Free Inference (LFI). In LFI, we can sample from the model but not evaluate the likelihood; hence, we use the Energy and Kernel SRs in the SR posterior, as they admit unbiased empirical estimates. While traditional Pseudo-Marginal (PM) Markov Chain Monte Carlo (MCMC) can be applied to the SR posterior, it mixes poorly for concentrated targets, such as those obtained with many observations. As such, we propose to use Stochastic Gradient (SG) MCMC, which improves performance over PM-MCMC and scales to higher-dimensional setups as it is rejection-free. SG-MCMC requires differentiating the simulator model; we achieve this effortlessly by implementing the simulator models using automatic differentiation libraries. We compare SG-MCMC sampling for the SR posterior with related LFI approaches and find that the former scales to larger sample sizes and works well on the raw data, while other methods require determining suitable summary statistics. On a chaotic dynamical system from meteorology, our method even allows inferring the parameters of a neural network used to parametrize a part of the update equations.
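To make concrete the unbiased empirical estimates mentioned above, the following is a minimal NumPy sketch of an unbiased estimate of the Energy Score from m simulator draws (the function name, argument names, and the exponent parameter `beta` are our own choices for illustration; the paper's general formulation should be consulted for details):

```python
import numpy as np

def energy_score_estimate(simulations, y, beta=1.0):
    """Unbiased estimate of the Energy Score S(P, y) from m simulator draws.

    simulations: (m, d) array of draws x_1, ..., x_m from the model
    y: (d,) observed data point
    beta: exponent in (0, 2); beta = 1 gives the standard energy distance form
    """
    m = simulations.shape[0]
    # First term: average distance between each simulation and the observation
    term1 = np.mean(np.linalg.norm(simulations - y, axis=1) ** beta)
    # Second term: average pairwise distance over distinct simulation pairs;
    # the diagonal of the pairwise matrix is zero, so summing and dividing
    # by m * (m - 1) averages over ordered pairs j != k
    diffs = simulations[:, None, :] - simulations[None, :, :]
    pairwise = np.linalg.norm(diffs, axis=-1) ** beta
    term2 = pairwise.sum() / (m * (m - 1))
    return term1 - 0.5 * term2
```

Because both terms are sample averages (over single draws and over distinct pairs, respectively), the estimate is unbiased in the simulator draws; replacing `np.linalg.norm` with a kernel evaluation yields the analogous estimate for the Kernel Score.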
Funding Statement
LP was supported by the EPSRC and MRC through the OxWaSP CDT programme (grant no. EP/L016710/1) during his PhD, which also funded part of the computational resources used to perform this work. LP is currently funded by US DARPA grant HR00112120007 (RECoG-AI). SK is supported by the EPSRC (grant no. EP/S023569/1). RD is funded by the EPSRC (grant nos. EP/V025899/1 and EP/T017112/1) and NERC (grant no. NE/T00973X/1).
Acknowledgments
The authors would like to thank the anonymous referees, an Associate Editor and the Editor for their constructive comments that improved the quality of this paper. We thank Jeremias Knoblauch, François-Xavier Briol, Takuo Matsubara, Geoff Nicholls, Benedict Leimkuhler and Sebastian Schmon for valuable feedback and suggestions on earlier versions of this work. We also thank Alex Shestopaloff for providing code for exact MCMC for the M/G/1 model. Lorenzo Pacchiardi conducted part of this work during his PhD studies at the Department of Statistics, University of Oxford. Sherman Khoo conducted part of this work during his MSc studies at the Department of Statistics, University of Warwick.
Citation
Lorenzo Pacchiardi, Sherman Khoo, Ritabrata Dutta. "Generalized Bayesian likelihood-free inference." Electron. J. Statist. 18(2), 3628–3686, 2024. https://doi.org/10.1214/24-EJS2283