Open Access
Simulation-Based Calibration Checking for Bayesian Computation: The Choice of Test Quantities Shapes Sensitivity
Martin Modrák, Angie H. Moon, Shinyoung Kim, Paul Bürkner, Niko Huurre, Kateřina Faltejsková, Andrew Gelman, Aki Vehtari
Bayesian Anal. Advance Publication 1-28 (2023). DOI: 10.1214/23-BA1404

Abstract

Simulation-based calibration checking (SBC) is a practical method for validating computationally derived posterior distributions or their approximations. In this paper, we introduce a new variant of SBC that alleviates several known problems. Our variant allows the user, in principle, to detect any possible issue with the posterior, whereas previously reported implementations could never detect large classes of problems, including the case where the posterior is equal to the prior. This is made possible by including additional data-dependent test quantities when running SBC. We argue and demonstrate that the joint likelihood of the data is an especially useful test quantity, and we investigate several other types of test quantities along with their theoretical and practical benefits. We provide a theoretical analysis of SBC, yielding a more complete understanding of the underlying statistical mechanisms. We also draw attention to a relatively common mistake in the literature and clarify the difference between SBC and checks based on the data-averaged posterior. We support our recommendations with numerical case studies on a multivariate normal example and on the implementation of an ordered simplex data type for use with Hamiltonian Monte Carlo. The SBC variant introduced in this paper is implemented in the SBC R package.
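
To make the core mechanics concrete, below is a minimal, self-contained sketch of SBC with a data-dependent test quantity, written in R (the language of the accompanying SBC package). It deliberately avoids the package API: the conjugate normal-normal model, the sample sizes, and all variable names are illustrative assumptions, chosen so that exact posterior draws are available and the rank computation can be read off directly.

    # Minimal SBC sketch with a data-dependent test quantity (illustrative only,
    # not the SBC package API). Model: mu ~ Normal(0, 1), y | mu ~ Normal(mu, sigma).
    set.seed(1)

    n_sims <- 1000   # number of SBC iterations
    n_post <- 99     # posterior draws per iteration, so ranks lie in 0..99
    n_obs  <- 10     # observations per simulated data set
    sigma  <- 1      # known observation sd

    ranks_mu     <- integer(n_sims)  # ranks for the parameter itself
    ranks_loglik <- integer(n_sims)  # ranks for the joint log likelihood

    for (s in seq_len(n_sims)) {
      mu_sim <- rnorm(1, 0, 1)               # draw parameter from the prior
      y      <- rnorm(n_obs, mu_sim, sigma)  # simulate data given the parameter

      # Exact conjugate posterior for mu given y
      post_prec <- 1 + n_obs / sigma^2
      post_mean <- (n_obs * mean(y) / sigma^2) / post_prec
      mu_post   <- rnorm(n_post, post_mean, sqrt(1 / post_prec))

      # Test quantity 1: the parameter itself
      ranks_mu[s] <- sum(mu_post < mu_sim)

      # Test quantity 2: joint log likelihood log p(y | mu), which depends on
      # both the data and the parameter
      loglik <- function(mu) sum(dnorm(y, mu, sigma, log = TRUE))
      ranks_loglik[s] <- sum(sapply(mu_post, loglik) < loglik(mu_sim))
    }

    # Under a correctly computed posterior, both rank histograms are uniform.
    hist(ranks_mu,     breaks = seq(-0.5, n_post + 0.5, 5), main = "mu")
    hist(ranks_loglik, breaks = seq(-0.5, n_post + 0.5, 5), main = "joint log lik")

Under a correct posterior, each rank is uniform on {0, ..., 99}, so departures from a flat rank histogram signal a problem; the joint log-likelihood rank is an example of the data-dependent test quantities the paper advocates. The SBC R package automates this workflow, including derived test quantities, for posteriors produced by actual samplers rather than conjugate formulas.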

Funding Statement

We thank Garud Iyengar and Henry Lam for helpful discussions on theory and proofs (Appendix A) and David Yao for bringing attention to stochastic ordering, which motivated delving into families of test quantities. We thank Feras Saad for alerting us that a previous version of this paper did not correctly reflect their contributions. This work was supported by the ELIXIR CZ research infrastructure project (Ministry of Education, Youth and Sports of the Czech Republic, Grant No. LM2023055), including access to computing and storage facilities; the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy, EXC 2075 - 390740016 (the Stuttgart Cluster of Excellence SimTech); the U.S. National Science Foundation, National Institutes of Health, and Office of Naval Research; and the Academy of Finland Flagship programme: Finnish Center for Artificial Intelligence.

Citation

Martin Modrák, Angie H. Moon, Shinyoung Kim, Paul Bürkner, Niko Huurre, Kateřina Faltejsková, Andrew Gelman, Aki Vehtari. "Simulation-Based Calibration Checking for Bayesian Computation: The Choice of Test Quantities Shapes Sensitivity." Bayesian Anal., Advance Publication, 1-28, 2023. https://doi.org/10.1214/23-BA1404

Information

Published: 2023
First available in Project Euclid: 23 November 2023

Digital Object Identifier: 10.1214/23-BA1404

Subjects: Primary: 62C10

Keywords: calibration, probabilistic programming, software testing
