Open Access
Brittleness of Bayesian inference under finite information in a continuous world
Houman Owhadi, Clint Scovel, Tim Sullivan
Electron. J. Statist. 9(1): 1-79 (2015). DOI: 10.1214/15-EJS989


We derive, in the classical framework of Bayesian sensitivity analysis, optimal lower and upper bounds on posterior values obtained from Bayesian models that exactly capture an arbitrarily large number of finite-dimensional marginals of the data-generating distribution and/or that are as close as desired to the data-generating distribution in the Prokhorov or total variation metrics. These bounds show that such models may still make the largest possible prediction error after conditioning on an arbitrarily large number of sample data measured at finite precision. The results are obtained through the development of a reduction calculus for optimization problems over measures on spaces of measures. We use this calculus to investigate the mechanisms that generate brittleness/robustness and, in particular, we observe that learning and robustness are antagonistic properties. It is now well understood that the numerical resolution of PDEs requires the satisfaction of specific stability conditions. Is there a missing stability condition for using Bayesian inference in a continuous world under finite information?
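The brittleness mechanism described above can be caricatured numerically: when the observed finite-precision datum has small probability under the baseline model, a prior perturbation of tiny total-variation size can swing the posterior mean across the entire parameter range. The following toy construction is our own illustrative sketch, not the paper's optimization calculus; the grid, the likelihood atom, and the perturbation size are all assumptions made for the example.

```python
import numpy as np

# Toy brittleness sketch (illustrative assumption, not the paper's construction):
# parameter theta on a grid over [0, 1]; the finite-precision observation has
# tiny probability eta under every baseline parameter value, but probability 1
# at one extra atom theta_star = 1.0 that the baseline prior does not charge.
N = 1000
theta = np.linspace(0.0, 1.0, N)      # parameter grid on [0, 1]
eps, eta = 1e-3, 1e-9                 # prior perturbation size, datum probability
k = N - 1                             # index of the atom theta_star = 1.0

lik = np.full(N, eta)                 # likelihood of the observed datum
lik[k] = 1.0

p = np.full(N, 1.0 / (N - 1))         # baseline prior: uniform, no mass at the atom
p[k] = 0.0
q = (1.0 - eps) * p                   # perturbed prior: move eps mass to the atom,
q[k] = eps                            # so the TV distance between p and q is eps

def posterior_mean(prior):
    """Posterior mean of theta after conditioning on the observed datum."""
    w = prior * lik
    return float(np.dot(theta, w / w.sum()))

m_p = posterior_mean(p)               # near 0.5: likelihood is flat on p's support
m_q = posterior_mean(q)               # near 1.0: eps * 1 dwarfs (1 - eps) * eta
```

A total-variation perturbation of size `eps = 1e-3` moves the posterior mean from roughly 0.5 to roughly 1.0, because conditioning divides by the small probability of the datum and thereby amplifies the perturbation by a factor of order `eps / eta`.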



Houman Owhadi. Clint Scovel. Tim Sullivan. "Brittleness of Bayesian inference under finite information in a continuous world." Electron. J. Statist. 9 (1) 1 - 79, 2015.


Published: 2015
First available in Project Euclid: 2 February 2015

zbMATH: 1305.62123
MathSciNet: MR3306570
Digital Object Identifier: 10.1214/15-EJS989

Primary: 62F15, 62G35
Secondary: 62A01, 62E20, 62F12, 62G20

Keywords: Bayesian inference, misspecification, optimal uncertainty quantification, robustness, uncertainty quantification

Rights: Copyright © 2015 The Institute of Mathematical Statistics and the Bernoulli Society

