Subjective Bayesian testing using calibrated prior probabilities
Dan J. Spitzner
Braz. J. Probab. Stat. 33(4): 861-893 (November 2019). DOI: 10.1214/18-BJPS424

Abstract

This article proposes a calibration scheme for Bayesian testing that coordinates analytically derived statistical performance considerations with expert opinion. In other words, the scheme is effective and meaningful for incorporating objective elements into subjective Bayesian inference. The article explores a novel role for default priors as anchors for calibration rather than as substitutes for prior knowledge. Ideas are developed for use with multiplicity adjustments in multiple-model contexts, and to address the issue of prior sensitivity of Bayes factors. Along the way, the performance properties of an existing multiplicity adjustment related to the Poisson distribution are clarified theoretically. Connections of the overall calibration scheme to the Schwarz criterion are also explored. The proposed framework is examined and illustrated on a number of existing data sets related to problems in clinical trials, forensic pattern matching, and log-linear models methodology.
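The Schwarz criterion (BIC) connection mentioned above rests on the standard large-sample approximation 2 log BF₁₀ ≈ BIC₀ − BIC₁. The sketch below illustrates that approximation on hypothetical data for a simple nested comparison (mean zero versus free mean in a normal model); it is a generic illustration of the Schwarz approximation, not the calibration scheme developed in the article.

```python
import math

# Hypothetical data for illustration only.
x = [0.8, 1.2, -0.3, 1.5, 0.9, 1.1, 0.2, 1.4]
n = len(x)

def normal_loglik(data, mu, sigma2):
    """Log-likelihood of i.i.d. N(mu, sigma2) observations."""
    return sum(-0.5 * math.log(2 * math.pi * sigma2)
               - (xi - mu) ** 2 / (2 * sigma2) for xi in data)

# M0: mean fixed at 0; the variance MLE is the mean of squares (1 free parameter).
s2_0 = sum(xi ** 2 for xi in x) / n
bic0 = 1 * math.log(n) - 2 * normal_loglik(x, 0.0, s2_0)

# M1: mean and variance both free (2 free parameters).
mu_hat = sum(x) / n
s2_1 = sum((xi - mu_hat) ** 2 for xi in x) / n
bic1 = 2 * math.log(n) - 2 * normal_loglik(x, mu_hat, s2_1)

# Schwarz approximation to the Bayes factor: 2 * log BF_10 ~ BIC_0 - BIC_1.
bf10_approx = math.exp((bic0 - bic1) / 2)
print(f"approximate BF_10 = {bf10_approx:.2f}")
```

A BF₁₀ above 1 favors the free-mean model; the approximation sidesteps an explicit prior, which is precisely the prior-insensitivity property that motivates using such defaults as calibration anchors rather than replacements for elicited priors.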

Citation


Dan J. Spitzner. "Subjective Bayesian testing using calibrated prior probabilities." Braz. J. Probab. Stat. 33 (4) 861 - 893, November 2019. https://doi.org/10.1214/18-BJPS424

Information

Received: 1 March 2018; Accepted: 1 November 2018; Published: November 2019
First available in Project Euclid: 26 August 2019

zbMATH: 07120737
MathSciNet: MR3996320
Digital Object Identifier: 10.1214/18-BJPS424

Rights: Copyright © 2019 Brazilian Statistical Association

Journal article, 33 pages

