Open Access
Cross-calibration of probabilistic forecasts
Christof Strähl, Johanna Ziegel
Electron. J. Statist. 11(1): 608-639 (2017). DOI: 10.1214/17-EJS1244

Abstract

When providing probabilistic forecasts for uncertain future events, it is common to strive for calibrated forecasts, that is, the predictive distribution should be compatible with the observed outcomes. Often, there are several competing forecasters of different skill. We extend common notions of calibration, where each forecaster is analyzed individually, to stronger notions of cross-calibration, where each forecaster is analyzed with respect to the other forecasters. In particular, cross-calibration distinguishes forecasters with respect to increasing information sets. We provide diagnostic tools and statistical tests to assess cross-calibration. The methods are illustrated in simulation examples and applied to probabilistic forecasts for inflation rates by the Bank of England. Computer code and supplementary material (Strähl and Ziegel, 2017a,b) are available online.
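For readers who want to experiment, the following minimal Python sketch illustrates the classical, individual notion of calibration that the paper strengthens: it checks uniformity of probability integral transform (PIT) values for two simulated forecasters with different information. It is not the paper's cross-calibration procedure; the simulated data, forecaster choices, and the Kolmogorov-Smirnov diagnostic are illustrative assumptions.

    # Minimal sketch (not the paper's method): checking calibration of a single
    # forecaster via the probability integral transform (PIT). If forecasts are
    # calibrated, the PIT values F_t(y_t) should be approximately uniform on [0, 1].
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Simulated setting: outcomes y_t = mu_t + eps_t with mu_t, eps_t ~ N(0, 1).
    n = 1000
    mu = rng.normal(size=n)
    y = mu + rng.normal(size=n)

    # "Ideal" forecaster knows mu_t and issues the true predictive distribution N(mu_t, 1).
    pit_ideal = stats.norm.cdf(y, loc=mu, scale=1.0)

    # "Climatological" forecaster ignores mu_t and always issues the marginal law N(0, 2).
    pit_clim = stats.norm.cdf(y, loc=0.0, scale=np.sqrt(2.0))

    # Kolmogorov-Smirnov test against the uniform distribution as a crude diagnostic.
    for name, pit in [("ideal", pit_ideal), ("climatological", pit_clim)]:
        ks = stats.kstest(pit, "uniform")
        print(f"{name:15s} KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")

In this toy setting both forecasters can pass the marginal uniformity check even though they use different information sets; distinguishing forecasters with respect to increasing information sets is exactly what the cross-calibration notions of the paper are designed to do.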

Citation


Christof Strähl, Johanna Ziegel. "Cross-calibration of probabilistic forecasts." Electron. J. Statist. 11(1): 608-639, 2017. https://doi.org/10.1214/17-EJS1244

Information

Received: 1 November 2016; Published: 2017
First available in Project Euclid: 3 March 2017

zbMATH: 1388.62276
MathSciNet: MR3619318
Digital Object Identifier: 10.1214/17-EJS1244

Keywords: Calibration, prediction space, predictive distribution, probability integral transform, proper scoring rule
