Beyond sigmoids: How to obtain well-calibrated probabilities from binary classifiers with beta calibration
Meelis Kull, Telmo M. Silva Filho, Peter Flach
Electron. J. Statist. 11(2): 5052-5080 (2017). DOI: 10.1214/17-EJS1338SI

Abstract

For optimal decision making under variable class distributions and misclassification costs, a classifier needs to produce well-calibrated estimates of the posterior probability. Isotonic calibration is a powerful non-parametric method that is, however, prone to overfitting on smaller datasets; hence a parametric method based on the logistic sigmoid curve is commonly used. While logistic calibration is designed for normally distributed per-class scores, we demonstrate experimentally that many classifiers, including Naive Bayes and AdaBoost, suffer from a particular distortion where these score distributions are heavily skewed. In such cases logistic calibration can easily yield probability estimates that are worse than the original scores. Moreover, the logistic curve family does not include the identity function, and hence logistic calibration can easily uncalibrate a perfectly calibrated classifier.

In this paper we solve all these problems with a richer class of parametric calibration maps based on the beta distribution. We derive the method from first principles and show that fitting it is as easy as fitting a logistic curve. Extensive experiments show that beta calibration is superior to logistic calibration for a wide range of classifiers: Naive Bayes, AdaBoost, random forest, logistic regression, support vector machine and multi-layer perceptron. If the original classifier is already calibrated, then beta calibration learns a function close to the identity. On this basis we build a statistical test to recognise whether the model deviates from being well-calibrated.
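The abstract notes that fitting a beta calibration map is as easy as fitting a logistic curve: the map can be fitted by logistic regression on the log-transformed scores. The sketch below illustrates this idea under stated assumptions — it uses the three-parameter map with features ln(s) and -ln(1-s), leaves the coefficients unconstrained (the paper discusses constrained variants that keep the map monotone), and the function names `fit_beta_calibration` and `beta_calibrate` are illustrative, not from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_beta_calibration(scores, labels, eps=1e-12):
    """Fit the 3-parameter beta calibration map via logistic regression.

    The map is mu(s) = 1 / (1 + 1 / (exp(c) * s^a / (1-s)^b)),
    which is linear in (ln s, -ln(1-s)) on the log-odds scale.
    """
    s = np.clip(scores, eps, 1 - eps)          # avoid log(0)
    X = np.column_stack([np.log(s), -np.log(1 - s)])
    lr = LogisticRegression(C=1e12)            # effectively unregularised fit
    lr.fit(X, labels)
    a, b = lr.coef_[0]
    c = lr.intercept_[0]
    return a, b, c

def beta_calibrate(scores, a, b, c, eps=1e-12):
    """Apply a fitted beta calibration map to raw classifier scores."""
    s = np.clip(scores, eps, 1 - eps)
    return 1.0 / (1.0 + 1.0 / (np.exp(c) * s**a / (1 - s)**b))
```

Note that when the input scores are already calibrated, the fit should land near a = b = 1, c = 0, i.e. close to the identity map — consistent with the abstract's claim that the family contains the identity function, unlike the logistic family.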

Citation

Meelis Kull, Telmo M. Silva Filho, Peter Flach. "Beyond sigmoids: How to obtain well-calibrated probabilities from binary classifiers with beta calibration." Electron. J. Statist. 11(2): 5052–5080, 2017. https://doi.org/10.1214/17-EJS1338SI

Information

Received: 1 June 2017; Published: 2017
First available in Project Euclid: 15 December 2017

zbMATH: 1384.62197
MathSciNet: MR3738205
Digital Object Identifier: 10.1214/17-EJS1338SI

Journal article, 29 pages
