Open Access
On the interpretability of conditional probability estimates in the agnostic setting
Yihan Gao, Aditya Parameswaran, Jian Peng
Electron. J. Statist. 11(2): 5198-5231 (2017). DOI: 10.1214/17-EJS1376SI

Abstract

We study the interpretability of conditional probability estimates for binary classification under the agnostic setting. In this setting, conditional probability estimates do not necessarily reflect the true conditional probabilities. Instead, they satisfy a calibration property: among all data points for which the classifier predicts $\mathcal{P}(Y=1|X)=p$, a fraction $p$ actually have label $Y=1$. For cost-sensitive decision problems, this calibration property provides adequate justification for applying the Bayes Decision Rule. In this paper, we define a novel measure of the calibration property together with its empirical counterpart, and prove a uniform convergence result between them. This new measure enables us to formally justify the calibration property of conditional probability estimates. It also provides new insights into the problem of estimating and calibrating conditional probabilities, and allows us to reliably estimate the expected cost of decision rules when applied to an unlabeled dataset.
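The calibration property described above can be illustrated with a simple empirical check. The sketch below is not the measure proposed in the paper; it is a minimal illustration under assumed names (empirical_calibration_check, expected_cost) and synthetic data. It bins predicted probabilities, compares each bin's mean prediction with the empirical frequency of $Y=1$, and uses calibrated scores to estimate the expected cost of a cost-sensitive threshold rule on an unlabeled sample.

```python
import numpy as np

def empirical_calibration_check(p_hat, y, n_bins=10):
    """Bin predicted probabilities and compare each bin's mean prediction
    with the empirical frequency of Y = 1 in that bin."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(p_hat, bins) - 1, 0, n_bins - 1)
    rows = []
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            # (mean predicted probability, empirical frequency of Y=1, count)
            rows.append((p_hat[mask].mean(), y[mask].mean(), int(mask.sum())))
    return rows

def expected_cost(p_hat, c_fp, c_fn):
    """Estimated expected cost of the Bayes decision rule applied to
    calibrated scores on an unlabeled sample: predict 1 iff
    c_fn * p_hat >= c_fp * (1 - p_hat)."""
    predict_one = c_fn * p_hat >= c_fp * (1.0 - p_hat)
    # If p_hat is calibrated, the expected per-point cost is
    # c_fp * (1 - p) when predicting 1 and c_fn * p when predicting 0.
    cost = np.where(predict_one, c_fp * (1.0 - p_hat), c_fn * p_hat)
    return cost.mean()

# Toy usage with synthetic, roughly calibrated scores (hypothetical data).
rng = np.random.default_rng(0)
p_hat = rng.uniform(size=10_000)
y = (rng.uniform(size=10_000) < p_hat).astype(int)
print(empirical_calibration_check(p_hat, y, n_bins=5))
print(expected_cost(p_hat, c_fp=1.0, c_fn=5.0))
```

In this toy example the scores are calibrated by construction, so each bin's empirical frequency should be close to its mean predicted probability, and the cost estimate computed from the scores alone should approximate the cost that would be observed on labeled data.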

Citation


Yihan Gao, Aditya Parameswaran, Jian Peng. "On the interpretability of conditional probability estimates in the agnostic setting." Electron. J. Statist. 11(2): 5198-5231, 2017. https://doi.org/10.1214/17-EJS1376SI

Information

Received: 1 June 2017; Published: 2017
First available in Project Euclid: 15 December 2017

zbMATH: 06825044
MathSciNet: MR3738209
Digital Object Identifier: 10.1214/17-EJS1376SI
