Abstract
We consider the problem of $n$-class classification ($n\geq2$), where the classifier can choose to abstain from making predictions at a given cost, say, a factor $\alpha$ of the cost of misclassification. Our goal is to design consistent algorithms for such $n$-class classification problems with a ‘reject option’; while such algorithms are known for the binary ($n=2$) case, little has been understood for the general multiclass case. We show that the well-known Crammer-Singer surrogate and the one-vs-all hinge loss, albeit with a different predictor than the standard argmax, yield consistent algorithms for this problem when $\alpha=\frac{1}{2}$. More interestingly, we design a new convex surrogate, which we call the binary encoded predictions surrogate, that is also consistent for this problem when $\alpha=\frac{1}{2}$ and operates on a much lower dimensional space ($\log(n)$ as opposed to $n$). We also construct modified versions of all three of these surrogates that are consistent for any given $\alpha\in[0,\frac{1}{2}]$.
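For concreteness, the evaluation loss implicit in this setup can be written as follows; this is a sketch under the assumption that the misclassification cost is normalized to $1$ and $\bot$ denotes the abstain option:
$$
\ell_\alpha(y,\hat y)\;=\;\begin{cases}
0 & \text{if } \hat y = y,\\[2pt]
\alpha & \text{if } \hat y = \bot,\\[2pt]
1 & \text{otherwise,}
\end{cases}
$$
so that abstaining incurs a factor $\alpha$ of the unit cost of misclassification.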
Citation
Harish G. Ramaswamy, Ambuj Tewari, and Shivani Agarwal. "Consistent algorithms for multiclass classification with an abstain option." Electron. J. Statist. 12(1): 530–554, 2018. https://doi.org/10.1214/17-EJS1388