The Annals of Statistics
Ann. Statist., Volume 34, Number 5 (2006), 2326-2366.
Risk bounds for statistical learning
Pascal Massart and Élodie Nédélec
We propose a general theorem providing upper bounds for the risk of an empirical risk minimizer (ERM). We focus essentially on the binary classification framework. We extend Tsybakov’s analysis of the risk of an ERM under margin-type conditions by using concentration inequalities for conveniently weighted empirical processes. This allows us to deal with ways of measuring the “size” of a class of classifiers other than entropy with bracketing, as in Tsybakov’s work. In particular, we derive new risk bounds for the ERM when the classification rules belong to some VC-class under margin conditions, and we discuss the optimality of these bounds in a minimax sense.
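The abstract is purely theoretical, but the estimator it studies is easy to state concretely. In the simplest margin condition of the kind the abstract refers to (recalled here from the classification literature, not quoted from the paper), the regression function η(x) = P(Y = 1 | X = x) stays bounded away from 1/2, i.e., |2η(x) − 1| ≥ h for some margin h > 0. The sketch below is a minimal, hypothetical illustration of ERM for binary classification over threshold rules 1{x ≥ t}, a VC-class of dimension 1; the data-generating model, the margin parameter h and all names are assumptions made for this example, not taken from the paper.

```python
import numpy as np

# Hypothetical ERM example: threshold classifiers 1{x >= t} on the real line,
# a VC-class of dimension 1. Illustration only; not code from the paper.
rng = np.random.default_rng(0)

# Synthetic data: the Bayes classifier is 1{x >= 0}; labels flip away from it
# with probability (1 - h) / 2, so |2*eta(x) - 1| = h everywhere and the
# margin condition holds with margin parameter h.
n, h = 500, 0.4
X = rng.uniform(-1.0, 1.0, size=n)
bayes = (X >= 0.0).astype(int)
flip = rng.random(n) < (1.0 - h) / 2.0
y = np.where(flip, 1 - bayes, bayes)

def empirical_risk(t: float) -> float:
    """Fraction of training points misclassified by the rule 1{x >= t}."""
    return float(np.mean((X >= t).astype(int) != y))

# ERM: minimize the empirical risk over a grid of candidate thresholds.
thresholds = np.linspace(-1.0, 1.0, 201)
risks = [empirical_risk(t) for t in thresholds]
t_hat = thresholds[int(np.argmin(risks))]
print(f"ERM threshold: {t_hat:.3f}, empirical risk: {min(risks):.3f}")
```

Threshold rules are about the simplest VC-class available, so this is the easiest setting in which to read the paper’s VC-class risk bounds; the margin parameter h controls how far the noise level stays from the uninformative value 1/2.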
First available in Project Euclid: 23 January 2007
Permanent link to this document: https://projecteuclid.org/euclid.aos/1169571799
Digital Object Identifier: doi:10.1214/009053606000000786
Keywords: classification; concentration inequalities; empirical processes; entropy with bracketing; minimax estimation; model selection; pattern recognition; regression estimation; statistical learning; structural minimization of the risk; VC-class; VC-dimension
Massart, Pascal; Nédélec, Élodie. Risk bounds for statistical learning. Ann. Statist. 34 (2006), no. 5, 2326--2366. doi:10.1214/009053606000000786. https://projecteuclid.org/euclid.aos/1169571799.