Open Access
October 2006
Risk bounds for statistical learning
Pascal Massart, Élodie Nédélec
Ann. Statist. 34(5): 2326-2366 (October 2006). DOI: 10.1214/009053606000000786

Abstract

We propose a general theorem providing upper bounds for the risk of an empirical risk minimizer (ERM). We essentially focus on the binary classification framework. We extend Tsybakov's analysis of the risk of an ERM under margin-type conditions by using concentration inequalities for conveniently weighted empirical processes. This allows us to deal with ways of measuring the "size" of a class of classifiers other than entropy with bracketing, as in Tsybakov's work. In particular, we derive new risk bounds for the ERM when the classification rules belong to some VC-class under margin conditions, and we discuss the optimality of these bounds in a minimax sense.
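For orientation, the display below sketches the margin condition and the type of risk bound the abstract refers to. It is an illustrative sketch, not the paper's precise theorem: the constant C, the exact logarithmic factor, and the notation (eta, s*, excess loss ell) are assumptions chosen to match standard margin-condition analyses.

% Sketch of the margin-condition setting (illustrative, not the paper's exact statement).
% Let \eta(x) = \mathbb{P}(Y = 1 \mid X = x), let s^* denote the Bayes classifier,
% and let \ell(s^*, s) = \mathbb{P}(Y \neq s(X)) - \mathbb{P}(Y \neq s^*(X)) be the excess risk.
% Margin (noise) condition with margin h \in (0, 1]:
\[
  |2\eta(x) - 1| \ge h \quad \text{for all } x .
\]
% For an ERM \hat{s} over a VC-class of dimension V built from n observations,
% bounds of the type obtained under such conditions take the form
% (C an absolute constant):
\[
  \mathbb{E}\,\ell(s^*, \hat{s}) \;\le\; C \,\frac{V}{n h}\left(1 + \log\frac{n h^2}{V}\right).
\]
% This interpolates between the fast rate V/n (when h is of order 1) and the
% slow, distribution-free rate \sqrt{V/n} (when h is of order \sqrt{V/n}).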

Citation


Pascal Massart, Élodie Nédélec. "Risk bounds for statistical learning." Ann. Statist. 34(5): 2326–2366, October 2006. https://doi.org/10.1214/009053606000000786

Information

Published: October 2006
First available in Project Euclid: 23 January 2007

zbMATH: 1108.62007
MathSciNet: MR2291502
Digital Object Identifier: 10.1214/009053606000000786

Subjects:
Primary: 60E15
Secondary: 60F10 , 94A17

Keywords: classification, concentration inequalities, empirical processes, entropy with bracketing, minimax estimation, model selection, pattern recognition, regression estimation, statistical learning, structural minimization of the risk, VC-class, VC-dimension

Rights: Copyright © 2006 Institute of Mathematical Statistics
