We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
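The idea of a local, empirical Rademacher average — signs are drawn at random and the supremum is taken only over functions whose empirical error is small — can be illustrated numerically. The following is a minimal sketch under toy assumptions: a finite class of threshold classifiers on the line, 0-1 empirical error, and a Monte Carlo estimate of the Rademacher average. All names (`local_rademacher`, the threshold grid, the sample sizes) are hypothetical choices for illustration, not constructions from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: labels given by a threshold classifier at 0.1 (an assumption
# made purely for this illustration).
n = 200
X = rng.uniform(-1.0, 1.0, size=n)
y = np.sign(X - 0.1)

# Finite class: threshold classifiers f_t(x) = sign(x - t) on a grid.
thresholds = np.linspace(-1.0, 1.0, 101)
# f_values[j, i] = prediction of classifier j on sample point i.
f_values = np.sign(X[None, :] - thresholds[:, None])

# Empirical 0-1 error of each classifier on the sample.
emp_err = (f_values != y[None, :]).mean(axis=1)

def local_rademacher(f_values, emp_err, r, n_mc=1000, seed=1):
    """Monte Carlo estimate of the empirical Rademacher average of the
    subclass of functions with empirical error at most r (the 'local'
    subset); r = 1 recovers the global average over the whole class."""
    local = f_values[emp_err <= r]
    if local.shape[0] == 0:
        return 0.0
    n = local.shape[1]
    sign_rng = np.random.default_rng(seed)  # fixed seed: same sigma draws
    total = 0.0
    for _ in range(n_mc):
        sigma = sign_rng.choice([-1.0, 1.0], size=n)  # Rademacher signs
        total += np.max(local @ sigma) / n            # sup over the subclass
    return total / n_mc

print("local (r = 0.05):", local_rademacher(f_values, emp_err, r=0.05))
print("global (r = 1):  ", local_rademacher(f_values, emp_err, r=1.0))
```

Because the local subclass is a subset of the full class, its supremum is no larger for every sign draw, so with the same sign sequence the local average never exceeds the global one — the mechanism behind the sharper, data-dependent rates in the paper.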
"Local Rademacher complexities." Ann. Statist. 33 (4) 1497 - 1537, August 2005. https://doi.org/10.1214/009053605000000282