We consider the problem of adaptation to the margin in binary classification. We suggest a penalized empirical risk minimization classifier that adaptively attains, up to a logarithmic factor, fast optimal rates of convergence for the excess risk, that is, rates that can be faster than n^{-1/2}, where n is the sample size. We show that our method also gives adaptive estimators for the problem of edge estimation.
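As a rough illustration of penalized empirical risk minimization, the sketch below fits a one-parameter linear classifier by minimizing an empirical surrogate loss plus a square-root-type penalty. Everything here is a simplified assumption for exposition: the hinge loss, the penalty weight `lam`, the grid search, and the one-dimensional rule are not the paper's construction, whose exact penalty and guarantees are given in the text.

```python
import math

def hinge_loss(margin):
    # convex surrogate for the 0-1 loss (an assumption; the paper's
    # loss and penalty are specified in the text itself)
    return max(0.0, 1.0 - margin)

def penalized_risk(theta, xs, ys, lam):
    # empirical risk of the rule sign(theta * x) plus a
    # square-root-type penalty on the coefficient magnitude
    emp = sum(hinge_loss(y * theta * x) for x, y in zip(xs, ys)) / len(xs)
    return emp + lam * math.sqrt(abs(theta))

def fit(xs, ys, lam):
    # crude grid search standing in for a real optimizer
    grid = [i / 100.0 for i in range(-300, 301)]
    return min(grid, key=lambda t: penalized_risk(t, xs, ys, lam))

# toy linearly separable sample
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [-1, -1, 1, 1]
theta_hat = fit(xs, ys, lam=0.1)
```

The penalty term shrinks the fitted coefficient toward zero, trading a little empirical risk for a simpler rule; the theoretical point of the paper is that a penalty of this square-root shape yields adaptation to the unknown margin parameter.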
"Square root penalty: Adaptation to the margin in classification and in edge estimation." Ann. Statist. 33 (3) 1203 - 1224, June 2005. https://doi.org/10.1214/009053604000001066