Open Access
August 2009
Fast learning rates in statistical inference through aggregation
Jean-Yves Audibert
Ann. Statist. 37(4): 1591-1646 (August 2009). DOI: 10.1214/08-AOS623

Abstract

We develop minimax optimal risk bounds for the general learning task of predicting as well as the best function in a reference set $\mathcal{G}$, up to the smallest possible additive term, called the convergence rate. When the reference set is finite and n denotes the size of the training data, we provide minimax convergence rates of the form $C(\frac{\log|\mathcal{G}|}{n})^{v}$ with a tight evaluation of the positive constant C and with the exact exponent 0 < v ≤ 1, the latter value depending on the convexity of the loss function and on the level of noise in the output distribution.

The risk upper bounds are based on a sequential randomized algorithm which, at each step, concentrates on functions having both low risk and low variance with respect to the prediction function of the previous step. Our analysis puts forward the links between the probabilistic and worst-case viewpoints, and makes it possible to obtain risk bounds that are unachievable with the standard statistical learning approach. One of the key ideas of this work is to use probabilistic inequalities with respect to appropriate (Gibbs) distributions on the space of prediction functions, rather than with respect to the distribution generating the data.
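The Gibbs-distribution idea behind such sequential aggregation schemes can be illustrated with a minimal exponential-weights sketch: at each step the aggregate weights each reference function in proportion to the exponential of its (negative) cumulative past loss, so mass concentrates on low-risk functions. This is a simplified illustration of the general mechanism, not the paper's exact algorithm; the function name, the learning-rate parameter `eta`, and the loss-matrix input format are hypothetical choices made for this sketch.

```python
import math

def gibbs_weight_history(losses, eta):
    """Sequential exponential-weights (Gibbs) aggregation over a
    finite reference set.

    losses[t][g] is the loss incurred by reference function g on
    training example t.  Returns, for each step t, the Gibbs
    distribution over functions that would be used to predict at
    step t, i.e. weights proportional to exp(-eta * cumulative loss).
    """
    n_funcs = len(losses[0])
    cum = [0.0] * n_funcs          # cumulative loss of each function
    history = []
    for round_losses in losses:
        # Gibbs distribution on the reference set:
        # w_g ∝ exp(-eta * cumulative loss of g)
        raw = [math.exp(-eta * c) for c in cum]
        z = sum(raw)
        history.append([r / z for r in raw])
        # update cumulative losses after predicting
        for g, loss in enumerate(round_losses):
            cum[g] += loss
    return history
```

The first distribution is uniform (no data seen yet); as losses accumulate, the weights concentrate exponentially fast on the best-performing functions, which is the qualitative behavior the risk upper bounds quantify.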

The risk lower bounds are based on refinements of Assouad's lemma that take into account, in particular, the properties of the loss function. Our key example illustrating the upper and lower bounds is the Lq-regression setting, for which an exhaustive analysis of the convergence rates is given as q ranges over [1, +∞).

Citation

Download Citation

Jean-Yves Audibert. "Fast learning rates in statistical inference through aggregation." Ann. Statist. 37 (4) 1591 - 1646, August 2009. https://doi.org/10.1214/08-AOS623

Information

Published: August 2009
First available in Project Euclid: 18 June 2009

zbMATH: 1360.62167
MathSciNet: MR2533466
Digital Object Identifier: 10.1214/08-AOS623

Subjects:
Primary: 62G08
Secondary: 62H05 , 68T10

Keywords: aggregation , convex loss , excess risk , fast rates of convergence , L_q-regression , lower bounds in VC-classes , minimax lower bounds , statistical learning

Rights: Copyright © 2009 Institute of Mathematical Statistics
