Open Access
February 2004 Process consistency for AdaBoost
Wenxin Jiang
Ann. Statist. 32(1): 13-29 (February 2004). DOI: 10.1214/aos/1079120128

Abstract

Recent experiments and theoretical studies show that AdaBoost can overfit in the limit of large time. If running the algorithm forever is suboptimal, a natural question is how low the prediction error can be during the process of AdaBoost. We show under general regularity conditions that during the process of AdaBoost a consistent prediction is generated, whose prediction error approaches the optimal Bayes error as the sample size increases. This result suggests that, while running the algorithm forever can be suboptimal, it is reasonable to expect that some regularization method via truncation of the process may lead to near-optimal performance for a sufficiently large sample size.
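The regularization-via-truncation idea described above can be sketched as ordinary AdaBoost stopped after a fixed number of rounds T instead of being run forever. The following is a minimal illustration, not the paper's construction: the decision-stump weak learner, the toy data, and the choice of T are all assumptions made for the example.

```python
# Hedged sketch: AdaBoost with 1-D decision stumps, truncated after T
# rounds (illustrating early stopping of the process; the stump learner,
# the data, and T are assumptions for this example, not from the paper).
import math

def stump_predict(x, threshold, sign):
    # sign = +1: predict +1 when x > threshold; sign = -1: the reverse.
    return sign if x > threshold else -sign

def best_stump(xs, ys, w):
    # Exhaustive search over midpoint thresholds for the lowest weighted error.
    best = None
    sorted_xs = sorted(xs)
    thresholds = [sorted_xs[0] - 1.0] + [
        (a + b) / 2 for a, b in zip(sorted_xs, sorted_xs[1:])
    ]
    for t in thresholds:
        for sign in (+1, -1):
            err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                      if stump_predict(xi, t, sign) != yi)
            if best is None or err < best[0]:
                best = (err, t, sign)
    return best  # (weighted error, threshold, sign)

def adaboost(xs, ys, T):
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, sign)
    for _ in range(T):  # truncation: stop after T rounds
        err, t, sign = best_stump(xs, ys, w)
        err = min(max(err, 1e-12), 1 - 1e-12)  # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, sign))
        # Reweight: misclassified points gain weight, then renormalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, t, sign))
             for xi, yi, wi in zip(xs, ys, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    # Sign of the alpha-weighted vote of the stumps accumulated so far.
    score = sum(a * stump_predict(x, t, s) for a, t, s in ensemble)
    return 1 if score >= 0 else -1
```

Here the stopping time T plays the role of the truncation point: the consistency result concerns choosing such a point (growing suitably with the sample size) rather than letting the number of rounds tend to infinity for a fixed sample.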

Citation

Download Citation

Wenxin Jiang. "Process consistency for AdaBoost." Ann. Statist. 32 (1) 13 - 29, February 2004. https://doi.org/10.1214/aos/1079120128

Information

Published: February 2004
First available in Project Euclid: 12 March 2004

zbMATH: 1105.62316
MathSciNet: MR2050999
Digital Object Identifier: 10.1214/aos/1079120128

Subjects:
Primary: 62G99
Secondary: 68T99

Keywords: AdaBoost , Bayes error , boosting , consistency , prediction error , VC dimension

Rights: Copyright © 2004 Institute of Mathematical Statistics

Journal article · 17 pages

