Open Access
Approximation and learning by greedy algorithms
Andrew R. Barron, Albert Cohen, Wolfgang Dahmen, Ronald A. DeVore
Ann. Statist. 36(1): 64-94 (February 2008). DOI: 10.1214/009053607000000631

Abstract

We consider the problem of approximating a given element f from a Hilbert space $\mathcal{H}$ by means of greedy algorithms and the application of such procedures to the regression problem in statistical learning theory. We improve on the existing theory of convergence rates for both the orthogonal greedy algorithm and the relaxed greedy algorithm, as well as for the forward stepwise projection algorithm. For all these algorithms, we prove convergence results for a variety of function classes and not simply those that are related to the convex hull of the dictionary. We then show how these bounds for convergence rates lead to a new theory for the performance of greedy algorithms in learning. In particular, we build upon the results in [IEEE Trans. Inform. Theory 42 (1996) 2118–2132] to construct learning algorithms based on greedy approximations which are universally consistent and provide provable convergence rates for large classes of functions. The use of greedy algorithms in the context of learning is very appealing since it greatly reduces the computational burden when compared with standard model selection using general dictionaries.
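For readers unfamiliar with the orthogonal greedy algorithm referenced in the abstract, the following is a minimal sketch in finite dimensions using NumPy. It illustrates the generic procedure (greedy selection of the dictionary element most correlated with the residual, followed by orthogonal projection of f onto the span of all selected elements); the dictionary is assumed to be stored as rows of a matrix with unit-norm elements. This is only an illustrative sketch of the standard OGA, not the paper's specific estimators or learning algorithms.

```python
import numpy as np

def orthogonal_greedy(f, dictionary, n_steps):
    """Sketch of the orthogonal greedy algorithm (OGA) in R^d,
    standing in for a general Hilbert space H.

    f          : target vector of shape (d,)
    dictionary : array of shape (m, d); rows are unit-norm elements g_1..g_m
    n_steps    : number of greedy iterations

    Returns the selected indices and the approximation f_k, i.e. the
    orthogonal projection of f onto the span of the selected elements.
    """
    selected = []
    residual = f.copy()
    approx = np.zeros_like(f)
    for _ in range(n_steps):
        # Greedy step: pick the element most correlated with the residual.
        scores = np.abs(dictionary @ residual)
        k = int(np.argmax(scores))
        if k not in selected:
            selected.append(k)
        # Projection step: re-fit f on the span of all selected elements
        # by least squares (this is what distinguishes OGA from the
        # relaxed greedy algorithm, which only updates along one direction).
        G = dictionary[selected].T                      # shape (d, |selected|)
        coeffs, *_ = np.linalg.lstsq(G, f, rcond=None)
        approx = G @ coeffs
        residual = f - approx
    return selected, approx
```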

Citation


Andrew R. Barron, Albert Cohen, Wolfgang Dahmen, Ronald A. DeVore. "Approximation and learning by greedy algorithms." Ann. Statist. 36(1): 64–94, February 2008. https://doi.org/10.1214/009053607000000631

Information

Published: February 2008
First available in Project Euclid: 1 February 2008

zbMATH: 1138.62019
MathSciNet: MR2387964
Digital Object Identifier: 10.1214/009053607000000631

Subjects:
Primary: 41A46, 41A63, 46N30, 62G07

Keywords: convergence rates for greedy algorithms, interpolation spaces, neural networks, nonparametric regression, statistical learning

Rights: Copyright © 2008 Institute of Mathematical Statistics
