Open Access
Fast rates for support vector machines using Gaussian kernels
Ingo Steinwart, Clint Scovel
Ann. Statist. 35(2): 575-607 (April 2007). DOI: 10.1214/009053606000001226

Abstract

For binary classification we establish learning rates up to the order of n^{-1} for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates are in terms of two assumptions on the considered distributions: Tsybakov’s noise assumption to establish a small estimation error, and a new geometric noise condition which is used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise assumption does not employ any smoothness assumption.
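The learning machine studied in the abstract, a soft-margin SVM with hinge loss and a Gaussian RBF kernel k(x, x') = exp(-γ‖x − x'‖²), can be sketched numerically. The following is an illustrative from-scratch implementation using the kernelized Pegasos stochastic subgradient method; it is not the authors' algorithm or analysis, and the kernel width `gamma` and regularization `lam` are arbitrary illustrative choices (the paper's point is precisely how such parameters must scale with the sample size n to achieve fast rates).

```python
import numpy as np

def gaussian_kernel(X, Z, gamma):
    """Gram matrix of the Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def train_hinge_svm(X, y, gamma=2.0, lam=0.01, T=4000, seed=0):
    """Kernelized Pegasos: stochastic subgradient descent on the
    regularized empirical hinge loss
        lam/2 ||f||^2 + (1/n) sum_i max(0, 1 - y_i f(x_i)).
    Returns coefficients alpha so that f(x) = sum_j alpha_j y_j k(x_j, x).
    Hyperparameters here are illustrative, not tuned."""
    rng = np.random.default_rng(seed)
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    counts = np.zeros(n)  # number of margin violations per sample
    for t in range(1, T + 1):
        i = rng.integers(n)
        # Current decision value f_t(x_i) with step-size scaling 1/(lam * t)
        f_i = K[i] @ (counts * y) / (lam * t)
        if y[i] * f_i < 1:  # hinge-loss subgradient is active
            counts[i] += 1
    return counts / (lam * T)

# Usage: a nonlinearly separable toy problem (inside vs. outside a circle),
# where a Gaussian kernel succeeds while a linear classifier cannot.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.where((X ** 2).sum(axis=1) < 0.5, 1.0, -1.0)
alpha = train_hinge_svm(X, y, gamma=2.0, lam=0.01, T=4000)
pred = np.sign(gaussian_kernel(X, X, 2.0) @ (alpha * y))
acc = (pred == y).mean()
print(f"training accuracy: {acc:.3f}")
```

The Gaussian kernel makes the hypothesis space a rich RKHS with no fixed smoothness cutoff, which is why the paper can bound the approximation error through a geometric noise condition rather than a smoothness assumption on the regression function.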

Citation


Ingo Steinwart, Clint Scovel. "Fast rates for support vector machines using Gaussian kernels." Ann. Statist. 35(2): 575-607, April 2007. https://doi.org/10.1214/009053606000001226

Information

Published: April 2007
First available in Project Euclid: 5 July 2007

zbMATH: 1127.68091
MathSciNet: MR2336860
Digital Object Identifier: 10.1214/009053606000001226

Subjects:
Primary: 68Q32
Secondary: 41A46 , 41A99 , 62G20 , 62G99 , 68T05 , 68T10

Keywords: classification, Gaussian RBF kernels, learning rates, noise assumption, nonlinear discrimination, support vector machines

Rights: Copyright © 2007 Institute of Mathematical Statistics
