Open Access
Density Problem and Approximation Error in Learning Theory
Ding-Xuan Zhou
Abstr. Appl. Anal. 2013(SI32): 1-13 (2013). DOI: 10.1155/2013/715683

Abstract

We study the density problem and approximation error of reproducing kernel Hilbert spaces for the purpose of learning theory. For a Mercer kernel K on a compact metric space (X, d), a characterization is given for the generated reproducing kernel Hilbert space (RKHS) ℋ_K to be dense in C(X). As a corollary, we show that the density always holds for convolution type kernels. Some estimates for the rate of convergence of interpolation schemes are presented for general Mercer kernels. These estimates are then used to provide, for convolution type kernels, a quantitative analysis of the approximation error in learning theory. Finally, using the example of Gaussian kernels with varying variances, we show that the approximation error can be improved by adaptively changing the value of the kernel parameter. This confirms the practice of choosing varying parameters that is often used in applications of learning theory.
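The last point lends itself to a small numerical illustration. The Python sketch below is not taken from the paper; the target function, node count, variance values, and the small ridge term are assumptions made purely for this example. It interpolates a continuous function on [0, 1] with the Gaussian (convolution type) kernel K_sigma(x, y) = exp(-|x - y|^2 / (2 sigma^2)) for several values of sigma and reports the resulting uniform error, showing how the approximation quality depends on the chosen variance.

import numpy as np

def gaussian_kernel(x, y, sigma):
    # Gaussian Mercer kernel K_sigma(x, y) evaluated on two 1-D point sets.
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2.0 * sigma ** 2))

def kernel_interpolant(nodes, values, sigma):
    # RKHS interpolant s(x) = sum_j c_j K_sigma(x, x_j) with s(x_i) = values[i],
    # a standard kernel interpolation scheme. A tiny ridge term keeps the
    # Gram matrix numerically invertible (an assumption of this sketch).
    gram = gaussian_kernel(nodes, nodes, sigma)
    coef = np.linalg.solve(gram + 1e-10 * np.eye(len(nodes)), values)
    return lambda x: gaussian_kernel(np.atleast_1d(x), nodes, sigma) @ coef

# Hypothetical continuous target in C([0, 1]) and interpolation nodes.
f = lambda x: np.sin(2 * np.pi * x) + 0.5 * np.cos(5 * x)
nodes = np.linspace(0.0, 1.0, 15)
grid = np.linspace(0.0, 1.0, 1000)

# The uniform (sup-norm) error changes with sigma, which is the point the
# abstract makes about adaptively varying the kernel parameter.
for sigma in [0.02, 0.1, 0.5]:
    s = kernel_interpolant(nodes, f(nodes), sigma)
    err = np.max(np.abs(f(grid) - s(grid)))
    print(f"sigma = {sigma:4.2f}   sup-norm error = {err:.3e}")

In practice one would select sigma adaptively (for instance on held-out data) rather than fix it in advance, which is the kind of varying-parameter choice the abstract refers to.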

Citation

Ding-Xuan Zhou. "Density Problem and Approximation Error in Learning Theory." Abstr. Appl. Anal. 2013 (SI32), 1-13, 2013. https://doi.org/10.1155/2013/715683

Information

Published: 2013
First available in Project Euclid: 26 February 2014

zbMATH: 07095270
MathSciNet: MR3111814
Digital Object Identifier: 10.1155/2013/715683

Rights: Copyright © 2013 Hindawi
