## Electronic Journal of Statistics


### Kernel ridge vs. principal component regression: Minimax bounds and the qualification of regularization operators

Lee H. Dicker, Dean P. Foster, and Daniel Hsu

#### Abstract

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component regression. We derive an explicit finite-sample risk bound for regularization-based estimators that simultaneously accounts for (i) the structure of the ambient function space, (ii) the regularity of the true regression function, and (iii) the adaptability (or *qualification*) of the regularization. A simple consequence of this upper bound is that the risk of the regularization-based estimators matches the minimax rate in a variety of settings. The general bound also illustrates how some regularization techniques are more adaptable than others to favorable regularity properties that the true regression function may possess. This, in particular, demonstrates a striking difference between kernel ridge regression and kernel principal component regression. Our theoretical results are supported by numerical experiments.

#### Article information

**Source**

Electron. J. Statist., Volume 11, Number 1 (2017), 1022–1047.

**Dates**

Received: August 2016

First available in Project Euclid: 30 March 2017

**Permanent link to this document**

https://projecteuclid.org/euclid.ejs/1490860815

**Digital Object Identifier**

doi:10.1214/17-EJS1258

**Mathematical Reviews number (MathSciNet)**

MR3629418

**Zentralblatt MATH identifier**

1362.62087

**Subjects**

Primary: 62G08: Nonparametric regression

**Keywords**

Nonparametric regression; reproducing kernel Hilbert space

**Rights**

Creative Commons Attribution 4.0 International License.

#### Citation

Dicker, Lee H.; Foster, Dean P.; Hsu, Daniel. Kernel ridge vs. principal component regression: Minimax bounds and the qualification of regularization operators. Electron. J. Statist. 11 (2017), no. 1, 1022--1047. doi:10.1214/17-EJS1258. https://projecteuclid.org/euclid.ejs/1490860815