Abstract
The popular cubic smoothing spline estimate of a regression function arises as the minimizer of the penalized sum of squares $\sum_{j}(Y_{j}-\mu(t_{j}))^{2}+\lambda \int_{a}^{b}[\mu''(t)]^{2}\,dt$, where the data are $(t_{j},Y_{j})$, $j=1,\ldots,n$. The minimization is taken over an infinite-dimensional function space, the space of all functions with square integrable second derivatives. But the calculations can be carried out in a finite-dimensional space. The reduction from minimizing over an infinite-dimensional space to minimizing over a finite-dimensional space occurs for more general objective functions: the data may be related to the function $\mu$ in another way, the sum of squares may be replaced by a more suitable expression, or the penalty, $\int_{a}^{b}[\mu''(t)]^{2}\,dt$, might take a different form. This paper reviews the Reproducing Kernel Hilbert Space structure that provides a finite-dimensional solution for a general minimization problem. Particular attention is paid to the construction and study of the Reproducing Kernel Hilbert Space corresponding to a penalty based on a linear differential operator. In this case, one can often calculate the minimizer explicitly, using Green's functions.
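As a concrete illustration of the penalized criterion in the abstract, the sketch below fits cubic smoothing splines to simulated data for two values of $\lambda$. It uses SciPy's `make_smoothing_spline`, which minimizes exactly $\sum_{j}(Y_{j}-\mu(t_{j}))^{2}+\lambda \int[\mu''(t)]^{2}\,dt$ over the finite-dimensional space of natural cubic splines with knots at the data points; the simulated data and the particular $\lambda$ values are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

# Illustrative data (not from the paper): noisy observations of a smooth curve.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=t.size)

# make_smoothing_spline returns the natural cubic spline minimizing
#   sum_j (y_j - mu(t_j))^2 + lam * integral [mu''(t)]^2 dt,
# the finite-dimensional solution of the infinite-dimensional problem.
spline_rough = make_smoothing_spline(t, y, lam=1e-8)  # lambda -> 0: near-interpolation
spline_smooth = make_smoothing_spline(t, y, lam=10.0)  # large lambda: heavy smoothing

# Residual sum of squares grows with lambda as fidelity is traded for smoothness.
rss_rough = float(np.sum((y - spline_rough(t)) ** 2))
rss_smooth = float(np.sum((y - spline_smooth(t)) ** 2))
```

A small $\lambda$ forces the fit through the data (residuals near zero), while a large $\lambda$ penalizes curvature and pushes the estimate toward a straight line, illustrating the bias-variance trade-off the penalty controls.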
Citation
Nancy Heckman. "The theory and application of penalized methods or Reproducing Kernel Hilbert Spaces made easy." Statist. Surv. 6 (2012), 113-141. https://doi.org/10.1214/12-SS101