Abstract
The asymptotic optimality of Mallows' $C_L$ and generalized cross-validation (GCV) is demonstrated in the setting of ridge regression. An application is made to spline smoothing in nonparametric regression. A counterexample is given to help understand why GCV may sometimes fail to be asymptotically optimal. The coefficient of variation for the eigenvalues of the information matrix must be large in order to guarantee the optimality of GCV. The proof is based on the connection between GCV and Stein's unbiased risk estimate.
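For reference, the two criteria compared in the abstract are commonly written as follows; this is a standard formulation with $A(\lambda)$ the ridge hat matrix, $n$ the sample size, and $\sigma^2$ the error variance, and the notation is an assumption that may differ from the paper's own.
$$
C_L(\lambda) = \frac{1}{n}\,\|(I - A(\lambda))\,y\|^2 + \frac{2\sigma^2}{n}\,\mathrm{tr}\,A(\lambda) - \sigma^2,
\qquad
\mathrm{GCV}(\lambda) = \frac{\frac{1}{n}\,\|(I - A(\lambda))\,y\|^2}{\left(\frac{1}{n}\,\mathrm{tr}(I - A(\lambda))\right)^2}.
$$
$C_L$ requires knowledge of $\sigma^2$, whereas GCV replaces it by an implicit estimate through the denominator; the paper's result concerns when minimizing either criterion over $\lambda$ is asymptotically as good as minimizing the true loss.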
Citation
Ker-Chau Li. "Asymptotic Optimality of $C_L$ and Generalized Cross-Validation in Ridge Regression with Application to Spline Smoothing." Ann. Statist. 14(3): 1101–1112, September 1986. https://doi.org/10.1214/aos/1176350052