Hiroshima Mathematical Journal
Hiroshima Math. J., Volume 43, Number 2 (2013), 149-177.
A class of cross-validatory model selection criteria
In this paper, we define a class of cross-validatory model selection criteria as estimators of the predictive risk function based on a discrepancy between a candidate model and the true model. For a vector of unknown parameters, $n$ estimators are required to define the class, where $n$ is the sample size. The $i$th estimator $(i=1,\dots,n)$ is obtained by minimizing a weighted discrepancy function in which the $i$th observation has a weight of $1-\lambda$ and the others have a weight of $1$. Each cross-validatory model selection criterion in the class is specified by its value of $\lambda$. The sample discrepancy function and the ordinary cross-validation (CV) criterion are special cases of the class. One may choose $\lambda$ to minimize the bias. The optimal $\lambda$ makes the resulting bias-corrected CV (CCV) criterion a second-order unbiased estimator of the risk function, whereas the ordinary CV criterion is only a first-order unbiased estimator.
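As a minimal illustration of the weighting scheme, consider estimating a mean with squared error as the discrepancy (the model, the discrepancy, and the function name below are illustrative assumptions, not the paper's general setup). For each $i$, the mean is re-estimated with the $i$th observation down-weighted to $1-\lambda$; $\lambda=0$ recovers the sample (in-sample) discrepancy and $\lambda=1$ recovers ordinary leave-one-out CV:

```python
import numpy as np

def weighted_cv_criterion(x, lam):
    """Sketch of the lambda-indexed CV class for a sample mean
    under squared-error discrepancy (an illustrative special case).

    For each i, the estimator minimizes the weighted discrepancy in
    which observation i has weight 1 - lam and all others weight 1;
    the criterion averages the discrepancy of x_i at that estimator.
    lam = 0 gives the sample discrepancy; lam = 1 gives ordinary
    leave-one-out cross-validation.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    total = x.sum()
    crit = 0.0
    for i in range(n):
        w_i = 1.0 - lam                           # weight on the i-th observation
        mu_i = (total - x[i] + w_i * x[i]) / (n - 1 + w_i)  # weighted mean estimate
        crit += (x[i] - mu_i) ** 2                # discrepancy of x_i at i-th estimator
    return crit / n
```

With $\lambda=0$ every $\hat\mu_i$ equals the ordinary sample mean, so the criterion is the in-sample squared error; intermediate $\lambda$ interpolates between that and leave-one-out CV.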
First available in Project Euclid: 25 June 2013
Primary: 62H25: Factor analysis and principal components; correspondence analysis
Secondary: 62F07: Ranking and selection
Yanagihara, Hirokazu; Yuan, Ke-Hai; Fujisawa, Hironori; Hayashi, Kentaro. A class of cross-validatory model selection criteria. Hiroshima Math. J. 43 (2013), no. 2, 149--177. doi:10.32917/hmj/1372180510. https://projecteuclid.org/euclid.hmj/1372180510