Abstract
We develop an approach for feature elimination in statistical learning with kernel machines, based on recursive elimination of features. We present theoretical properties of this method and show that it is uniformly consistent in finding the correct feature space under certain generalized assumptions. We present a few case studies to show that the assumptions are met in most practical situations, and present simulation results to demonstrate the performance of the proposed approach.
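To make the idea of recursive feature elimination with kernel machines concrete, here is a minimal illustrative sketch, not the authors' algorithm: at each round, the feature whose removal least degrades a kernel-based fit criterion (here, centered kernel-target alignment with an RBF kernel, an assumed stand-in for the paper's criterion) is discarded, until the desired number of features remains. All function names and the choice of criterion are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_alignment(K, y):
    # Alignment between the kernel matrix and the label kernel y y^T;
    # a simple proxy for how well the kernel separates the classes.
    Ky = np.outer(y, y)
    return np.sum(K * Ky) / (np.linalg.norm(K) * np.linalg.norm(Ky))

def recursive_feature_elimination(X, y, n_keep, gamma=1.0):
    # Repeatedly drop the feature whose removal hurts alignment the least.
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        scores = []
        for j in active:
            rest = [k for k in active if k != j]
            scores.append(kernel_alignment(rbf_kernel(X[:, rest], gamma), y))
        # Highest score = least damage when feature j is removed.
        active.remove(active[int(np.argmax(scores))])
    return active
```

On toy data where only the first feature carries signal (e.g., `y = sign(X[:, 0])` with the remaining columns pure noise), the sketch retains the informative feature while the noise columns are eliminated first.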
Citation
Sayan Dasgupta, Yair Goldberg, and Michael R. Kosorok. "Feature elimination in kernel machines in moderately high dimensions." Ann. Statist. 47(1): 497-526, February 2019. https://doi.org/10.1214/18-AOS1696