Open Access
Feature elimination in kernel machines in moderately high dimensions
Sayan Dasgupta, Yair Goldberg, Michael R. Kosorok
Ann. Statist. 47(1): 497-526 (February 2019). DOI: 10.1214/18-AOS1696

Abstract

We develop an approach to feature elimination in statistical learning with kernel machines, based on recursive elimination of features. We present theoretical properties of this method and show that it is uniformly consistent in finding the correct feature space under certain generalized assumptions. We present a few case studies to show that the assumptions are met in most practical situations, and present simulation results to demonstrate the performance of the proposed approach.
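The recursive elimination described above can be sketched in code. The following is a minimal illustration only, not the paper's algorithm: kernel ridge regression stands in for the kernel machine, the ranking criterion (increase in training error when a feature is dropped) is a simplification of the paper's objective-based criterion, and all function names and parameter values here are hypothetical.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X1 and X2."""
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_loss(X, y, lam=0.1, gamma=1.0):
    """Fit kernel ridge regression and return the mean squared training error."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    return np.mean((K @ alpha - y) ** 2)

def recursive_feature_elimination(X, y, n_keep):
    """Greedily drop the feature whose removal degrades the fit the least."""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        # Loss of the refitted model with each remaining feature left out.
        losses = [kernel_ridge_loss(X[:, [f for f in active if f != j]], y)
                  for j in active]
        # Eliminate the feature whose removal is least harmful.
        active.pop(int(np.argmin(losses)))
    return sorted(active)

# Toy data: the response depends only on features 0 and 1; 2-4 are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 5))
y = np.sin(X[:, 0]) + X[:, 1] ** 2
selected = recursive_feature_elimination(X, y, n_keep=2)
print(selected)
```

Each elimination round refits the kernel machine once per candidate feature, so this naive sketch costs O(d) fits per round; the paper's interest is precisely in the statistical behavior of such a recursive procedure when the dimension d is moderately high.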

Citation


Sayan Dasgupta, Yair Goldberg, Michael R. Kosorok. "Feature elimination in kernel machines in moderately high dimensions." Ann. Statist. 47 (1): 497-526, February 2019. https://doi.org/10.1214/18-AOS1696

Information

Received: 1 December 2015; Revised: 1 November 2017; Published: February 2019
First available in Project Euclid: 30 November 2018

zbMATH: 07036209
MathSciNet: MR3909940
Digital Object Identifier: 10.1214/18-AOS1696

Subjects:
Primary: 62G20

Keywords: Kernel machines , recursive feature elimination , Support vector machines , Variable selection

Rights: Copyright © 2019 Institute of Mathematical Statistics
