Open Access
A general family of trimmed estimators for robust high-dimensional data analysis
Eunho Yang, Aurélie C. Lozano, Aleksandr Aravkin
Electron. J. Statist. 12(2): 3519-3553 (2018). DOI: 10.1214/18-EJS1470

Abstract

We consider the problem of robustifying high-dimensional structured estimation. Robust techniques are key in real-world applications which often involve outliers and data corruption. We focus on trimmed versions of structurally regularized M-estimators in the high-dimensional setting, including the popular Least Trimmed Squares estimator, as well as analogous estimators for generalized linear models and graphical models, using convex and non-convex loss functions. We present a general analysis of their statistical convergence rates and consistency, and then take a closer look at the trimmed versions of the Lasso and Graphical Lasso estimators as special cases. On the optimization side, we show how to extend algorithms for M-estimators to fit trimmed variants and provide guarantees on their numerical convergence. The generality and competitive performance of high-dimensional trimmed estimators are illustrated numerically on both simulated and real-world genomics data.
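The abstract's central object, the trimmed Lasso (a sparse analogue of Least Trimmed Squares), can be illustrated with a minimal alternating scheme: keep the h samples with the smallest current residuals, then run a proximal-gradient Lasso step on that subset. This is a hedged sketch for intuition only, not the authors' algorithm; the function name `trimmed_lasso` and all parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink each coordinate toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def trimmed_lasso(X, y, lam=0.1, keep_frac=0.8, n_outer=20, n_inner=200):
    """Illustrative trimmed-Lasso sketch (not the paper's exact algorithm).

    Alternates between (1) trimming: keep the h = keep_frac * n samples with
    the smallest squared residuals, and (2) ISTA (proximal gradient) steps
    for the Lasso on the kept subset.
    """
    n, p = X.shape
    h = int(np.ceil(keep_frac * n))
    beta = np.zeros(p)
    kept = np.arange(h)
    for _ in range(n_outer):
        resid = (y - X @ beta) ** 2
        kept = np.argsort(resid)[:h]          # trimming: drop the worst-fit samples
        Xs, ys = X[kept], y[kept]
        L = np.linalg.norm(Xs, 2) ** 2 / h    # Lipschitz constant of the smooth part
        for _ in range(n_inner):
            grad = Xs.T @ (Xs @ beta - ys) / h
            beta = soft_threshold(beta - grad / L, lam / L)
    return beta, kept

# Toy demonstration: a sparse signal with a handful of gross outliers.
rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.05 * rng.standard_normal(n)
y[:10] += 20.0  # corrupt the first 10 responses
beta_hat, kept = trimmed_lasso(X, y, lam=0.01, keep_frac=0.8)
```

On this toy example the trimming step discards the corrupted samples, so the Lasso step recovers the sparse coefficients despite the outliers; an untrimmed Lasso fit on the same data would be pulled badly off target.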

Citation


Eunho Yang, Aurélie C. Lozano, Aleksandr Aravkin. "A general family of trimmed estimators for robust high-dimensional data analysis." Electron. J. Statist. 12(2): 3519-3553, 2018. https://doi.org/10.1214/18-EJS1470

Information

Received: 1 March 2018; Published: 2018
First available in Project Euclid: 22 October 2018

zbMATH: 06970011
MathSciNet: MR3866990
Digital Object Identifier: 10.1214/18-EJS1470

Keywords: high-dimensional variable selection, Lasso, robust estimation, sparse learning
