Detangling robustness in high dimensions: Composite versus model-averaged estimation
Jing Zhou, Gerda Claeskens, Jelena Bradic
Electron. J. Statist. 14(2): 2551-2599 (2020). DOI: 10.1214/20-EJS1728

Abstract

Robust methods, though ubiquitous in practice, are yet to be fully understood in the context of regularized estimation and high dimensions. Even simple questions become challenging very quickly. For example, classical statistical theory identifies equivalence between model-averaged and composite quantile estimation. However, little to nothing is known about such equivalence between methods that encourage sparsity. This paper provides a toolbox to further study robustness in these settings and focuses on prediction. In particular, we study optimally weighted model-averaged as well as composite $l_{1}$-regularized estimation. Optimal weights are determined by minimizing the asymptotic mean squared error. This approach incorporates the effects of regularization, without the assumption of perfect selection, as is often used in practice. Such weights are then optimal for prediction quality. Through an extensive simulation study, we show that no single method systematically outperforms others. We find, however, that model-averaged and composite quantile estimators often outperform least-squares methods, even in the case of Gaussian model noise. A real-data application demonstrates the method's practical use through the reconstruction of compressed audio signals.
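To make the composite objective concrete, the following is a minimal numpy sketch of composite $l_{1}$-penalized quantile regression: a shared slope vector with one intercept per quantile level, fitted by proximal subgradient descent with soft-thresholding. The function name, solver, step-size schedule, and tuning constants are illustrative assumptions for exposition only, not the estimators, weighting scheme, or algorithms analyzed in the paper.

```python
import numpy as np

def composite_quantile_lasso(X, y, taus, lam, n_iter=2000, lr=0.1):
    """Composite quantile regression with an l1 penalty on the shared slope.

    Minimizes (1/K) * sum_k (1/n) * sum_i rho_{tau_k}(y_i - b_k - x_i @ beta)
              + lam * ||beta||_1,
    where rho_tau(u) = u * (tau - 1{u < 0}) is the quantile check loss.
    Solved here by proximal subgradient descent (an illustrative solver,
    not the algorithm used in the paper).
    """
    n, p = X.shape
    K = len(taus)
    beta = np.zeros(p)
    b = np.quantile(y, taus)  # per-quantile intercepts, start at marginal quantiles
    for t in range(n_iter):
        grad_beta = np.zeros(p)
        grad_b = np.zeros(K)
        for k, tau in enumerate(taus):
            r = y - b[k] - X @ beta
            # Subgradient of the check loss w.r.t. the residual: 1{r < 0} - tau
            s = (r < 0).astype(float) - tau
            grad_beta += X.T @ s / n
            grad_b[k] = s.mean()
        step = lr / np.sqrt(t + 1.0)      # diminishing step size
        b = b - step * grad_b
        beta = beta - step * grad_beta / K
        # Proximal step for the l1 penalty: soft-thresholding
        beta = np.sign(beta) * np.maximum(np.abs(beta) - step * lam, 0.0)
    return b, beta

# Usage on simulated heavy-tailed data (all settings below are illustrative):
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[0], beta_true[1] = 2.0, -1.5    # sparse truth
y = X @ beta_true + rng.standard_t(df=3, size=n)  # heavy-tailed noise
b_hat, beta_hat = composite_quantile_lasso(X, y, taus=[0.25, 0.5, 0.75], lam=0.1)
```

The model-averaged counterpart would instead fit a separate $l_{1}$-penalized quantile regression at each level $\tau_k$ and combine the resulting slope estimates with weights; the paper's contribution is choosing those weights to minimize the asymptotic mean squared error while accounting for regularization.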

Citation


Jing Zhou, Gerda Claeskens, Jelena Bradic. "Detangling robustness in high dimensions: Composite versus model-averaged estimation." Electron. J. Statist. 14 (2) 2551 - 2599, 2020. https://doi.org/10.1214/20-EJS1728

Information

Received: 1 March 2020; Published: 2020
First available in Project Euclid: 14 July 2020

zbMATH: 07235720
MathSciNet: MR4122516
Digital Object Identifier: 10.1214/20-EJS1728

Subjects:
Primary: 62J07
Secondary: 62F12

JOURNAL ARTICLE
49 PAGES

