Abstract
This paper proposes convex relaxation based robust methods to recover approximately low-rank matrices in the presence of heavy-tailed and asymmetric errors, allowing for heteroscedasticity. We focus on three archetypal applications in matrix recovery: matrix compressed sensing, matrix completion and multitask regression. Statistically, we provide sub-Gaussian-type deviation bounds in each of the aforementioned settings when the noise variables have only bounded variances. Improving upon the earlier results of Fan, Wang and Zhu (Ann. Statist. 49 (2021) 1239–1266), the convergence rates of our estimators are proportional to the noise scale in the matrix sensing and multitask regression settings, and thus diminish to 0 in the noiseless case. Computationally, we propose a matrix version of the local adaptive majorize-minimization algorithm, which is much faster than the alternating direction method of multipliers used in previous work and is scalable to large datasets. Numerical experiments demonstrate the advantage of our methods over their non-robust counterparts and corroborate the theoretical finding that the convergence rates are proportional to the noise scale.
Funding Statement
MY and WZ are supported in part by NSF Grant DMS-2113409. QS is partially supported by the Natural Sciences and Engineering Research Council of Canada (Grant RGPIN-2018-06484), a New Frontiers in Research Fund grant (NFRFE-2019-00603), and a Data Sciences Institute Catalyst Grant.
Acknowledgments
We thank the Editor, an Associate Editor, and two anonymous reviewers for their constructive comments and valuable suggestions, which have significantly helped us improve the quality of this work.
Citation
Myeonghun Yu, Qiang Sun, Wen-Xin Zhou. "Low-rank matrix recovery under heavy-tailed errors." Bernoulli 30 (3) 2326–2345, August 2024. https://doi.org/10.3150/23-BEJ1675