Open Access
Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
Ming Yu, Varun Gupta, Mladen Kolar
Electron. J. Statist. 14(1): 413-457 (2020). DOI: 10.1214/19-EJS1658

Abstract

We study the problem of recovering matrices that are simultaneously low rank and row and/or column sparse. Such matrices appear in recent applications in cognitive neuroscience, imaging, computer vision, macroeconomics, and genetics. We propose a GDT (Gradient Descent with hard Thresholding) algorithm to efficiently recover matrices with this structure by minimizing a bi-convex function over a nonconvex set of constraints. We show linear convergence of the iterates obtained by GDT to a region within statistical error of an optimal solution. As an application of our method, we consider multi-task learning problems and show that the statistical error rate obtained by GDT is nearly optimal compared to the minimax rate. Experiments demonstrate competitive performance and substantially faster running times than existing methods, on both simulated and real data sets.
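The GDT algorithm described above can be sketched as alternating gradient steps on a factorized parameter Θ = UVᵀ, followed by row-wise hard thresholding to enforce two-way sparsity. The sketch below is an illustrative reconstruction under simplifying assumptions (a fixed step size, a pseudoinverse-based spectral initialization, and the squared-loss multi-task regression objective Y ≈ XUVᵀ); the function names `gdt` and `hard_threshold_rows` are ours, not the authors'.

```python
import numpy as np

def hard_threshold_rows(M, s):
    """Keep the s rows of M with the largest L2 norms; zero out the rest."""
    norms = np.linalg.norm(M, axis=1)
    keep = np.argsort(norms)[-s:]
    out = np.zeros_like(M)
    out[keep] = M[keep]
    return out

def gdt(X, Y, rank, s1, s2, step=1e-3, iters=200, seed=0):
    """Illustrative sketch of gradient descent with hard thresholding for
    multi-task regression Y ~ X @ U @ V.T, where U is row-sparse (at most
    s1 nonzero rows) and V is row-sparse (at most s2 nonzero rows, i.e.
    column sparsity of the coefficient matrix U V^T).

    Initialization here is a simple spectral heuristic: a rank-`rank`
    SVD of the least-squares solution pinv(X) @ Y."""
    # Spectral initialization from the least-squares estimate.
    Theta0 = np.linalg.pinv(X) @ Y
    Uf, sv, Vft = np.linalg.svd(Theta0, full_matrices=False)
    U = Uf[:, :rank] * np.sqrt(sv[:rank])
    V = Vft[:rank].T * np.sqrt(sv[:rank])
    U = hard_threshold_rows(U, s1)
    V = hard_threshold_rows(V, s2)
    for _ in range(iters):
        R = X @ U @ V.T - Y          # residual of the current fit
        gU = X.T @ R @ V             # gradient of 0.5*||R||_F^2 w.r.t. U
        gV = R.T @ X @ U             # gradient of 0.5*||R||_F^2 w.r.t. V
        # Gradient step, then project back onto the row-sparse set.
        U = hard_threshold_rows(U - step * gU, s1)
        V = hard_threshold_rows(V - step * gV, s2)
    return U, V
```

The hard-thresholding step is what distinguishes this from plain factored gradient descent: after every update, only the largest rows of each factor survive, so the iterates stay in the nonconvex constraint set of two-way sparse, rank-`rank` matrices throughout.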

Citation

Ming Yu. Varun Gupta. Mladen Kolar. "Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach." Electron. J. Statist. 14 (1) 413 - 457, 2020. https://doi.org/10.1214/19-EJS1658

Information

Received: 1 January 2019; Published: 2020
First available in Project Euclid: 22 January 2020

zbMATH: 1434.90161
MathSciNet: MR4054252
Digital Object Identifier: 10.1214/19-EJS1658

Subjects:
Primary: 90C26

Keywords: gradient descent with hard thresholding, low rank and two-way sparse coefficient matrix, multi-task learning, nonconvex optimization, two-way sparse reduced rank regression
