Abstract
Matrix regression plays an important role in modern data analysis due to its ability to handle complex relationships involving both matrix and vector variables. We propose a class of regularized regression models that incorporate both matrix and vector variables as predictors, accommodating various regularization techniques tailored to the inherent structures of the data. We establish the consistency of our estimator when penalizing the nuclear norm of the matrix variable and a norm of the vector variable. To solve the general regularized regression model, we propose a unified framework based on an efficient preconditioned proximal point algorithm. Numerical experiments demonstrate the superior estimation and prediction accuracy of the proposed estimator, as well as the efficiency of our algorithm compared with state-of-the-art solvers.
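For concreteness, the following is a minimal sketch of a regularized objective of the kind the abstract describes; the symbols are hypothetical and not taken from the paper: responses $y_i$, matrix predictors $X_i$, vector predictors $z_i$, a coefficient matrix $B$, a coefficient vector $\beta$, and tuning parameters $\lambda_1, \lambda_2 > 0$.

\[
\min_{B,\,\beta}\; \frac{1}{2n}\sum_{i=1}^{n}\bigl(y_i - \langle X_i, B\rangle - z_i^{\top}\beta\bigr)^2 \;+\; \lambda_1\,\|B\|_{*} \;+\; \lambda_2\,\|\beta\|
\]

Here $\|B\|_{*}$ denotes the nuclear norm of $B$ (encouraging a low-rank matrix coefficient) and $\|\beta\|$ a norm on the vector coefficient (for instance, the $\ell_1$ norm to induce sparsity). The exact loss, penalties, and assumptions are specified in the paper itself.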
Funding Statement
The research of Meixia Lin was supported by the Ministry of Education, Singapore, under its Academic Research Fund Tier 2 grant call (MOE-T2EP20123-0013) and the Singapore University of Technology and Design under MOE Tier 1 Grant SKI 2021_02_08. The research of Yangjing Zhang was supported by the National Key R&D Program of China under grant number 2023YFA1011100.
Citation
Meixia Lin, Ziyang Zeng, Yangjing Zhang. "Multiple regression for matrix and vector predictors: Models, theory, algorithms, and beyond." Electron. J. Statist. 18(2): 5563–5600, 2024. https://doi.org/10.1214/24-EJS2330