Open Access
2022 Tensor factorization recommender systems with dependency
Jiuchen Zhang, Yubai Yuan, Annie Qu
Electron. J. Statist. 16(1): 2175-2205 (2022). DOI: 10.1214/22-EJS1978

Abstract

Dependency structure in recommender systems has been widely adopted in recent years to improve prediction accuracy. In this paper, we propose an innovative tensor-based recommender system, namely, Tensor Factorization with Dependency (TFD). The proposed method utilizes shared factors to characterize the dependency between different modes, in addition to pairwise additive tensor factorization to integrate information across multiple modes. One advantage of the proposed method is that it provides flexibility for different dependency structures by incorporating shared latent factors. The proposed method also unifies both binary and ordinal ratings in recommender systems, and achieves scalable computation for sparse tensors with high missing rates. In theory, we show the asymptotic consistency of the estimators under various loss functions for both binary and ordinal data. Our numerical studies demonstrate that the proposed method outperforms existing methods, especially in prediction accuracy.
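To make the pairwise additive idea concrete, here is a minimal NumPy sketch: each mode (users, items, contexts) gets one latent factor matrix, a rating score is the sum of the three pairwise inner products, and the factors are fit by gradient descent on a logistic loss over the observed (binary) entries only. This is an illustration of the general pairwise additive factorization with factors shared across pairs, not the authors' TFD algorithm; the dimensions, loss, and optimizer are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) sizes: users, items, contexts, latent rank
n_u, n_i, n_c, r = 30, 20, 10, 3

# One latent factor matrix per mode; each matrix is reused in two pairs,
# so the pairwise terms share factors.
U = 0.1 * rng.standard_normal((n_u, r))
V = 0.1 * rng.standard_normal((n_i, r))
W = 0.1 * rng.standard_normal((n_c, r))

def predict(U, V, W, idx):
    """Pairwise additive score for (user, item, context) triples:
    <u, v> + <u, w> + <v, w>."""
    u, v, w = U[idx[:, 0]], V[idx[:, 1]], W[idx[:, 2]]
    return (u * v).sum(1) + (u * w).sum(1) + (v * w).sum(1)

def logloss(U, V, W, idx, y):
    """Mean logistic loss for binary ratings y in {0, 1}."""
    s = predict(U, V, W, idx)
    return np.mean(np.log1p(np.exp(-s)) + (1.0 - y) * s)

# A sparse set of observed binary ratings (simulated for the sketch)
m = 500
idx = np.column_stack([rng.integers(0, n_u, m),
                       rng.integers(0, n_i, m),
                       rng.integers(0, n_c, m)])
y = rng.integers(0, 2, m).astype(float)

init_loss = logloss(U, V, W, idx, y)

# Gradient descent on the observed entries only (missing entries never touched)
lr, lam = 0.05, 1e-3
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-predict(U, V, W, idx)))
    g = p - y                        # d(logistic loss)/d(score)
    gU = np.zeros_like(U); gV = np.zeros_like(V); gW = np.zeros_like(W)
    # Score s = u.v + u.w + v.w, so ds/du = v + w, etc.
    np.add.at(gU, idx[:, 0], g[:, None] * (V[idx[:, 1]] + W[idx[:, 2]]))
    np.add.at(gV, idx[:, 1], g[:, None] * (U[idx[:, 0]] + W[idx[:, 2]]))
    np.add.at(gW, idx[:, 2], g[:, None] * (U[idx[:, 0]] + V[idx[:, 1]]))
    U -= lr * (gU / m + lam * U)
    V -= lr * (gV / m + lam * V)
    W -= lr * (gW / m + lam * W)

final_loss = logloss(U, V, W, idx, y)
```

Because every pairwise term is a sum over observed entries, the cost per iteration scales with the number of observations rather than the full tensor size, which is what makes this style of factorization workable at high missing rates.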

Funding Statement

This work is supported by NSF grant DMS 1952406.

Acknowledgments

The authors thank the editor, the associate editor, and reviewers for providing thoughtful comments and suggestions.


Information

Received: 1 August 2021; Published: 2022
First available in Project Euclid: 29 March 2022

Digital Object Identifier: 10.1214/22-EJS1978

Subjects:
Primary: 62G05, 62H25
Secondary: 62P20

Keywords: context-aware recommender system, dependency among modes, parsimonious tensor decomposition, shared latent factor

JOURNAL ARTICLE
31 PAGES

