2021 Sparse random tensors: Concentration, regularization and applications
Zhixin Zhou, Yizhe Zhu
Electron. J. Statist. 15(1): 2483-2516 (2021). DOI: 10.1214/21-EJS1838

Abstract

We prove a non-asymptotic concentration inequality for the spectral norm of sparse inhomogeneous random tensors with Bernoulli entries. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max}\geq \frac{c\log n}{n}$, we show that $\|T-\mathbb{E}T\|=O(\sqrt{np_{\max}}\log^{k-2}(n))$ with high probability. The optimality of this bound up to polylog factors is provided by an information-theoretic lower bound. By tensor unfolding, we extend the range of sparsity to $p_{\max}\geq \frac{c\log n}{n^{m}}$ with $1\leq m\leq k-1$ and obtain concentration inequalities for different sparsity regimes. We also provide a simple way to regularize $T$ such that $O(\sqrt{n^{m}p_{\max}})$ concentration still holds down to sparsity $p_{\max}\geq \frac{c}{n^{m}}$ with $k-2\leq m\leq k-1$. We present our concentration and regularization results with two applications: (i) a randomized construction of hypergraphs of bounded degrees with good expander mixing properties, (ii) concentration of sparsified tensors under uniform sampling.
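The tensor-unfolding step above can be illustrated numerically. The following is a minimal sketch, not the paper's construction: the constant $c=1$, the inhomogeneous probability tensor `p`, and the helper `unfold` are assumptions for illustration only. It draws a sparse order-$k$ Bernoulli tensor, flattens the first $m$ modes into an $n^{m}\times n^{k-m}$ matrix, and compares the spectral deviation of the mode-1 unfolding to the $\sqrt{np_{\max}}\log^{k-2}(n)$ rate from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 30, 3
p_max = np.log(n) / n  # sparsity level p_max = c log(n)/n, with c = 1 assumed

# Inhomogeneous Bernoulli tensor: entry (i1,...,ik) is 1 with probability
# p[i1,...,ik] <= p_max (the probability tensor here is arbitrary, for illustration).
p = p_max * rng.uniform(size=(n,) * k)
T = (rng.uniform(size=(n,) * k) < p).astype(float)

def unfold(tensor, m):
    """Flatten the first m modes into rows: an n^m x n^(k-m) matrix."""
    n = tensor.shape[0]
    k = tensor.ndim
    return tensor.reshape(n ** m, n ** (k - m))

# Spectral norm (largest singular value) of the centered mode-1 unfolding,
# against the high-probability rate sqrt(n * p_max) * log^{k-2}(n).
deviation = np.linalg.norm(unfold(T - p, 1), 2)
rate = np.sqrt(n * p_max) * np.log(n) ** (k - 2)
print(deviation, rate)
```

The matrix spectral norm of any unfolding lower-bounds the tensor spectral norm, which is why unfolding gives access to matrix concentration tools in the sparser regimes.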

Funding Statement

Y.Z. is partially supported by NSF DMS-1949617.

Acknowledgments

We thank anonymous referees for their detailed comments and suggestions, which have improved the quality of this paper. We also thank Arash A. Amini, Nicholas Cook, Ioana Dumitriu, Kameron Decker Harris, and Roman Vershynin for helpful comments.


Information

Received: 1 February 2020; Published: 2021
First available in Project Euclid: 3 May 2021


Subjects:
Primary: 15B52
Secondary: 60C05

JOURNAL ARTICLE
34 PAGES

