Abstract
We consider the classical problem of estimating the covariance matrix of a sub-Gaussian distribution from i.i.d. samples in the novel context of coarse quantization: instead of being observed with full precision, the samples are quantized to one or two bits per entry. This problem occurs naturally in signal processing applications. We introduce new estimators in two different quantization scenarios and derive nonasymptotic estimation error bounds in terms of the operator norm. In the first scenario, we consider a simple, scale-invariant one-bit quantizer and derive an estimation result for the correlation matrix of a centered Gaussian distribution. In the second scenario, we add random dithering to the quantizer. In this case, we can accurately estimate the full covariance matrix of a general sub-Gaussian distribution by collecting two bits per entry of each sample. In both scenarios, our bounds apply to masked covariance estimation. We demonstrate the near optimality of our error bounds by deriving corresponding (minimax) lower bounds and using numerical simulations.
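To make the two quantization scenarios concrete, the following is a minimal numerical sketch (not the paper's definitive construction): the first estimator combines a sign quantizer with the classical arcsine (Grothendieck) identity E[sign(X_i) sign(X_j)] = (2/pi) arcsin(corr(X_i, X_j)) for a centered Gaussian vector; the second adds two independent uniform dithers per sample and forms a symmetrized product of the two sign-quantized copies. The function names, the dither level lam, and the exact scaling are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sign_correlation_estimator(X):
    # Scenario 1 (sketch): scale-invariant one-bit quantizer.
    # Only the signs of the entries of X (shape (n, p)) are used; the
    # correlation matrix is recovered by inverting the arcsine identity.
    S = np.sign(X)                        # one bit per entry
    C = S.T @ S / X.shape[0]              # empirical sign covariance
    return np.sin(np.pi / 2 * np.clip(C, -1.0, 1.0))

def dithered_covariance_estimator(X, lam):
    # Scenario 2 (sketch): dithered one-bit quantization, two bits per entry.
    # Two independent uniform dithers on [-lam, lam] are added before taking
    # signs; the symmetrized product of the two quantized copies, scaled by
    # lam**2, estimates the full covariance matrix (lam should exceed the
    # sub-Gaussian scale of the entries).
    n, p = X.shape
    tau1 = rng.uniform(-lam, lam, size=(n, p))
    tau2 = rng.uniform(-lam, lam, size=(n, p))
    Q1 = np.sign(X + tau1)                # first bit per entry
    Q2 = np.sign(X + tau2)                # second bit per entry
    return lam**2 / (2 * n) * (Q1.T @ Q2 + Q2.T @ Q1)

# Toy usage: estimate from n quantized centered Gaussian samples in dimension p.
p, n = 5, 20000
A = rng.standard_normal((p, p)) / np.sqrt(p)
Sigma = A @ A.T
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
corr_hat = sign_correlation_estimator(X)
cov_hat = dithered_covariance_estimator(X, lam=4.0)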
Funding Statement
The authors were supported by the DFG through the project CoCoMIMO funded within the priority program SPP 1798 Compressed Sensing in Information Processing (COSIP).
Acknowledgments
The authors are very grateful to the Associate Editor and the anonymous reviewer for their detailed comments, which led to several improvements in this work.
Citation
Sjoerd Dirksen, Johannes Maly, Holger Rauhut. "Covariance estimation under one-bit quantization." Ann. Statist. 50(6): 3538–3562, December 2022. https://doi.org/10.1214/22-AOS2239