Open Access
Improved bounds for sparse recovery from subsampled random convolutions
Shahar Mendelson, Holger Rauhut, Rachel Ward
Ann. Appl. Probab. 28(6): 3491-3527 (December 2018). DOI: 10.1214/18-AAP1391


We study the recovery of sparse vectors from subsampled random convolutions via $\ell_{1}$-minimization. We consider the setup in which both the subsampling locations as well as the generating vector are chosen at random. For a sub-Gaussian generator with independent entries, we improve previously known estimates: if the sparsity $s$ is small enough, that is, $s\lesssim\sqrt{n/\log(n)}$, we show that $m\gtrsim s\log(en/s)$ measurements are sufficient to recover $s$-sparse vectors in dimension $n$ with high probability, matching the well-known condition for recovery from standard Gaussian measurements. If $s$ is larger, then essentially $m\geq s\log^{2}(s)\log(\log(s))\log(n)$ measurements are sufficient, again improving over previous estimates. Our results are shown via the so-called robust null space property which is weaker than the standard restricted isometry property. Our method of proof involves a novel combination of small ball estimates with chaining techniques which should be of independent interest.
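The measurement model described above — an $s$-sparse vector observed through $m$ randomly chosen rows of a circulant matrix generated by a sub-Gaussian vector, and recovered by $\ell_{1}$-minimization — can be sketched numerically. The following is an illustrative toy example, not the paper's proof machinery: it uses a Rademacher ($\pm 1$) generator as the sub-Gaussian vector and casts $\ell_{1}$-minimization as a linear program; the parameter choices $(n, s, m)$ are assumptions for demonstration only.

```python
import numpy as np
from scipy.linalg import circulant
from scipy.optimize import linprog

# Toy sketch of sparse recovery from a subsampled random convolution.
# All parameter choices here are illustrative assumptions.
rng = np.random.default_rng(0)
n, s, m = 64, 3, 20  # ambient dimension, sparsity, number of measurements

# Sub-Gaussian generator: Rademacher (+/-1) entries.
g = rng.choice([-1.0, 1.0], size=n)
C = circulant(g)                       # full n x n circulant (convolution) matrix
rows = rng.choice(n, size=m, replace=False)
A = C[rows] / np.sqrt(m)               # random row subsampling + normalization

# Ground-truth s-sparse vector and its measurements y = A x0.
x0 = np.zeros(n)
x0[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
y = A @ x0

# l1-minimization: min ||x||_1 s.t. Ax = y, via the split x = u - v, u, v >= 0,
# which turns the problem into a linear program in 2n nonnegative variables.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]
```

Since the true vector $x_{0}$ is itself feasible for the equality constraint, the minimizer always satisfies $\|\hat{x}\|_{1}\leq\|x_{0}\|_{1}$; at these parameters ($s=3$, $m=20$, $n=64$, comfortably within the regime $m\gtrsim s\log(en/s)$) exact recovery typically holds, though any single random draw is not guaranteed to succeed.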




Received: 1 October 2016; Revised: 1 March 2018; Published: December 2018
First available in Project Euclid: 8 October 2018

zbMATH: 06994398
MathSciNet: MR3861818
Digital Object Identifier: 10.1214/18-AAP1391

Primary: 94A20
Secondary: 60B20

Keywords: circulant matrix, compressive sensing, generic chaining, small ball estimates, sparsity

Rights: Copyright © 2018 Institute of Mathematical Statistics

