The Annals of Statistics

Accuracy guaranties for $\ell_{1}$ recovery of block-sparse signals

Anatoli Juditsky, Fatma Kılınç-Karzan, Arkadi Nemirovski, and Boris Polyak

Abstract

We introduce a general framework for handling structured models (sparse and block-sparse, with possibly overlapping blocks). We discuss new methods for recovering such models from incomplete observations corrupted by deterministic and stochastic noise, using block-$\ell_{1}$ regularization. While the current theory provides promising bounds on the recovery errors under a number of different, yet mostly hard-to-verify, conditions, our emphasis is on verifiable conditions on the problem parameters (the sensing matrix and the block structure) that guarantee accurate recovery. Verifiability of our conditions not only leads to efficiently computable bounds for the recovery error but also allows us to optimize these error bounds with respect to the method parameters, and thus to construct estimators with improved statistical properties. To justify our approach, we also provide an oracle inequality that links the properties of the proposed recovery algorithms to the best achievable estimation performance. Furthermore, utilizing these verifiable conditions, we develop a computationally cheap alternative to block-$\ell_{1}$ minimization: the non-Euclidean Block Matching Pursuit algorithm. We close with a numerical study investigating the effect of different block regularizations and demonstrating the performance of the proposed recovery procedures.
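For intuition, the block-$\ell_{1}$ recovery the abstract refers to can be illustrated by a small numerical sketch: minimize $\frac{1}{2}\|Ax-y\|_{2}^{2}+\lambda\sum_{g}\|x_{g}\|_{2}$ over candidate signals $x$ with prescribed blocks $x_{g}$, solved here by proximal gradient descent with block soft-thresholding. This is a generic group-lasso-type solver under our own illustrative choices (function names, step size, data, and $\lambda$); it is not the paper's estimators, its error bounds, or the non-Euclidean Block Matching Pursuit algorithm.

    import numpy as np

    def block_soft_threshold(x, blocks, tau):
        # Proximal operator of tau * sum_g ||x_g||_2: shrink each block toward 0,
        # zeroing it out entirely when its l2 norm is at most tau.
        out = x.copy()
        for g in blocks:
            norm = np.linalg.norm(x[g])
            out[g] = 0.0 if norm <= tau else (1.0 - tau / norm) * x[g]
        return out

    def block_l1_recovery(A, y, blocks, lam, n_iter=500):
        # Proximal-gradient (ISTA-style) iterations for
        #   min_x 0.5 * ||A x - y||_2^2 + lam * sum_g ||x_g||_2.
        x = np.zeros(A.shape[1])
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz const. of the gradient
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)
            x = block_soft_threshold(x - step * grad, blocks, step * lam)
        return x

    # Toy experiment: a block-sparse signal (2 of 20 blocks active), compressed
    # noisy measurements y = A x + noise, then block-l1 recovery.
    rng = np.random.default_rng(0)
    n, m, b = 100, 40, 5
    blocks = [np.arange(i, i + b) for i in range(0, n, b)]
    x_true = np.zeros(n)
    x_true[blocks[2]] = rng.normal(size=b)
    x_true[blocks[7]] = rng.normal(size=b)
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    y = A @ x_true + 0.01 * rng.normal(size=m)
    x_hat = block_l1_recovery(A, y, blocks, lam=0.05)
    print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

When every block is a singleton, block soft-thresholding reduces to ordinary soft-thresholding and the program above becomes standard $\ell_{1}$ (lasso-type) recovery.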

Article information

Source
Ann. Statist. Volume 40, Number 6 (2012), 3077–3107.

Dates
First available in Project Euclid: 22 February 2013

Permanent link to this document
https://projecteuclid.org/euclid.aos/1361542075

Digital Object Identifier
doi:10.1214/12-AOS1057

Mathematical Reviews number (MathSciNet)
MR3097970

Zentralblatt MATH identifier
1296.62088

Subjects
Primary: 62G08: Nonparametric regression; 62H12: Estimation
Secondary: 90C90: Applications of mathematical programming

Keywords
Sparse recovery; nonparametric estimation by convex optimization; oracle inequalities

Citation

Juditsky, Anatoli; Kılınç-Karzan, Fatma; Nemirovski, Arkadi; Polyak, Boris. Accuracy guaranties for $\ell_{1}$ recovery of block-sparse signals. Ann. Statist. 40 (2012), no. 6, 3077--3107. doi:10.1214/12-AOS1057. https://projecteuclid.org/euclid.aos/1361542075.


References

  • [1] Argyriou, A., Evgeniou, T. and Pontil, M. (2008). Convex multi-task feature learning. Machine Learning 73 243–272.
  • [2] Bach, F. R. (2008). Consistency of the group lasso and multiple kernel learning. J. Mach. Learn. Res. 9 1179–1225.
  • [3] Baraniuk, R. G., Cevher, V., Duarte, M. F. and Hegde, C. (2010). Model-based compressive sensing. IEEE Trans. Inform. Theory 56 1982–2001.
  • [4] Ben-Haim, Z. and Eldar, Y. C. (2011). Near-oracle performance of greedy block-sparse estimation techniques from noisy measurements. IEEE J. Sel. Topics Signal Process. 5 1032–1047.
  • [5] Bickel, P. J., Ritov, Y. and Tsybakov, A. B. (2009). Simultaneous analysis of lasso and Dantzig selector. Ann. Statist. 37 1705–1732.
  • [6] Candès, E. J. and Tao, T. (2007). The Dantzig selector: Statistical estimation when $p$ is much larger than $n$. Ann. Statist. 35 2313–2351.
  • [7] Candès, E. J. (2008). The restricted isometry property and its implications for compressed sensing. C. R. Math. Acad. Sci. Paris 346 589–592.
  • [8] Candès, E. J. and Tao, T. (2005). Decoding by linear programming. IEEE Trans. Inform. Theory 51 4203–4215.
  • [9] Chesneau, C. and Hebiri, M. (2008). Some theoretical results on the grouped variables Lasso. Math. Methods Statist. 17 317–326.
  • [10] Donoho, D. L., Elad, M. and Temlyakov, V. N. (2006). Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Trans. Inform. Theory 52 6–18.
  • [11] Duarte, M., Bajwa, W. and Calderbank, R. (2011). The performance of group Lasso for linear regression of grouped variables. Technical report 2010-10, Dept. Computer Science, Duke Univ., Durham, NC. Available at http://www.rci.rutgers.edu/~wub1/pubs/sampta11_tr.pdf.
  • [12] Eldar, Y. C., Kuppinger, P. and Bölcskei, H. (2010). Block-sparse signals: Uncertainty relations and efficient recovery. IEEE Trans. Signal Process. 58 3042–3054.
  • [13] Eldar, Y. C. and Mishali, M. (2009). Robust recovery of signals from a structured union of subspaces. IEEE Trans. Inform. Theory 55 5302–5316.
  • [14] Gribonval, R. and Nielsen, M. (2003). Sparse representations in unions of bases. IEEE Trans. Inform. Theory 49 3320–3325.
  • [15] Huang, J. and Zhang, T. (2010). The benefit of group sparsity. Ann. Statist. 38 1978–2004.
  • [16] James, G. M., Radchenko, P. and Lv, J. (2009). DASSO: Connections between the Dantzig selector and lasso. J. R. Stat. Soc. Ser. B Stat. Methodol. 71 127–142.
  • [17] Juditsky, A., Kılınç-Karzan, F. and Nemirovski, A. (2011). On low rank matrix approximations with applications to synthesis problem in compressed sensing. SIAM J. Matrix Anal. Appl. 32 1019–1029.
  • [18] Juditsky, A., Kılınç-Karzan, F., Nemirovski, A. and Polyak, B. (2013). Supplement to “Accuracy guaranties for $\ell_1$ recovery of block-sparse signals.” DOI:10.1214/12-AOS1057SUPP.
  • [19] Juditsky, A. and Nemirovski, A. (2011). Accuracy guarantees for $\ell_1$-recovery. IEEE Trans. Inform. Theory 57 7818–7839.
  • [20] Juditsky, A. B., Kılınç-Karzan, F. and Nemirovski, A. S. (2011). Verifiable conditions of $\ell_1$ recovery for sparse signals with sign restrictions. Math. Program. 127 89–122.
  • [21] Juditsky, A. B. and Nemirovski, A. S. (2011). On verifiable sufficient conditions for sparse signal recovery via $\ell_1$ minimization. Math. Program. 127 57–88.
  • [22] Liu, H. and Zhang, J. (2009). Estimation consistency of the group Lasso and its applications. J. Mach. Learn. Res. Proceedings Track 5 376–383.
  • [23] Liu, H., Zhang, J., Jiang, X. and Liu, J. (2010). The group Dantzig selector. J. Mach. Learn. Res. Proceedings Track 9 461–468.
  • [24] Lounici, K., Pontil, M., van de Geer, S. and Tsybakov, A. B. (2011). Oracle inequalities and optimal inference under group sparsity. Ann. Statist. 39 2164–2204.
  • [25] Meier, L., van de Geer, S. and Bühlmann, P. (2008). The group Lasso for logistic regression. J. R. Stat. Soc. Ser. B Stat. Methodol. 70 53–71.
  • [26] Nardi, Y. and Rinaldo, A. (2008). On the asymptotic properties of the group lasso estimator for linear models. Electron. J. Stat. 2 605–633.
  • [27] Obozinski, G., Wainwright, M. J. and Jordan, M. I. (2011). Support union recovery in high-dimensional multivariate regression. Ann. Statist. 39 1–47.
  • [28] Parvaresh, F., Vikalo, H., Misra, S. and Hassibi, B. (2008). Recovering sparse signals using sparse measurement matrices in compressed DNA microarrays. IEEE J. Sel. Topics Signal Process. 2 275–285.
  • [29] Pfetsch, M. E. and Tillmann, A. M. (2012). The computational complexity of the restricted isometry property, the nullspace property, and related concepts in compressed sensing. Technical report. Available at http://arxiv.org/abs/1205.2081.
  • [30] Stojnic, M., Parvaresh, F. and Hassibi, B. (2009). On the reconstruction of block-sparse signals with an optimal number of measurements. IEEE Trans. Signal Process. 57 3075–3085.
  • [31] van de Geer, S. A. and Bühlmann, P. (2009). On the conditions used to prove oracle results for the Lasso. Electron. J. Stat. 3 1360–1392.
  • [32] Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. Ser. B Stat. Methodol. 68 49–67.

Supplemental materials

  • Supplementary material: Supplement to “Accuracy guaranties for $\ell_{1}$ recovery of block-sparse signals”. The proofs of the results stated in the paper and the derivations for Section 5.2 are provided in the supplementary article [18].