The Annals of Statistics

Solution of linear ill-posed problems using overcomplete dictionaries

Marianna Pensky

Abstract

In the present paper, we consider the application of overcomplete dictionaries to the solution of general ill-posed linear inverse problems. In the context of regression problems, there has been an enormous amount of effort to recover an unknown function using an overcomplete dictionary. One of the most popular methods, the Lasso and its variants, is based on penalized likelihood maximization and relies on stringent assumptions on the dictionary, the so-called compatibility conditions, for proofs of its convergence rates. While these conditions may be satisfied for the original dictionary functions, they usually do not hold for their images because of the contraction imposed by the linear operator.
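For concreteness, a schematic of the difficulty in generic Lasso notation may help; the symbols below (the design matrix $\Phi$ built from the dictionary functions, the coefficient vector $\theta$, the compatibility constant $\kappa$) are illustrative and are not taken from the paper. In the direct regression setting, the Lasso solves a penalized least-squares problem, and its standard rate guarantees require a compatibility constant of restricted-eigenvalue type to stay bounded away from zero,
\[
  \hat{\theta} \in \arg\min_{\theta} \frac{1}{n}\,\|y - \Phi\theta\|_2^2 + \lambda\|\theta\|_1,
  \qquad
  \kappa^2(S) = \min\Bigl\{ \frac{s\,\|\Phi\theta\|_2^2}{n\,\|\theta_S\|_1^2} :
  \theta \neq 0,\ \|\theta_{S^c}\|_1 \le 3\,\|\theta_S\|_1 \Bigr\} > 0,
\]
where $S$ is a candidate support set of size $s$. In the inverse problem, the data involve the images of the dictionary functions under the operator rather than the functions themselves, and because the operator is contracting, the analogous constant computed from the transformed dictionary is typically not bounded away from zero.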

In what follows, we bypass this difficulty via a novel approach based on inverting each of the dictionary functions and matching the resulting expansion to the true function, thus avoiding unrealistic assumptions on the dictionary and using the Lasso in a predictive setting. We examine both the white noise and the observational model formulations, and also discuss how exact inverse images of the dictionary functions can be replaced by their approximate counterparts. Furthermore, we show how the suggested methodology can be extended to the problem of estimating a mixing density in a continuous mixture. For all the situations listed above, we provide sharp oracle inequalities for the risk in a non-asymptotic setting.
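A minimal sketch of the inversion-and-matching idea in the white noise formulation may be useful here; all notation (the adjoint $A^{*}$, the inverse images $\psi_k$, the noise level $\varepsilon$, the Wiener process $W$) is supplied for illustration rather than quoted from the paper. If, for every dictionary function $\varphi_k$, one can find $\psi_k$ with $A^{*}\psi_k = \varphi_k$, then the coefficients of the unknown function $f$ against the dictionary can be estimated directly from the data,
\[
  \langle f, \varphi_k \rangle = \langle f, A^{*}\psi_k \rangle = \langle Af, \psi_k \rangle
  \approx \int \psi_k \, dY,
  \qquad
  dY(t) = (Af)(t)\,dt + \varepsilon\, dW(t).
\]
Since $\|f - \sum_k \theta_k \varphi_k\|^2$ differs from $\|\sum_k \theta_k \varphi_k\|^2 - 2\sum_k \theta_k \langle f, \varphi_k \rangle$ only by a term free of $\theta$, a Lasso-type program can then be run in the predictive form
\[
  \hat{\theta} \in \arg\min_{\theta}
  \Bigl\| \sum_k \theta_k \varphi_k \Bigr\|^2
  - 2 \sum_k \theta_k \int \psi_k \, dY
  + \lambda \sum_k |\theta_k|,
\]
whose analysis places conditions on the original dictionary $\{\varphi_k\}$ rather than on its images under the operator. One would also expect the penalty to carry coefficient-specific weights reflecting the norms $\|\psi_k\|$, since the variance of $\int \psi_k \, dY$ scales as $\varepsilon^2 \|\psi_k\|^2$.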

Article information

Source
Ann. Statist., Volume 44, Number 4 (2016), 1739-1764.

Dates
Received: March 2015
Revised: January 2016
First available in Project Euclid: 7 July 2016

Permanent link to this document
https://projecteuclid.org/euclid.aos/1467894714

Digital Object Identifier
doi:10.1214/16-AOS1445

Mathematical Reviews number (MathSciNet)
MR3519939

Zentralblatt MATH identifier
1346.62061

Subjects
Primary: 62G05: Estimation
Secondary: 62C10: Bayesian problems; characterization of Bayes procedures

Keywords
Linear inverse problems; Lasso; adaptive estimation; oracle inequality

Citation

Pensky, Marianna. Solution of linear ill-posed problems using overcomplete dictionaries. Ann. Statist. 44 (2016), no. 4, 1739–1764. doi:10.1214/16-AOS1445. https://projecteuclid.org/euclid.aos/1467894714

