Electronic Journal of Statistics

Estimation of Kullback-Leibler losses for noisy recovery problems within the exponential family

Charles-Alban Deledalle


Abstract

We address the question of estimating Kullback-Leibler losses rather than squared losses in recovery problems where the noise is distributed within the exponential family. Inspired by Stein's unbiased risk estimator (SURE), we exhibit conditions under which these losses can be estimated unbiasedly or with a controlled bias. Simulations on parameter selection problems in image denoising and variable selection under Gamma and Poisson noise illustrate the relevance of Kullback-Leibler losses and of the proposed estimators.
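As background for the abstract's reference to SURE, here is a minimal numerical sketch (not taken from the paper; the soft-thresholding estimator, signal model, and threshold grid are illustrative assumptions) of SURE-based parameter selection in the classical Gaussian setting:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 10_000, 1.0
mu = rng.normal(0.0, 2.0, n)             # unknown means (illustrative signal model)
y = mu + sigma * rng.standard_normal(n)  # noisy observations

def soft(y, t):
    """Soft-thresholding estimator of mu with threshold t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def sure_soft(y, t, sigma):
    """Stein's unbiased estimate of E||soft(y, t) - mu||^2:
    ||soft(y, t) - y||^2 - n*sigma^2 + 2*sigma^2*df,
    where df = #{|y_i| > t} is the divergence (degrees of freedom)
    of soft-thresholding."""
    df = np.count_nonzero(np.abs(y) > t)
    residual = np.sum(np.minimum(np.abs(y), t) ** 2)  # ||soft(y, t) - y||^2
    return residual - y.size * sigma**2 + 2 * sigma**2 * df

ts = np.linspace(0.0, 3.0, 61)
sure = np.array([sure_soft(y, t, sigma) for t in ts])
loss = np.array([np.sum((soft(y, t) - mu) ** 2) for t in ts])  # oracle loss
t_sure, t_oracle = ts[np.argmin(sure)], ts[np.argmin(loss)]
```

In this Gaussian toy setting, SURE tracks the true squared loss without access to mu, so minimizing it over t selects a near-oracle threshold; the paper develops analogous loss estimators for Kullback-Leibler losses under exponential-family noise such as Gamma and Poisson.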

Article information

Source
Electron. J. Statist. Volume 11, Number 2 (2017), 3141–3164.

Dates
Received: May 2016
First available in Project Euclid: 29 August 2017

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1503972028

Digital Object Identifier
doi:10.1214/17-EJS1321

Subjects
Primary: 62G05 (Estimation), 62F10 (Point estimation)
Secondary: 62J12 (Generalized linear models)

Keywords
Stein unbiased risk estimator, model selection, Kullback-Leibler divergence, exponential family

Rights
Creative Commons Attribution 4.0 International License.

Citation

Deledalle, Charles-Alban. Estimation of Kullback-Leibler losses for noisy recovery problems within the exponential family. Electron. J. Statist. 11 (2017), no. 2, 3141–3164. doi:10.1214/17-EJS1321. https://projecteuclid.org/euclid.ejs/1503972028.



References

  • [1] Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In Second International Symposium on Information Theory 1 267–281. Springer Verlag.
  • [2] Blu, T. and Luisier, F. (2007). The SURE-LET approach to image denoising. IEEE Trans. Image Process. 16 2778–2786.
  • [3] Brown, L. D. (1986). Fundamentals of statistical exponential families with applications in statistical decision theory. Lecture Notes–Monograph Series i–279.
  • [4] Buades, A., Coll, B. and Morel, J. M. (2005). A Review of Image Denoising Algorithms, with a New One. Multiscale Modeling and Simulation 4 490.
  • [5] Cai, T. T. and Zhou, H. H. (2009). A data-driven block thresholding approach to wavelet estimation. The Annals of Statistics 37 569–595.
  • [6] Chaux, C., Duval, L., Benazza-Benyahia, A. and Pesquet, J.-C. (2008). A nonlinear Stein-based estimator for multichannel image denoising. IEEE Trans. on Signal Processing 56 3855–3870.
  • [7] Chen, L. H. Y. (1975). Poisson approximation for dependent trials. The Annals of Probability 3 534–545.
  • [8] Deledalle, C.-A., Denis, L. and Tupin, F. (2012). How to compare noisy patches? Patch similarity beyond Gaussian noise. International J. of Computer Vision 99 86–102.
  • [9] Deledalle, C.-A., Duval, V. and Salmon, J. (2011). Non-local Methods with Shape-Adaptive Patches (NLM-SAP). J. of Mathematical Imaging and Vision 1–18.
  • [10] Deledalle, C.-A., Vaiter, S., Fadili, J. and Peyré, G. (2014). Stein Unbiased GrAdient estimator of the Risk (SUGAR) for multiple parameter selection. SIAM J. Imaging Sci. 7 2448–2487.
  • [11] Donoho, D. L. and Johnstone, I. M. (1995). Adapting to Unknown Smoothness Via Wavelet Shrinkage. J. of the American Statistical Association 90 1200–1224.
  • [12] Duval, V., Aujol, J.-F. and Gousseau, Y. (2011). A bias-variance approach for the Non-Local Means. SIAM J. Imaging Sci. 4 760–788.
  • [13] Efron, B. (1986). How biased is the apparent error rate of a prediction rule? J. of the American Statistical Association 81 461–470.
  • [14] Eldar, Y. C. (2009). Generalized SURE for exponential families: Applications to regularization. IEEE Trans. Signal Process. 57 471–481.
  • [15] Eldar, Y. C. and Mishali, M. (2009). Robust recovery of signals from a structured union of subspaces. IEEE Trans. on Information Theory 55 5302–5316.
  • [16] Evans, L. C. and Gariepy, R. F. (1992). Measure Theory and Fine Properties of Functions. CRC Press.
  • [17] George, E. I., Liang, F. and Xu, X. (2006). Improved minimax predictive densities under Kullback-Leibler loss. The Annals of Statistics 78–91.
  • [18] Gilbarg, D. and Trudinger, N. S. (1998). Elliptic Partial Differential Equations of Second Order, 2nd ed. Classics in Mathematics 517. Springer.
  • [19] Girard, A. (1989). A fast Monte-Carlo cross-validation procedure for large least squares problems with noisy data. Numerische Mathematik 56 1–23.
  • [20] Golub, G. H., Heath, M. and Wahba, G. (1979). Generalized cross-validation as a method for choosing a good ridge parameter. Technometrics 215–223.
  • [21] Goodman, J. W. (1976). Some fundamental properties of speckle. J. of the Optical Society of America 66 1145–1150.
  • [22] Hall, P. (1987). On Kullback-Leibler loss and density estimation. The Annals of Statistics 1491–1519.
  • [23] Hamada, M. and Valdez, E. A. (2008). CAPM and option pricing with elliptically contoured distributions. J. of Risk and Insurance 75 387–409.
  • [24] Hannig, J. and Lee, T. (2004). Kernel smoothing of periodograms under Kullback–Leibler discrepancy. Signal Processing 84 1255–1266.
  • [25] Hannig, J. and Lee, T. (2006). On Poisson signal estimation under Kullback–Leibler discrepancy and squared risk. J. of Statistical Planning and Inference 136 882–908.
  • [26] Hudson, H. M. (1978). A natural identity for exponential families with applications in multiparameter estimation. The Annals of Statistics 6 473–484.
  • [27] Kullback, S. and Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics 79–86.
  • [28] Landsman, Z. and Nešlehová, J. (2008). Stein's Lemma for elliptical random vectors. J. of Multivariate Analysis 99 912–927.
  • [29] Lehmann, E. (1983). Theory of Point Estimation. Wiley.
  • [30] Li, K.-C. (1985). From Stein's unbiased risk estimates to the method of generalized cross validation. The Annals of Statistics 13 1352–1377.
  • [31] Luisier, F. (2010). The SURE-LET approach to image denoising. PhD thesis, École Polytechnique Fédérale de Lausanne.
  • [32] Luisier, F., Blu, T. and Unser, M. (2010). SURE-LET for orthonormal wavelet-domain video denoising. IEEE Trans. on Circuits and Systems for Video Technology 20 913–919.
  • [33] Luisier, F., Blu, T. and Wolfe, P. J. (2012). A CURE for noisy magnetic resonance images: Chi-square unbiased risk estimation. IEEE Trans. on Image Processing 21 3454–3466.
  • [34] Lv, J. and Liu, J. S. (2014). Model selection principles in misspecified models. J. of the Royal Statistical Society: Series B (Statistical Methodology) 76 141–167.
  • [35] Mallows, C. L. (1973). Some Comments on Cp. Technometrics 15 661–675.
  • [36] Morris, C. N. (1982). Natural exponential families with quadratic variance functions. The Annals of Statistics 65–80.
  • [37] Pesquet, J.-C., Benazza-Benyahia, A. and Chaux, C. (2009). A SURE Approach for Digital Signal/Image Deconvolution Problems. IEEE Trans. on Signal Processing 57 4616–4632.
  • [38] Ramani, S., Blu, T. and Unser, M. (2008). Monte-Carlo SURE: a black-box optimization of regularization parameters for general denoising algorithms. IEEE Trans. Image Process. 17 1540–1554.
  • [39] Ramani, S., Liu, Z., Rosen, J., Nielsen, J.-F. and Fessler, J. A. (2012). Regularization parameter selection for nonlinear iterative image restoration and MRI reconstruction using GCV and SURE-based methods. IEEE Trans. on Image Processing 21 3659–3672.
  • [40] Raphan, M. and Simoncelli, E. P. (2007). Learning to be Bayesian without supervision. In Advances in Neural Inf. Process. Syst. (NIPS) 19 1145–1152. MIT Press.
  • [41] Rigollet, P. (2012). Kullback–Leibler aggregation and misspecified generalized linear models. The Annals of Statistics 40 639–665.
  • [42] Schwarz, G. (1978). Estimating the dimension of a model. The Annals of Statistics 6 461–464.
  • [43] Stein, C. M. (1981). Estimation of the Mean of a Multivariate Normal Distribution. The Annals of Statistics 9 1135–1151.
  • [44] Tibshirani, R. (1996). Regression shrinkage and selection via the Lasso. J. of the Royal Statistical Society. Series B. Methodological 58 267–288.
  • [45] Vaiter, S., Deledalle, C.-A., Fadili, J., Peyré, G. and Dossal, C. (2017). The Degrees of Freedom of Partly Smooth Regularizers. Annals of the Institute of Statistical Mathematics 69 791–832.
  • [46] Van De Ville, D. and Kocher, M. (2009). SURE-Based Non-Local Means. IEEE Signal Process. Lett. 16 973–976.
  • [47] Van De Ville, D. and Kocher, M. (2011). Non-local means with dimensionality reduction and SURE-based parameter selection. IEEE Trans. Image Process. 9 2683–2690.
  • [48] Yanagimoto, T. (1994). The Kullback-Leibler risk of the Stein estimator and the conditional MLE. Annals of the Institute of Statistical Mathematics 46 29–41.