Electronic Journal of Statistics

Estimation of Kullback-Leibler losses for noisy recovery problems within the exponential family

Charles-Alban Deledalle

Full-text: Open access

Abstract

We address the question of estimating Kullback-Leibler losses rather than squared losses in recovery problems where the noise is distributed within the exponential family. Inspired by Stein's unbiased risk estimator (SURE), we exhibit conditions under which these losses can be estimated unbiasedly or with a controlled bias. Simulations on parameter selection problems in applications to image denoising and variable selection with Gamma and Poisson noises illustrate the relevance of Kullback-Leibler losses and the proposed estimators.
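As a minimal illustration of the kind of identity underlying such SURE-like estimators in the exponential family, the sketch below checks Hudson's identity for the Poisson case by Monte Carlo: for Y ~ Poisson(μ) and any suitable test function f, E[μ f(Y)] = E[Y f(Y − 1)]. This identity lets one replace the unknown mean μ by an expression in the observation Y alone, which is the mechanism that makes unbiased loss estimation possible without access to the clean signal. The choice of f and the parameter values here are illustrative, not taken from the paper.

```python
import numpy as np

# Monte Carlo check of Hudson's identity for the Poisson distribution:
#   E[mu * f(Y)] = E[Y * f(Y - 1)]   for Y ~ Poisson(mu).
# Values of mu, f, and the sample size are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu = 3.0
y = rng.poisson(mu, size=1_000_000).astype(float)

def f(t):
    # Any test function with finite moments works; t**2 keeps the
    # exact expectations easy to verify by hand (both sides equal 36).
    return t ** 2

lhs = mu * f(y).mean()        # E[mu * f(Y)], uses the unknown parameter mu
rhs = (y * f(y - 1)).mean()   # E[Y * f(Y-1)], computable from the data alone

print(f"E[mu f(Y)]   ~ {lhs:.3f}")
print(f"E[Y f(Y-1)]  ~ {rhs:.3f}")
```

The right-hand side depends only on the observed samples, so it can serve as an unbiased surrogate for the left-hand side in a risk or loss estimator, which is the spirit of the estimators studied in the paper.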

Article information

Source
Electron. J. Statist. Volume 11, Number 2 (2017), 3141-3164.

Dates
Received: May 2016
First available in Project Euclid: 29 August 2017

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1503972028

Digital Object Identifier
doi:10.1214/17-EJS1321

Subjects
Primary: 62G05: Estimation; 62F10: Point estimation
Secondary: 62J12: Generalized linear models

Keywords
Stein unbiased risk estimator; model selection; Kullback-Leibler divergence; exponential family

Rights
Creative Commons Attribution 4.0 International License.

Citation

Deledalle, Charles-Alban. Estimation of Kullback-Leibler losses for noisy recovery problems within the exponential family. Electron. J. Statist. 11 (2017), no. 2, 3141--3164. doi:10.1214/17-EJS1321. https://projecteuclid.org/euclid.ejs/1503972028
