Open Access
Empirical Bayes selection of wavelet thresholds
Iain M. Johnstone, Bernard W. Silverman
Ann. Statist. 33(4): 1700-1752 (August 2005). DOI: 10.1214/009053605000000345

Abstract

This paper explores a class of empirical Bayes methods for level-dependent threshold selection in wavelet shrinkage. The prior considered for each wavelet coefficient is a mixture of an atom of probability at zero and a heavy-tailed density. The mixing weight, or sparsity parameter, for each level of the transform is chosen by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold. Details of the calculations needed for implementing the procedure are included. In practice, the estimates are quick to compute and software is available. Simulations on the standard model functions show excellent performance, and applications to data drawn from a variety of fields are used to explore the practical performance of the approach.
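
As a concrete illustration of the single-sequence machinery just described, here is a minimal sketch in Python, assuming unit noise variance, a Laplace density with scale a = 0.5 as the heavy-tailed component (one fixed illustrative choice; the paper treats this more generally), and hypothetical function names. It is a sketch under those assumptions, not the authors' implementation (for that, see their EbayesThresh software).

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.special import expit, logsumexp
    from scipy.stats import norm

    A = 0.5  # scale of the heavy-tailed (Laplace) component; illustrative only

    def log_g(x, a=A):
        # log density of X = theta + N(0, 1) noise when theta ~ Laplace(a)
        x = np.asarray(x, dtype=float)
        terms = np.stack([-a * x + norm.logcdf(x - a),
                           a * x + norm.logcdf(-x - a)])
        return np.log(a / 2) + a ** 2 / 2 + logsumexp(terms, axis=0)

    def mml_weight(x, a=A):
        # sparsity parameter w chosen by marginal maximum likelihood
        lg, lphi = log_g(x, a), norm.logpdf(x)

        def neg_loglik(w):
            return -np.sum(np.logaddexp(np.log1p(-w) + lphi, np.log(w) + lg))

        return minimize_scalar(neg_loglik, bounds=(1e-8, 1 - 1e-8),
                               method='bounded').x

    def posterior_median(x, w, a=A):
        # posterior median of theta given x: a genuine thresholding rule
        s, x = np.sign(x), np.abs(x)              # the rule is antisymmetric
        # posterior probability that theta is nonzero
        p1 = expit(np.log(w / (1 - w)) + log_g(x, a) - norm.logpdf(x))
        # given theta != 0 the posterior is a mixture of two truncated
        # normals: N(x - a, 1) on (0, inf) and N(x + a, 1) on (-inf, 0)
        lc_plus = -a * x + norm.logcdf(x - a)
        lc_minus = a * x + norm.logcdf(-x - a)
        mass = p1 * expit(lc_plus - lc_minus)     # posterior mass above zero
        # the median is zero unless more than half the mass lies above zero
        ratio = np.clip((mass - 0.5) / np.maximum(mass, 1e-300), 0.0, 1.0)
        m = (x - a) + norm.ppf(norm.cdf(a - x) + norm.cdf(x - a) * ratio)
        return s * np.where(mass > 0.5, m, 0.0)

Applied to the coefficients at one resolution level, mml_weight yields that level's sparsity weight, and posterior_median(x, w) then thresholds the coefficients.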

Using a general result on the risk of the corresponding marginal maximum likelihood approach for a single sequence, overall bounds on the risk of the method are found, subject only to membership of the unknown function in one of a wide range of Besov classes, including the case of f of bounded variation. The rates obtained are optimal for any value of the parameter p in (0,∞], simultaneously for a wide range of loss functions, each dominating the Lq norm of the σth derivative, with σ≥0 and 0<q≤2.
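
To fix ideas, a schematic instance of the kind of bound meant here, in the special case σ = 0 and q = 2 (squared L2 loss) in the white noise model at noise level ε, would read as follows in LaTeX; the precise smoothness conditions, constants and the full range of losses are as in the paper:

    \sup_{f \in B^{\alpha}_{p,q'}(C)}
        \mathbb{E}\,\bigl\| \hat{f} - f \bigr\|_2^2
        \;\le\; C_1(\alpha, p, C)\, \varepsilon^{4\alpha/(2\alpha+1)} .

Here α is the smoothness index of the Besov ball and q' its third index; "optimal" means the exponent is the minimax one, with no logarithmic penalty of the kind incurred by universal thresholding.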

Attention is paid to the distinction between sampling the unknown function within white noise and sampling at discrete points, and between placing constraints on the function itself and on the discrete wavelet transform of its sequence of values at the observation points. Results for all relevant combinations of these scenarios are obtained. In some cases a key feature of the theory is a particular boundary-corrected wavelet basis, details of which are discussed.
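
In the discrete-sampling scenario the procedure reduces to transforming the sampled values, treating each resolution level as a single sequence, and inverting the transform. The sketch below assumes the helper functions from the earlier block; it uses PyWavelets with a periodized 'sym8' filter as a stand-in for the boundary-corrected basis discussed in the paper (which PyWavelets does not provide), and estimates the noise level by the usual median-absolute-deviation device applied to the finest level.

    import numpy as np
    import pywt  # PyWavelets; mml_weight, posterior_median as sketched above

    def ebayes_smooth(y, wavelet='sym8', level=None):
        # level-dependent empirical Bayes shrinkage of a sampled signal y
        coeffs = pywt.wavedec(y, wavelet, mode='periodization', level=level)
        # robust noise estimate from the finest-scale detail coefficients
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        shrunk = [coeffs[0]]                   # coarse coefficients untouched
        for d in coeffs[1:]:
            z = d / sigma                      # standardize to unit noise
            w = mml_weight(z)                  # this level's sparsity weight
            shrunk.append(sigma * posterior_median(z, w))
        return pywt.waverec(shrunk, wavelet, mode='periodization')[:len(y)]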

Overall, the approach described appears so far to be unique in combining fast computation, good theoretical properties and good performance in simulations and in practice. A key feature appears to be that the estimate of sparsity adapts to three different zones of estimation: first, where the signal is not sparse enough for thresholding to be of benefit; second, where an appropriately chosen threshold results in substantially improved estimation; and third, where the signal is so sparse that the zero estimate gives the optimal rate of accuracy.
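
This adaptation is easy to see in a toy experiment with the hypothetical mml_weight function from the first sketch: as the proportion of nonzero means falls, the estimated weight falls with it, moving the induced threshold from near zero (little shrinkage), through a moderate value (genuine thresholding), toward the bound at which the estimate is essentially zero everywhere.

    import numpy as np  # mml_weight as sketched above

    rng = np.random.default_rng(0)
    n = 1000
    for k in (900, 50, 0):                 # dense, sparse and empty signals
        theta = np.zeros(n)
        theta[:k] = 4.0                    # k nonzero means of size 4
        x = theta + rng.standard_normal(n)
        print(f"nonzeros = {k:3d}   estimated weight = {mml_weight(x):.4f}")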

Citation


Iain M. Johnstone, Bernard W. Silverman. "Empirical Bayes selection of wavelet thresholds." Ann. Statist. 33(4): 1700-1752, August 2005. https://doi.org/10.1214/009053605000000345

Information

Published: August 2005
First available in Project Euclid: 5 August 2005

zbMATH: 1078.62005
MathSciNet: MR2166560
Digital Object Identifier: 10.1214/009053605000000345

Subjects:
Primary: 62C12, 62G08
Secondary: 62G20, 62H35, 65T60

Keywords: adaptivity, Bayesian inference, nonparametric regression, smoothing, sparsity

Rights: Copyright © 2005 Institute of Mathematical Statistics
