Electronic Journal of Statistics

A deconvolution path for mixtures

Oscar-Hernan Madrid-Padilla, Nicholas G. Polson, and James Scott

Full-text: Open access

Abstract

We propose a class of estimators for deconvolution in mixture models based on a simple two-step “bin-and-smooth” procedure applied to histogram counts. The method is both statistically and computationally efficient: by exploiting recent advances in convex optimization, we are able to provide a full deconvolution path that shows the estimate of the mixing distribution across a range of plausible degrees of smoothness, at far less cost than a full Bayesian analysis. This enables practitioners to conduct a sensitivity analysis with minimal effort, which is especially important in applied data analysis given the ill-posed nature of the deconvolution problem. Our results establish the favorable theoretical properties of our estimator and show that it offers state-of-the-art performance compared with benchmark methods across a range of scenarios.
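To make the two-step idea concrete, the following is a minimal sketch of a generic "bin-and-smooth" deconvolution path, not the paper's estimator: the noise model (standard Gaussian), the ridge-type penalized least-squares fit, the second-difference smoothness penalty, and the penalty grid are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1 ("bin"): histogram counts of noisy observations y_i = mu_i + e_i.
mu = rng.choice([-2.0, 0.0, 2.0], size=2000)      # draws from a toy mixing distribution
y = mu + rng.normal(scale=1.0, size=mu.size)      # Gaussian measurement noise
edges = np.linspace(-6.0, 6.0, 61)
counts, _ = np.histogram(y, bins=edges)
mids = 0.5 * (edges[:-1] + edges[1:])
h = edges[1] - edges[0]

# Discretized convolution: observed density = Gaussian kernel applied to mixing density.
K = np.exp(-0.5 * (mids[:, None] - mids[None, :]) ** 2) / np.sqrt(2 * np.pi) * h

# Second-difference penalty matrix encouraging a smooth mixing-density estimate.
D = np.diff(np.eye(mids.size), n=2, axis=0)

# Step 2 ("smooth"): penalized least-squares deconvolution, solved over a grid of
# penalty levels lambda to trace out a deconvolution path.
f_obs = counts / (counts.sum() * h)               # empirical density of the observations
path = {}
for lam in [0.1, 1.0, 10.0]:
    g = np.linalg.solve(K.T @ K + lam * D.T @ D, K.T @ f_obs)
    g = np.clip(g, 0.0, None)                     # crude nonnegativity correction
    g /= g.sum() * h                              # renormalize to a density
    path[lam] = g
```

Inspecting `path` across the grid of penalty levels is the sensitivity analysis the abstract describes: small penalties yield rougher mixing-density estimates, large penalties smoother ones.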

Article information

Source
Electron. J. Statist., Volume 12, Number 1 (2018), 1717-1751.

Dates
Received: May 2017
First available in Project Euclid: 29 May 2018

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1527559246

Digital Object Identifier
doi:10.1214/18-EJS1430

Mathematical Reviews number (MathSciNet)
MR3806437

Zentralblatt MATH identifier
06886382

Subjects
Primary: 62G05: Estimation
Secondary: 62G07: Density estimation

Keywords
Deconvolution; mixture models; penalized likelihood; empirical Bayes; sensitivity analysis

Rights
Creative Commons Attribution 4.0 International License.

Citation

Madrid-Padilla, Oscar-Hernan; Polson, Nicholas G.; Scott, James. A deconvolution path for mixtures. Electron. J. Statist. 12 (2018), no. 1, 1717--1751. doi:10.1214/18-EJS1430. https://projecteuclid.org/euclid.ejs/1527559246

