Electronic Journal of Statistics

Scalable methods for Bayesian selective inference

Snigdha Panigrahi and Jonathan Taylor

Full-text: Open access

Abstract

Modeled on the truncated-likelihood approach of [20], selection-adjusted inference in a Bayesian framework is based on a selective posterior. This posterior is determined jointly by a generative model imposed on the data and by the selection event, which truncates the assumed law. The effective difference between the selective posterior and the usual Bayesian framework lies in the use of a truncated likelihood. The normalizer of the truncated law in the adjusted framework is the probability of the selection event; this typically lacks a closed-form expression, creating the computational bottleneck in sampling from such a posterior. The current work provides an optimization problem that approximates the otherwise intractable selective posterior and leads to scalable methods for valid post-selective Bayesian inference. The selection procedures are posed as data queries that solve a randomized version of a convex learning program; the randomization has the advantage of preserving more leftover information for inference.
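In symbols, the selective posterior described above takes the form (notation ours, condensing the description in this abstract: $\pi$ is the prior, $f(y \mid \beta)$ the generative model, $\omega$ the randomization, and $A$ the observed outcome of the selection rule $\hat{A}$):

$$\pi_S(\beta \mid y) \;\propto\; \pi(\beta)\,\frac{f(y \mid \beta)}{\mathbb{P}_\beta\{\hat{A}(Y, \omega) = A\}},$$

so the denominator, the probability of the selection event under parameter $\beta$, is precisely the normalizer that lacks a closed form and that the approximating optimization targets.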

We propose a randomization scheme under which the approximating optimization has separable constraints, yielding a partially separable objective in lower dimensions for many commonly used selective queries. We show that, under a Gaussian randomization scheme, the proposed optimization gives a valid exponential rate of decay for the selection probability on the large-deviation scale. On the implementation side, we offer a primal-dual method for solving the optimization problem, leading to an approximate posterior; this allows us to exploit the usual merits of the Bayesian machinery in both low- and high-dimensional regimes when the underlying signal is effectively sparse. Applied to a wide range of constrained, convex data queries, the adjusted estimates empirically demonstrate better frequentist properties than the unadjusted estimates based on the usual posterior.
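As a minimal one-dimensional sketch of this idea (our own toy setup, not the authors' implementation or their primal-dual solver): a single observation y ~ N(beta, 1) is reported only when a Gaussian-randomized version exceeds a threshold; the intractable -log selection probability is replaced by the minimized Gaussian rate over the selection region, in the spirit of the large-deviation approximation above, and a Metropolis chain samples the resulting approximate selective posterior. All variable names and constants below are illustrative.

import numpy as np
from scipy.optimize import minimize_scalar

# Toy model: y ~ N(beta, 1); the result is reported only if y + omega > t,
# where omega ~ N(0, gamma2) is an independent Gaussian randomization.
y_obs, t, gamma2 = 2.1, 2.0, 1.0
sigma2_sel = 1.0 + gamma2  # variance of Y + omega given beta

def approx_neg_log_selection_prob(beta):
    # Large-deviation surrogate for -log P_beta(Y + omega > t):
    # minimize the Gaussian rate (u - beta)^2 / (2 * sigma2_sel) over the
    # selection region {u > t}; it is ~0 when beta already lies inside it.
    res = minimize_scalar(lambda u: (u - beta) ** 2 / (2.0 * sigma2_sel),
                          bounds=(t, t + 20.0), method="bounded")
    return res.fun

def log_selective_posterior(beta, prior_var=10.0):
    # prior x likelihood / P_beta(selection), on the log scale, with the
    # intractable normalizer replaced by its optimization-based surrogate.
    return (-beta ** 2 / (2.0 * prior_var)
            - (y_obs - beta) ** 2 / 2.0
            + approx_neg_log_selection_prob(beta))

# Random-walk Metropolis on the approximate selective posterior.
rng = np.random.default_rng(0)
beta = y_obs
logp = log_selective_posterior(beta)
chain = []
for _ in range(5000):
    prop = beta + 0.5 * rng.standard_normal()
    logp_prop = log_selective_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        beta, logp = prop, logp_prop
    chain.append(beta)

print("adjusted posterior mean:", np.mean(chain[1000:]))

Because the surrogate rate grows as beta falls below the threshold, dividing by the (small) selection probability inflates the posterior there, pulling the adjusted estimate below the unadjusted posterior mean and offsetting the selection-induced (winner's curse) bias.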

Article information

Source
Electron. J. Statist., Volume 12, Number 2 (2018), 2355-2400.

Dates
Received: September 2017
First available in Project Euclid: 25 July 2018

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1532484333

Digital Object Identifier
doi:10.1214/18-EJS1452

Keywords
Approximate posterior; Bayesian inference; randomized queries; selective posterior; truncated likelihood

Rights
Creative Commons Attribution 4.0 International License.

Citation

Panigrahi, Snigdha; Taylor, Jonathan. Scalable methods for Bayesian selective inference. Electron. J. Statist. 12 (2018), no. 2, 2355–2400. doi:10.1214/18-EJS1452. https://projecteuclid.org/euclid.ejs/1532484333



References

  • [1] Francois Aguet, Andrew A. Brown, Stephane Castel, Joe R. Davis, Pejman Mohammadi, Ayellet V. Segre, Zachary Zappala, Nathan S. Abell, Laure Fresard, Eric R. Gamazon, et al. Local genetic effects on gene expression across 44 human tissues. bioRxiv 074450, 2016.
  • [2] Sio Iong Ao, Kevin Yip, Michael Ng, David Cheung, Pui-Yee Fong, Ian Melhado, and Pak C. Sham. CLUSTAG: hierarchical clustering and graph methods for selecting tag SNPs. Bioinformatics, 21(8):1735–1736, 2004.
  • [3] Jacob Bien and Robert Tibshirani. Hierarchical clustering with prototypes via minimax linkage. Journal of the American Statistical Association, 106(495):1075–1084, 2011.
  • [4] Kamalika Chaudhuri and Claire Monteleoni. Privacy-preserving logistic regression. In Advances in Neural Information Processing Systems, pages 289–296, 2009.
  • [5] Kamalika Chaudhuri, Claire Monteleoni, and Anand D. Sarwate. Differentially private empirical risk minimization. Journal of Machine Learning Research, 12(Mar):1069–1109, 2011.
  • [6] GTEx Consortium et al. The Genotype-Tissue Expression (GTEx) pilot analysis: multitissue gene regulation in humans. Science, 348(6235):648–660, 2015.
  • [7] Amir Dembo and Ofer Zeitouni. Large Deviations Techniques and Applications, second edition. Applications of Mathematics, 38. Springer, 1998.
  • [8] Cynthia Dwork, Vitaly Feldman, Moritz Hardt, Toniann Pitassi, Omer Reingold, and Aaron Leon Roth. Preserving statistical validity in adaptive data analysis. In Proceedings of the Forty-Seventh Annual ACM Symposium on Theory of Computing, pages 117–126. ACM, 2015.
  • [9] William Fithian, Dennis Sun, and Jonathan Taylor. Optimal inference after model selection. arXiv preprint arXiv:1410.2597, 2014.
  • [10] Edward I. George and Robert E. McCulloch. Approaches for Bayesian variable selection. Statistica Sinica, pages 339–373, 1997.
  • [11] Matthew D. Hoffman, David M. Blei, Chong Wang, and John William Paisley. Stochastic variational inference. Journal of Machine Learning Research, 14(1):1303–1347, 2013.
  • [12] J. T. Gene Hwang and Zhigen Zhao. Empirical Bayes confidence intervals for selected parameters in high-dimensional data. Journal of the American Statistical Association, 108(502):607–618, 2013.
  • [13] Jason D. Lee, Dennis L. Sun, Yuekai Sun, and Jonathan E. Taylor. Exact post-selection inference with the lasso. The Annals of Statistics, 44(3):907–927, 2016. URL http://projecteuclid.org/euclid.aos/1460381681.
  • [14] Joshua R. Loftus and Jonathan E. Taylor. A significance test for forward stepwise model selection. arXiv preprint arXiv:1405.3920, 2014. URL http://xxx.tau.ac.il/abs/1405.3920v1.
  • [15] Jelena Markovic and Jonathan Taylor. Bootstrap inference after using multiple queries for model selection. arXiv preprint arXiv:1612.07811, 2016.
  • [16] Thomas P. Minka. A Family of Algorithms for Approximate Bayesian Inference. PhD thesis, Massachusetts Institute of Technology, 2001.
  • [17] Toby J. Mitchell and John J. Beauchamp. Bayesian variable selection in linear regression. Journal of the American Statistical Association, 83(404):1023–1032, 1988.
  • [18] Sahand Negahban, Bin Yu, Martin J. Wainwright, and Pradeep K. Ravikumar. A unified framework for high-dimensional analysis of $M$-estimators with decomposable regularizers. In Advances in Neural Information Processing Systems, pages 1348–1356, 2009.
  • [19] Halit Ongen, Alfonso Buil, Andrew Anand Brown, Emmanouil T. Dermitzakis, and Olivier Delaneau. Fast and efficient QTL mapper for thousands of molecular phenotypes. Bioinformatics, 32(10):1479–1485, 2015.
  • [20] Snigdha Panigrahi, Jonathan Taylor, and Asaf Weinstein. Bayesian post-selection inference in the linear model. arXiv preprint arXiv:1605.08824, 2016.
  • [21] Snigdha Panigrahi, Jelena Markovic, and Jonathan Taylor. An MCMC-free approach to post-selective inference. arXiv preprint arXiv:1703.06154, 2017.
  • [22] Stephen Reid and Robert Tibshirani. Sparse regression and marginal testing using cluster prototypes. Biostatistics, 17(2):364–376, 2016.
  • [23] Gareth O. Roberts and Richard L. Tweedie. Exponential convergence of Langevin distributions and their discrete approximations. Bernoulli, pages 341–363, 1996.
  • [24] Jonathan Taylor, Joshua Loftus, and Ryan Tibshirani. Tests in adaptive regression via the Kac-Rice formula. The Annals of Statistics, 44(2):743–770, 2016. URL http://projecteuclid.org/euclid.aos/1458245734.
  • [25] Xiaoying Tian and Jonathan E. Taylor. Selective inference with a randomized response. arXiv preprint arXiv:1507.06739, 2015.
  • [26] Xiaoying Tian, Snigdha Panigrahi, Jelena Markovic, Nan Bi, and Jonathan Taylor. Selective sampling after solving a convex problem. arXiv preprint arXiv:1609.05609, 2016.
  • [27] Ryan Tibshirani, Jonathan Taylor, Richard Lockhart, and Robert Tibshirani. Post-selection adaptive inference for Least Angle Regression and the Lasso. arXiv preprint arXiv:1401.3889, 2014.
  • [28] Ryan J. Tibshirani, Jonathan Taylor, Richard Lockhart, and Robert Tibshirani. Exact post-selection inference for sequential regression procedures. Journal of the American Statistical Association, 111(514):600–620, 2016.
  • [29] Fan Yang, Rina Foygel Barber, Prateek Jain, and John Lafferty. Selective inference for group-sparse linear models. In Advances in Neural Information Processing Systems, pages 2469–2477, 2016.
  • [30] Daniel Yekutieli. Adjusted Bayesian inference for selected parameters. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 74(3):515–541, 2012.
  • [31] Zhigen Zhao and J. T. Gene Hwang. Empirical Bayes false coverage rate controlling confidence intervals. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 74(5):871–891, 2012.
  • [32] Zhigen Zhao and Sanat K. Sarkar. A Bayesian approach to constructing multiple confidence intervals of selected parameters with sparse signals. Statistica Sinica, pages 725–741, 2015.
  • [33] Hui Zou and Trevor Hastie. Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B, 67(2):301–320, 2005. URL http://onlinelibrary.wiley.com/doi/10.1111/j.1467-9868.2005.00503.x/abstract.