Statistical Science

Comment: Bayes, Oracle Bayes and Empirical Bayes

Aad van der Vaart


Article information

Source
Statist. Sci., Volume 34, Number 2 (2019), 214-218.

Dates
First available in Project Euclid: 19 July 2019

Permanent link to this document
https://projecteuclid.org/euclid.ss/1563501635

Digital Object Identifier
doi:10.1214/19-STS707

Mathematical Reviews number (MathSciNet)
MR3983322

Citation

van der Vaart, Aad. Comment: Bayes, Oracle Bayes and Empirical Bayes. Statist. Sci. 34 (2019), no. 2, 214--218. doi:10.1214/19-STS707. https://projecteuclid.org/euclid.ss/1563501635

