Bayesian Analysis

On a Class of Objective Priors from Scoring Rules

Fabrizio Leisen, Cristiano Villa, and Stephen G. Walker

Advance publication

This article is in its final form and can be cited using the date of online publication and the DOI.

Full-text: Open access

Abstract

Objective prior distributions represent an important tool that allows one to enjoy the advantages of a Bayesian framework even when prior information about the parameters of a model is not available. The usual objective approaches work off the chosen statistical model, and in the majority of cases the resulting prior is improper, which can limit practical implementation even when the complexity of the model is moderate. In this paper we take a novel look at the construction of objective prior distributions, in which the connection with a chosen sampling model is removed. We explore the notion of defining objective prior distributions that allow some degree of flexibility, in particular in exhibiting desirable features such as being proper, log-concave, or convex. The basic tools we use are proper scoring rules, and the main result is a class of objective prior distributions that can be employed in scenarios where the usual model-based priors fail, such as mixture models and model selection via Bayes factors. In addition, we show that the proposed class of priors is the result of minimising the information the prior contains, which gives the method a solid interpretation.
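
The LaTeX sketch below makes the abstract's central idea concrete: it writes down one score-based information functional of the kind the paper works with, here using the Hyvärinen (2005) score for a one-dimensional parameter. The symbols S, H, \pi and \Theta, the choice of score, and the single normalisation constraint are illustrative assumptions, not necessarily the exact construction used in the paper.

% Minimal sketch (illustrative assumptions, not the paper's exact construction):
% the Hyvarinen score of a smooth, positive prior density \pi on an open
% \Theta \subseteq \mathbb{R}, and its expected value H(\pi) under \pi.
\[
  S(\theta,\pi) \;=\; \frac{d^{2}}{d\theta^{2}}\log\pi(\theta)
  \;+\; \frac{1}{2}\left(\frac{d}{d\theta}\log\pi(\theta)\right)^{2},
  \qquad
  H(\pi) \;=\; \int_{\Theta} S(\theta,\pi)\,\pi(\theta)\,d\theta.
\]
% If the boundary terms vanish, integration by parts reduces H(\pi) to a
% Fisher-information-type functional of the prior:
\[
  H(\pi) \;=\; -\frac{1}{2}\int_{\Theta}
  \left(\frac{d}{d\theta}\log\pi(\theta)\right)^{2}\pi(\theta)\,d\theta.
\]
% A prior extremising H(\pi) subject to \int_{\Theta}\pi(\theta)\,d\theta = 1
% is then characterised by the Euler--Lagrange equation of the constrained
% calculus-of-variations problem.

Read this way, "minimising the information the prior contains" becomes a variational problem over densities, which is why the keywords below include the calculus of variations, the Euler–Lagrange equation and Fisher information.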

Article information

Source
Bayesian Anal., Advance publication (2020), 25 pages.

Dates
First available in Project Euclid: 7 November 2019

Permanent link to this document
https://projecteuclid.org/euclid.ba/1573117290

Digital Object Identifier
doi:10.1214/19-BA1187

Keywords
calculus of variations; differential entropy; Euler–Lagrange equation; Fisher information; invariance; objective Bayes; proper scoring rules

Rights
Creative Commons Attribution 4.0 International License.

Citation

Leisen, Fabrizio; Villa, Cristiano; Walker, Stephen G. On a Class of Objective Priors from Scoring Rules. Bayesian Anal., advance publication, 7 November 2019. doi:10.1214/19-BA1187. https://projecteuclid.org/euclid.ba/1573117290


References

  • Berger, J. O. (2006). “The case for objective Bayesian analysis.” Bayesian Analysis, 1, 1–17.
  • Berger, J. O., Bernardo, J. M. and Sun, D. (2009). “The formal definition of reference priors.” Annals of Statistics, 37, 905–938.
  • Berger, J. O., Bernardo, J. M. and Sun, D. (2012). “Objective priors for discrete parameter spaces.” Journal of the American Statistical Association, 107, 636–648.
  • Berger, J. O., Bernardo, J. M. and Sun, D. (2015). “Overall objective priors (with discussion)”. Bayesian Analysis, 10, 189–221.
  • Berger, J. O. and Pericchi, L. R. (1996). “The intrinsic Bayes factor for model selection and prediction.” Journal of the American Statistical Association, 91, 109–122.
  • Berger, J. O., Pericchi, L. R., and Varshavsky, J. (1998). “Bayes factors and marginal distributions in invariant situations.” Sankhya, 60, 109–122.
  • Berger, J. O. and Strawderman, W. (1993). “Choice of hierarchical priors: admissibility in estimation of normal means.” Technical report, 93-34C, Purdue University, Dept. of Statistics.
  • Bernardo, J. M. and Smith, A. F. M. (1994). Bayesian Theory. John Wiley & Sons, Inc., Hoboken, NJ, USA. doi: 10.1002/9780470316870.ch1.
  • Bobkov, S. G., Gozlan, N., Roberto, C. and Samson, P. M. (2014). “Bounds on the deficit in the logarithmic Sobolev inequality.” Journal of Functional Analysis, 267, 4110–4138.
  • Box, G. E. P. and Tiao, G. C. (1973). Bayesian Inference in Statistical Analysis. Reading, MA: Addison-Wesley.
  • Consonni, G., Forster, J. J. and La Rocca, L. (2013). “The Whetstone and the alum block: Balanced objective Bayesian comparison of nested models for discrete data.” Statistical Science, 28, 398–423.
  • Consonni, G., Fouskakis, D., Liseo, B. and Ntzoufras, I. (2018). “Prior distributions for objective Bayesian analysis.” Bayesian Analysis, 13, 627–679.
  • Dawid, A. P. (1983). “Invariant prior distributions.” In Encyclopedia of Statistical Sciences, eds. S. Kotz and N. L. Johnson, New York: John Wiley.
  • Dawid, A. P. and Musio, M. (2015). “Bayesian Model Selection based on Proper Scoring Rules (with discussion).” Bayesian Analysis, 10, 479–521.
  • Dawid, A. P., Musio, M. and Columbu, S. (2017). “A Note on Bayesian Model Selection for Discrete Data Using Proper Scoring Rules.” Statistics and Probability Letters, 129, 101–106.
  • Dey, D. K., Gelfand, A. E. and Peng, F. (1993). “Overdispersed generalized linear models.” Technical report, University of Connecticut, Dept. of Statistics.
  • Fonseca, T. C. O., Ferreira, M. A. R. and Migon, H. S. (2008). “Objective Bayesian analysis for the Student-T regression model.” Biometrika, 95, 325–333.
  • Giummolè, F., Mameli, V., Ruli, E. and Ventura, L. (2019). “Objective Bayesian inference with proper scoring rules.” TEST, 28, 728–755.
  • Grazian, C. and Robert, C. P. (2018). “Jeffreys priors for mixture estimation: properties and alternatives.” Computational Statistics and Data Analysis, 121, 149–163.
  • Gronwall, T. H. (1919). “Note on the derivatives with respect to a parameter of the solutions of a system of differential equations.” Annals of Mathematics, 20, 292–296.
  • Hartigan, J. A. (1964). “Invariant prior distributions.” Annals of Mathematical Statistics, 35, 836–845.
  • Hyvärinen, A. (2005). “Estimation of non-normalized statistical models by score matching.” Journal of Machine Learning Research, 6, 695–709.
  • Ibrahim, J. G. and Laud, P. W. (1991). “On Bayesian analysis of generalized linear models using Jeffreys’s prior.” Journal of the American Statistical Association, 86, 981–986.
  • Jaynes, E. T. (1957). “Information theory and statistical mechanics I, II.” Physical Review, 106, 620–630; 108, 171–190.
  • Jaynes, E. T. (1968). “Prior probabilities.” IEEE Transactions on Systems Science and Cybernetics, SSC-4, 227–241.
  • Jeffreys, H. (1946). “An invariant form for the prior probability in estimation problems.” Proceedings of the Royal Society of London, Ser. A, 186, 453–461.
  • Jeffreys, H. (1961). Theory of Probability, 3rd ed., Oxford University Press, Oxford.
  • Johnson, V. E. and Rossell, D. (2010). “On the use of non-local prior densities in Bayesian hypothesis tests.” Journal of the Royal Statistical Society, Series B, 72, 143–170.
  • Kass, R. E. (1990). “Data-translated likelihood and Jeffreys’s rule.” Biometrika, 77, 107–114.
  • Kass, R. E. and Wasserman, L. (1996). “The selection of prior distributions by formal rules.” Journal of the American Statistical Association, 91, 1343–1370.
  • Leisen, F., Villa, C. and Walker, S. G. (2019). “On a Class of Objective Priors from Scoring Rules. Supplementary Material.” Bayesian Analysis.
  • Natarajan, R. and McCulloch, C. E. (1995). “A note on the existence of the posterior distribution for a class of mixed models for binomial responses.” Biometrika, 82, 639–643.
  • O’Hagan, A. (1995). “Fractional Bayes factors for model comparison.” Journal of the Royal Statistical Society, Series B, 57, 99–138.
  • Parry, M., Dawid, A. P. and Lauritzen S. (2012). “Proper local scoring rules.” Annals of Statistics, 40, 561–592.
  • Rissanen, J. (1983). “A universal prior for integers and estimation by minimum description length.” Annals of Statistics, 11, 416–431.
  • Rubio, F. J. and Liseo, B. (2014). “On the independence Jeffreys prior for skew-symmetric models.” Statistics & Probability Letters, 85, 91–97.
  • Rubio, F. J. and Steel, M. F. J. (2018). “Flexible linear mixed models with improper priors for longitudinal and survival data.” Electronic Journal of Statistics, 12, 572–598.
  • Rustagi, J. S. (1976). Variational Methods in Statistics. Academic Press.
  • Stone, M. (1976). “Strong Inconsistency from Uniform Priors.” Journal of the American Statistical Association, 71, 114–116.
  • Stone, M. and Dawid, A. (1972). “Un-Bayesian implications of improper Bayes inference in routine statistical problems.” Biometrika, 59, 369–375.
  • Syversveen, A. R. (1998). “Noninformative Bayesian priors. Interpretation and problems with construction and applications.” Technical Report.
  • Simpson, D., Rue, H., Riebler, A., Martins, T. G. and Sørbye, S. H. (2017). “Penalising model component complexity: a principled, practical approach to constructing priors.” Statistical Science, 32, 1–28.
  • Sweeting, T. J. (2008). “On predictive probability matching priors.” IMS Collections: Pushing the Limits of Contemporary Statistics: Contributions in Honor of Jayanta K. Ghosh, eds. B. Clarke and S. Ghosal. 3, 46–59.
  • Sweeting, T. J., Datta, G. S. and Ghosh, M. (2006). “Nonsubjective priors via predictive relative entropy loss.” Annals of Statistics, 34, 441–468.
  • Titterington, D., Smith, A. and Makov, U. (1985). Statistical Analysis of Finite Mixture Distributions. John Wiley, New York.
  • Vallejos, C. A. and Steel, M. F. J. (2013). “On posterior propriety for the Student-t linear regression model under Jeffreys priors.” arXiv:1311.1454.
  • Villa, C. and Walker, S. G. (2015). “An objective approach to prior mass functions for discrete parameter spaces.” Journal of the American Statistical Association, 110, 1072–1082.
  • Welch, B. L. and Peers, H. W. (1963). “On formulae for confidence points based on integrals of weighted likelihoods.” Journal of the Royal Statistical Society, Series B, 25, 318–329.
  • Yang, R. and Chen, M. H. (1995). “Bayesian analysis for random coefficient regression models using noninformative priors.” Journal of Multivariate Analysis, 55, 283–311.
  • Zellner, A. and Min, C. (1993). “Bayesian analysis, model selection and prediction.” In Physics and Probability: Essays in Honor of Edwin T. Jaynes, eds. W. T. Grandy, Jr. and P. W. Milonni, Cambridge, U.K.: Cambridge University Press.

Supplemental materials

  • Supplement to “On a Class of Objective Priors from Scoring Rules”: the supplementary material contains Appendices A, B, C and D of the paper.