Abstract
Objective prior distributions are an important tool that allows one to retain the advantages of a Bayesian framework even when information about the parameters of a model is not available. The usual objective approaches derive the prior from the chosen statistical model and, in the majority of cases, the resulting prior is improper, which can limit practical implementation even when the complexity of the model is moderate. In this paper we take a novel look at the construction of objective prior distributions, in which the connection with a chosen sampling distribution model is removed. We explore the notion of defining objective prior distributions with some degree of flexibility, in particular priors exhibiting desirable features such as properness, log-concavity, or convexity. The basic tool we use is the proper scoring rule, and the main result is a class of objective prior distributions that can be employed in scenarios where the usual model-based priors fail, such as mixture models and model selection via Bayes factors. In addition, we show that the proposed class of priors results from minimising the information it contains, providing a solid interpretation of the method.
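The abstract names proper scoring rules as the basic tool; as background only (not the authors' construction), the following minimal Python sketch illustrates the defining property of a strictly proper scoring rule using the logarithmic score: the expected score under the true distribution is minimised exactly when the quoted distribution equals the truth. The distributions p and q below are arbitrary illustrative choices.

    import numpy as np

    def expected_log_score(p, q):
        # Expected logarithmic score E_{x~p}[-log q(x)] for categorical p and q.
        return -np.sum(p * np.log(q))

    p = np.array([0.2, 0.5, 0.3])        # "true" distribution (illustrative)
    q_true = p                           # quoting the true distribution
    q_off = np.array([0.3, 0.4, 0.3])    # any other quoted distribution

    # Strict propriety: quoting the truth gives a strictly smaller expected score.
    assert expected_log_score(p, q_true) < expected_log_score(p, q_off)
    print(expected_log_score(p, q_true), expected_log_score(p, q_off))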
Note
BA Webinar: https://www.youtube.com/watch?v=9z1yw_0xcEQ&t=387s
Citation
Fabrizio Leisen, Cristiano Villa, Stephen G. Walker. "On a Class of Objective Priors from Scoring Rules (with Discussion)." Bayesian Anal. 15(4), 1345–1423, December 2020. https://doi.org/10.1214/19-BA1187