Open Access | February 2020
Robust Bayesian model selection for heavy-tailed linear regression using finite mixtures
Flávio B. Gonçalves, Marcos O. Prates, Victor Hugo Lachos
Braz. J. Probab. Stat. 34(1): 51-70 (February 2020). DOI: 10.1214/18-BJPS417

Abstract

In this paper, we present a novel methodology to perform Bayesian model selection in linear models with heavy-tailed distributions. We consider a finite mixture of distributions to model a latent variable in which each component of the mixture corresponds to one possible model within the symmetric class of normal independent distributions; naturally, the Gaussian model is one of the possibilities. This allows for a simultaneous analysis based on the posterior probability of each model. Inference is performed via Markov chain Monte Carlo, using a Gibbs sampler with Metropolis–Hastings steps for a subset of the parameters. Simulated examples highlight the advantages of this approach over a segregated analysis based on arbitrarily chosen model selection criteria. Examples with real data are presented, and an extension to censored linear regression is introduced and discussed.
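To make the general idea concrete, the sketch below illustrates one way a latent model indicator can be combined with a Gibbs sampler that uses a Metropolis–Hastings step. It is not the authors' implementation: it restricts the candidate set to only two error laws (Gaussian and Student-t with fixed degrees of freedom, whereas the paper works with the full normal independent class, including the slash, via latent scale variables and penalised complexity priors), and the priors, tuning constants, and simulated data are illustrative assumptions.

```python
# Minimal illustrative sketch (not the paper's implementation): a discrete latent
# indicator over candidate error distributions plus a Gibbs sampler that alternates
# a random-walk Metropolis-Hastings update of the regression parameters with a
# draw of the model indicator from its full conditional.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated data with heavy-tailed errors, so the Student-t model should be favoured.
n, p = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + stats.t(df=3).rvs(size=n, random_state=rng)

models = ["normal", "student_t"]      # candidate error laws (slash omitted here)
prior_model = np.array([0.5, 0.5])    # uniform prior over candidate models
nu = 3.0                              # fixed t degrees of freedom (assumption)

def loglik(beta, log_sigma, model):
    """Log-likelihood of y under the given error model."""
    resid = y - X @ beta
    sigma = np.exp(log_sigma)
    if model == "normal":
        return stats.norm.logpdf(resid, scale=sigma).sum()
    return stats.t.logpdf(resid, df=nu, scale=sigma).sum()

def logprior(beta, log_sigma):
    """Vague priors: N(0, 10^2) on each coefficient and on log sigma."""
    return stats.norm.logpdf(beta, scale=10).sum() + stats.norm.logpdf(log_sigma, scale=10)

beta, log_sigma, z = np.zeros(p), 0.0, 0   # initial state
step = 0.05                                # hand-picked random-walk step size
model_counts = np.zeros(len(models))

for it in range(5000):
    # 1) Metropolis-Hastings update of (beta, log sigma) given the current model.
    prop_beta = beta + step * rng.normal(size=p)
    prop_ls = log_sigma + step * rng.normal()
    log_acc = (loglik(prop_beta, prop_ls, models[z]) + logprior(prop_beta, prop_ls)
               - loglik(beta, log_sigma, models[z]) - logprior(beta, log_sigma))
    if np.log(rng.uniform()) < log_acc:
        beta, log_sigma = prop_beta, prop_ls

    # 2) Gibbs update of the model indicator from its full conditional,
    #    proportional to prior probability times likelihood under each model.
    logw = np.log(prior_model) + np.array([loglik(beta, log_sigma, m) for m in models])
    w = np.exp(logw - logw.max())
    z = rng.choice(len(models), p=w / w.sum())

    if it >= 1000:                    # discard burn-in
        model_counts[z] += 1

# Empirical posterior model probabilities from the retained draws.
print(dict(zip(models, model_counts / model_counts.sum())))
```

The relative frequencies of the indicator draws estimate the posterior probability of each candidate model, which is the "simultaneous analysis" the abstract refers to; the paper's formulation additionally handles model-specific latent scale variables and the censored-regression extension.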

Citation

Flávio B. Gonçalves, Marcos O. Prates, Victor Hugo Lachos. "Robust Bayesian model selection for heavy-tailed linear regression using finite mixtures." Braz. J. Probab. Stat. 34(1): 51-70, February 2020. https://doi.org/10.1214/18-BJPS417

Information

Received: 1 September 2017; Accepted: 1 August 2018; Published: February 2020
First available in Project Euclid: 3 February 2020

zbMATH: 07200391
MathSciNet: MR4058970
Digital Object Identifier: 10.1214/18-BJPS417

Keywords: MCMC, penalised complexity priors, scale mixtures of normal, slash, Student-t

Rights: Copyright © 2020 Brazilian Statistical Association
