Open Access
High-dimensional Bayesian inference in nonparametric additive models
Zuofeng Shang, Ping Li
Electron. J. Statist. 8(2): 2804-2847 (2014). DOI: 10.1214/14-EJS963

Abstract

A fully Bayesian approach is proposed for ultrahigh-dimensional nonparametric additive models in which the number of additive components may exceed the sample size, although the true model is assumed to involve only a small number of components. Bayesian approaches can conduct stochastic model search and carry out flexible parameter estimation through stochastic draws. The theory shows that the proposed model selection method has satisfactory properties. For instance, when the hyperparameter associated with the model prior is correctly specified, the true model has posterior probability approaching one as the sample size goes to infinity; when this hyperparameter is incorrectly specified, the selected model remains acceptable, since it is shown to be asymptotically nested in the true model. To enhance model flexibility, two new $g$-priors are proposed and their theoretical performance is investigated. We also propose an efficient reversible jump MCMC algorithm to address the computational issues. Several simulation examples are provided to demonstrate the advantages of our method.
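For readers unfamiliar with the setup, the following is a minimal sketch of the nonparametric additive model and a Zellner-type group $g$-prior in generic notation; the paper's exact formulation, and its generalized hyper-$g$ and generalized Zellner-Siow priors, may differ in the details. The response is modeled as
\[
  y_i \;=\; \sum_{j=1}^{p} f_j(x_{ij}) \;+\; \epsilon_i, \qquad \epsilon_i \sim N(0, \sigma^2), \quad i = 1, \dots, n,
\]
where the number of components $p$ may exceed the sample size $n$ and only a small subset of the $f_j$ are nonzero. Expanding each candidate component in a spline basis, $f_j(x) \approx \sum_{k=1}^{m} \beta_{jk} B_{jk}(x)$ with basis matrix $B_j$, component selection becomes group selection of the coefficient vectors $\beta_j$, and a Zellner-type group $g$-prior takes the form
\[
  \beta_j \mid g, \sigma^2 \;\sim\; N\!\big(0,\; g\,\sigma^2 (B_j^\top B_j)^{-1}\big),
\]
where, in their classical hyper-$g$ and Zellner-Siow forms, such priors place a further prior on $g$ rather than fixing it.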

Citation


Zuofeng Shang, Ping Li. "High-dimensional Bayesian inference in nonparametric additive models." Electron. J. Statist. 8(2): 2804-2847, 2014. https://doi.org/10.1214/14-EJS963

Information

Published: 2014
First available in Project Euclid: 8 January 2015

zbMATH: 1348.62171
MathSciNet: MR3299123
Digital Object Identifier: 10.1214/14-EJS963

Subjects:
Primary: 62F25, 62G20
Secondary: 62F12, 62F15

Keywords: Bayesian group selection, generalized hyper-$g$ prior, generalized Zellner-Siow prior, nonparametric additive model, posterior model consistency, reversible jump MCMC, size-control prior, ultrahigh-dimensionality

Rights: Copyright © 2014 The Institute of Mathematical Statistics and the Bernoulli Society
