Electronic Journal of Statistics
Electron. J. Statist., Volume 4 (2010), 932-949.
MAP model selection in Gaussian regression
Felix Abramovich and Vadim Grinshtein
Abstract
We consider a Bayesian approach to model selection in Gaussian linear regression, where the number of predictors may be much larger than the number of observations. From a frequentist viewpoint, the proposed procedure amounts to penalized least squares estimation with a complexity penalty associated with a prior on the model size. We investigate the optimality properties of the resulting model selector. We establish an oracle inequality and specify conditions on the prior that imply its asymptotic minimaxity within a wide range of sparse and dense settings for "nearly-orthogonal" and "multicollinear" designs.
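The frequentist reading of the procedure, as the abstract describes it, is penalized least squares: over candidate models M, minimize RSS(M) plus a penalty that depends only on the model size |M|. The sketch below illustrates this generic idea with an exhaustive subset search; the RIC-type penalty 2σ²k·log p used here is only an illustrative stand-in, not the paper's specific prior-induced penalty.

```python
# Minimal sketch of size-penalized least squares model selection.
# NOT the authors' exact criterion: the penalty pen(k) = 2*sigma2*k*log(p)
# is an illustrative RIC-type choice standing in for a prior-induced penalty.
from itertools import combinations

import numpy as np


def select_model(X, y, pen):
    """Exhaustively search all subsets M, returning the one minimizing
    RSS(M) + pen(|M|). Feasible only for small p."""
    n, p = X.shape
    best, best_crit = (), float(y @ y) + pen(0)  # empty model: RSS = ||y||^2
    for k in range(1, p + 1):
        for M in combinations(range(p), k):
            XM = X[:, list(M)]
            beta_hat, rss, *_ = np.linalg.lstsq(XM, y, rcond=None)
            # lstsq returns an empty residual array when rank-deficient;
            # fall back to computing the RSS directly in that case.
            rss_val = float(rss[0]) if rss.size else float(
                np.sum((y - XM @ beta_hat) ** 2))
            crit = rss_val + pen(k)
            if crit < best_crit:
                best, best_crit = M, crit
    return best, best_crit


# Toy usage: noiseless data with true support {0, 2}, so the smallest
# model fitting the data exactly should be selected.
rng = np.random.default_rng(0)
n, p = 20, 6
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[[0, 2]] = 5.0
y = X @ beta

sigma2 = 1.0
pen = lambda k: 2.0 * sigma2 * k * np.log(p)  # illustrative complexity penalty
selected, crit = select_model(X, y, pen)
```

The exhaustive search makes the penalized-least-squares structure explicit; in practice, with p possibly much larger than n, the combinatorial search must be replaced by greedy or convex-relaxation approximations.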
Article information
Source
Electron. J. Statist. Volume 4 (2010), 932-949.
Dates
First available in Project Euclid: 24 September 2010
Permanent link to this document
http://projecteuclid.org/euclid.ejs/1285333752
Digital Object Identifier
doi:10.1214/10-EJS573
Mathematical Reviews number (MathSciNet)
MR2721039
Zentralblatt MATH identifier
1329.62051
Subjects
Primary: 62C99: None of the above, but in this section
Secondary: 62C10, 62C20, 62G05
Keywords
Adaptivity; complexity penalty; Gaussian linear regression; maximum a posteriori rule; minimax estimation; model selection; oracle inequality; sparsity
Citation
Abramovich, Felix; Grinshtein, Vadim. MAP model selection in Gaussian regression. Electron. J. Statist. 4 (2010), 932--949. doi:10.1214/10-EJS573. http://projecteuclid.org/euclid.ejs/1285333752.

