Electronic Journal of Statistics
- Electron. J. Statist.
- Volume 10, Number 1 (2016), 121-170.
Randomized maximum-contrast selection: Subagging for large-scale regression
We introduce a general method for variable selection in a large-scale regression setting where both the number of parameters and the number of samples are extremely large. The proposed method is based on a careful combination of penalized estimators, each applied to a random projection of the sample space into a low-dimensional space. In one special case that we study in detail, the random projections are divided into non-overlapping blocks, each consisting of only a small portion of the original data. Within each block we select the projection yielding the smallest out-of-sample error. Our random ensemble estimator then aggregates the results according to a new maximal-contrast voting scheme to determine the final selected set. Our theoretical results quantify how performance changes as the number of non-overlapping blocks increases. Moreover, we demonstrate that statistical optimality is retained alongside the computational speedup: the proposed method achieves the minimax rate for approximate recovery attained by estimators that use the full set of samples. Furthermore, our theoretical results allow the number of subsamples to grow with the subsample size and do not require the irrepresentable condition. The estimator is also compared empirically with several other popular high-dimensional estimators via an extensive simulation study, which reveals its excellent finite-sample performance.
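The block-wise selection and voting scheme described in the abstract can be sketched as follows. This is a minimal illustration under assumed simplifications, not the paper's exact procedure: the penalized estimator is a plain coordinate-descent lasso, the blocks are formed from random subsamples rather than the paper's specific projection construction, and the penalty level `lam`, block counts, and voting threshold `tau` are arbitrary choices for demonstration.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent lasso (illustrative, not optimized)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j, then soft-threshold.
            r = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_sq[j]
    return beta

def subagged_selection(X, y, n_blocks=5, subs_per_block=3, frac=0.3,
                       lam=10.0, tau=0.5, seed=0):
    """Subagging sketch: within each block, fit the lasso on several small
    subsamples, keep the support of the fit with the smallest out-of-sample
    error, and aggregate supports across blocks by majority voting."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    m = int(frac * n)  # subsample size (a small portion of the data)
    votes = np.zeros(p)
    for _ in range(n_blocks):
        best_err, best_support = np.inf, np.zeros(p, dtype=bool)
        for _ in range(subs_per_block):
            idx = rng.choice(n, size=m, replace=False)
            out = np.setdiff1d(np.arange(n), idx)
            beta = lasso_cd(X[idx], y[idx], lam)
            err = np.mean((y[out] - X[out] @ beta) ** 2)
            if err < best_err:
                best_err, best_support = err, beta != 0
        votes += best_support
    # A variable enters the final selected set if it wins a tau-fraction
    # of the block-level votes.
    return votes / n_blocks >= tau
```

On a simulated sparse linear model, the voting step tends to suppress spurious variables that appear in only a few subsample fits while retaining the strong signals, which is the intuition behind aggregating block-level supports rather than trusting any single subsample.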
Received: July 2015
First available in Project Euclid: 17 February 2016
Bradic, Jelena. Randomized maximum-contrast selection: Subagging for large-scale regression. Electron. J. Statist. 10 (2016), no. 1, 121--170. doi:10.1214/15-EJS1085. https://projecteuclid.org/euclid.ejs/1455715959.