Abstract
Several new methods have recently been proposed for performing valid inference after model selection. An older method is sample splitting: use part of the data for model selection and the rest for inference. In this paper, we revisit sample splitting combined with the bootstrap (or the Normal approximation). We show that this leads to a simple, assumption-lean approach to inference, and we establish results on the accuracy of the method. In fact, we find new bounds on the accuracy of the bootstrap and the Normal approximation for general nonlinear parameters with increasing dimension, which we then use to assess the accuracy of regression inference. We define new parameters that measure variable importance and that can be inferred with greater accuracy than the usual regression coefficients. Finally, we elucidate an inference-prediction trade-off: splitting increases the accuracy and robustness of inference but can decrease the accuracy of the predictions.
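To make the splitting idea concrete, the following is a minimal sketch of sample splitting combined with a percentile bootstrap: one half of the data is used for variable selection (here, illustratively, a lasso support) and the held-out half is used for refitting and bootstrap confidence intervals. The selection method, bootstrap variant, and all names are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch: sample splitting + bootstrap for post-selection inference.
# Lasso selection and percentile intervals are illustrative choices.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, d = 400, 20
X = rng.normal(size=(n, d))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(size=n)

# Split: first half for model (variable) selection, second half for inference.
X1, y1 = X[: n // 2], y[: n // 2]
X2, y2 = X[n // 2:], y[n // 2:]

# Selection step on the first half: keep the lasso support.
selected = np.flatnonzero(Lasso(alpha=0.1).fit(X1, y1).coef_ != 0)

# Inference step: refit on the held-out half and bootstrap the coefficients.
B = 1000
boot = np.empty((B, selected.size))
for b in range(B):
    idx = rng.integers(0, len(y2), len(y2))  # resample rows with replacement
    boot[b] = LinearRegression().fit(X2[idx][:, selected], y2[idx]).coef_

point = LinearRegression().fit(X2[:, selected], y2).coef_
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)  # 95% percentile intervals
for j, s in enumerate(selected):
    print(f"beta_{s}: {point[j]:.3f}  [{lo[j]:.3f}, {hi[j]:.3f}]")
```

Because selection and inference use disjoint halves of the data, the intervals on the held-out half do not need to account for the randomness of the selection step, which is what makes the approach assumption-lean.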
Citation
Alessandro Rinaldo, Larry Wasserman, Max G’Sell. "Bootstrapping and sample splitting for high-dimensional, assumption-lean inference." Ann. Statist. 47(6): 3438–3469, December 2019. https://doi.org/10.1214/18-AOS1784