Abstract
Smooth backfitting has been proposed, and proven to be, a powerful nonparametric estimation technique for additive regression models in various settings. Existing studies are restricted to cases with a moderate number of covariates and are not directly applicable to high-dimensional settings. In this paper, we develop new kernel estimators based on the idea of smooth backfitting for high-dimensional additive models. We introduce a novel penalization scheme that combines the idea of the functional Lasso with the smooth backfitting technique, and we investigate the theoretical properties of the resulting functional Lasso smooth backfitting estimator. For the implementation of the proposed method, we devise a simple iterative algorithm whose iteration is defined by a truncated projection operator; it adds only a thresholding step to the projection-based iteration of the smooth backfitting algorithm. We further present a debiased version of the proposed estimator with implementation details, and investigate its theoretical properties for statistical inference. We demonstrate the finite-sample performance of the methods via simulation and real data analysis.
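The algorithmic idea summarized above, a backfitting update followed by a thresholding step that can zero out an entire additive component, can be illustrated with a toy sketch. This is not the paper's actual estimator: it uses an ordinary (not smooth) backfitting loop with a Nadaraya-Watson smoother and group soft-thresholding of each component's empirical L2 norm, and the bandwidth `h` and penalty level `lam` are illustrative choices, not values from the paper.

```python
import numpy as np

def nw_smooth(x, y, h):
    """Nadaraya-Watson kernel smoother (Gaussian kernel), evaluated at the data points."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def backfit_lasso(X, y, h=0.1, lam=0.1, n_iter=30):
    """Backfitting with a group soft-thresholding step applied to each additive component."""
    n, p = X.shape
    F = np.zeros((n, p))                      # F[:, j] holds f_j evaluated at X[:, j]
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: remove all fitted components except the j-th
            resid = y - y.mean() - F.sum(axis=1) + F[:, j]
            fj = nw_smooth(X[:, j], resid, h)
            fj -= fj.mean()                   # center each component for identifiability
            norm = np.sqrt(np.mean(fj ** 2))  # empirical L2 norm of the component
            shrink = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
            F[:, j] = shrink * fj             # whole component set to zero when norm <= lam
    return F
```

On data where only the first two of many covariates are relevant, the thresholding step drives the irrelevant components to exactly zero while the relevant ones survive with shrunken norms, which is the sparsity behavior the functional Lasso penalty is designed to produce.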
Funding Statement
Eun Ryung Lee’s work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government(MSIT) (No. 2022R1A2C1012798).
Seyoung Park’s work was supported by the NRF grant funded by the MSIT (No. 2022R1A2C4002150).
Byeong U. Park’s research was supported by the NRF grant funded by the MSIT (No. RS-2024-00338150).
Acknowledgments
Contact: Byeong U. Park, bupark@snu.ac.kr, Seoul National University, Seoul, South Korea. The first two authors made equal contributions to writing this article.
Citation
Eun Ryung Lee, Seyoung Park, Enno Mammen, Byeong U. Park. "Efficient functional Lasso kernel smoothing for high-dimensional additive regression." Ann. Statist. 52 (4) 1741–1773, August 2024. https://doi.org/10.1214/24-AOS2415