We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the prediction risk in the general nonparametric regression model, as well as bounds on the ℓp estimation loss for 1≤p≤2 in the linear model when the number of variables can be much larger than the sample size.
Bickel, P. J., Ritov, Y. and Tsybakov, A. B. "Simultaneous analysis of Lasso and Dantzig selector." Ann. Statist. 37 (4), 1705-1732, August 2009. https://doi.org/10.1214/08-AOS620
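To make the setting concrete, the following is a minimal numerical sketch (not from the paper) of the Lasso in the high-dimensional linear model with p > n, solved by cyclic coordinate descent. The function names `lasso_cd` and `soft_threshold`, the synthetic data, and the regularization level `lam` of order sigma*sqrt(2 log(p)/n) are illustrative assumptions, chosen to mirror the sparsity scenario the abstract describes.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=500):
    """Lasso via cyclic coordinate descent (illustrative sketch):
    minimize (1/(2n)) * ||y - X b||_2^2 + lam * ||b||_1.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature ||X_j||^2 / n
    r = y - X @ b                       # current residual
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            r += X[:, j] * b[j]         # remove coordinate j from the fit
            rho = X[:, j] @ r / n       # partial correlation with residual
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]         # put the updated coordinate back
    return b

# Sparse high-dimensional example: n = 50 observations, p = 200 variables,
# only 3 nonzero coefficients, as in the "p much larger than n" regime.
rng = np.random.default_rng(0)
n, p, sigma = 50, 200, 0.1
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -3.0, 1.5]
y = X @ beta + sigma * rng.standard_normal(n)
lam = sigma * np.sqrt(2 * np.log(p) / n)  # classical universal-type choice
b_hat = lasso_cd(X, y, lam)
```

Despite p = 200 exceeding n = 50, the estimate `b_hat` concentrates on the three true coordinates; the oracle inequalities in the paper quantify this kind of behavior for the ℓp estimation loss, 1 ≤ p ≤ 2.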