False discovery rate control via debiased lasso
Adel Javanmard, Hamid Javadi
Electron. J. Statist. 13(1): 1212-1253 (2019). DOI: 10.1214/19-EJS1554
Abstract

We consider the problem of variable selection in high-dimensional statistical models, where the goal is to report a set of variables, out of many predictors $X_{1},\dotsc ,X_{p}$, that are relevant to a response of interest. For the high-dimensional linear model, where the number of parameters exceeds the number of samples $(p>n)$, we propose a procedure for variable selection and prove that it controls the directional false discovery rate (FDR) below a pre-assigned significance level $q\in [0,1]$. We further analyze the statistical power of our framework and show that for designs with subgaussian rows and a common precision matrix $\Omega \in{\mathbb{R}}^{p\times p}$, if the minimum nonzero parameter $\theta_{\min}$ satisfies \[\sqrt{n}\,\theta_{\min }-\sigma \sqrt{2\Big(\max_{i\in [p]}\Omega_{ii}\Big)\log \Big(\frac{2p}{qs_{0}}\Big)}\to \infty \,,\] then this procedure achieves asymptotic power one.
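The shape of the procedure can be sketched in a few lines: debias the lasso estimate, standardize each coordinate into an approximately standard normal statistic, and select all coordinates whose statistic exceeds the smallest threshold at which the estimated false discovery proportion is at most $q$. The Python sketch below follows this outline but is not the authors' exact construction: the decorrelating matrix `M` is a ridge-regularized inverse of the sample covariance (the paper builds `M` via a convex program), the noise level is a crude residual-based estimate, and the lasso tuning constant is illustrative.

```python
# Minimal sketch of directional FDR control via debiased lasso.
# Assumptions (not the paper's exact choices): ridge-inverse surrogate
# for M, residual-based noise estimate, illustrative lasso tuning.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import Lasso

def debiased_lasso_fdr(X, y, q=0.1, ridge=0.1):
    """Select variables (with signs) at target directional FDR level q."""
    n, p = X.shape
    lam = np.sqrt(2 * np.log(p) / n)                  # illustrative tuning rate
    theta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_
    resid = y - X @ theta_hat
    df = max(n - np.count_nonzero(theta_hat), 1)
    sigma_hat = np.linalg.norm(resid) / np.sqrt(df)   # crude noise estimate
    Sigma_hat = X.T @ X / n
    M = np.linalg.inv(Sigma_hat + ridge * np.eye(p))  # surrogate decorrelating matrix
    theta_d = theta_hat + M @ X.T @ resid / n         # debiased lasso estimator
    se = sigma_hat * np.sqrt(np.diag(M @ Sigma_hat @ M.T) / n)
    T = theta_d / se                                  # approx. N(0,1) under the null
    # smallest threshold whose estimated false discovery proportion is <= q
    t_star = np.sqrt(2 * np.log(p))                   # conservative fallback
    for t in np.sort(np.abs(T)):
        if 2 * p * norm.sf(t) / max(np.sum(np.abs(T) >= t), 1) <= q:
            t_star = t
            break
    selected = np.flatnonzero(np.abs(T) >= t_star)
    return selected, np.sign(theta_d[selected])       # directional decisions
```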

Our framework is built upon the debiasing approach and assumes the standard sparsity condition $s_{0}=o(\sqrt{n}/(\log p)^{2})$, where $s_{0}$ denotes the number of true positives among the $p$ features. Notably, this framework achieves exact directional FDR control without any assumption on the amplitude of the unknown regression parameters, and does not require any knowledge of the distribution of the covariates or the noise level. We test our method in synthetic and real data experiments to assess its performance and to corroborate our theoretical results.
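As a quick empirical sanity check in the spirit of the synthetic experiments, one can generate a sparse linear model and measure the directional false discovery proportion and power of the sketch above. The design size, sparsity, and signal amplitude below are our own illustrative choices, not the paper's experimental settings.

```python
# Hypothetical synthetic check of debiased_lasso_fdr above:
# i.i.d. Gaussian design, s0 active coefficients with random signs.
import numpy as np

rng = np.random.default_rng(0)
n, p, s0, sigma = 500, 1000, 10, 1.0
theta = np.zeros(p)
support = rng.choice(p, size=s0, replace=False)
theta[support] = 0.5 * rng.choice([-1.0, 1.0], size=s0)  # illustrative amplitude
X = rng.standard_normal((n, p))
y = X @ theta + sigma * rng.standard_normal(n)

selected, signs = debiased_lasso_fdr(X, y, q=0.1)
# a directional discovery is correct if the variable is active AND the sign matches
correct = (theta[selected] != 0) & (signs == np.sign(theta[selected]))
fdp = 1.0 - correct.mean() if selected.size else 0.0
power = correct.sum() / s0
print(f"discoveries: {selected.size}  directional FDP: {fdp:.2f}  power: {power:.2f}")
```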

Received: 1 October 2018; Published: 2019