Electronic Journal of Statistics

Model selection of hierarchically structured covariates using elastic net

Wenqian Qiao, Heng Lian, and Min-ge Xie

Full-text: Open access


Hierarchically associated covariates are common in many fields, and it is often of interest to incorporate their information into statistical inference. This paper proposes a novel way to explicitly integrate the information in a given hierarchical tree of covariates into high-dimensional model selection. Specifically, a set of hierarchical scores is introduced to quantify the hierarchical positions of the terminal nodes of the given tree, where a terminal node represents either a single covariate or a group of covariates. These scores are then used to weight the corresponding penalty terms in a model selection approach. We show that the proposed estimation approach has a hierarchical grouping property: two highly correlated covariates that are close to each other in the hierarchical tree are more likely to be included in or excluded from the model together than two that are far apart. We also prove model selection consistency of the proposed estimator, both between and within groups. The theoretical results are illustrated by simulation studies and a real data analysis of a Systemic Lupus Erythematosus (SLE) dataset.
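To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of a score-weighted elastic net: each terminal node receives a hypothetical score derived from its depth in an assumed covariate tree, and that score scales both the ℓ1 and ℓ2 penalty terms of the corresponding coefficient. The score definition, penalty parameterization, and proximal-gradient solver are all illustrative assumptions.

```python
import numpy as np

def hierarchical_scores(depths):
    # Hypothetical score: terminal nodes deeper in the covariate tree
    # receive smaller penalty weights. The paper's actual score
    # construction may differ.
    d = np.asarray(depths, dtype=float)
    return d.max() / d

def weighted_elastic_net(X, y, w, lam1=0.1, lam2=0.1, n_iter=1000):
    # ISTA (proximal gradient) for the weighted elastic-net objective
    #   (1/2n)||y - Xb||^2 + lam1 * sum_j w_j |b_j|
    #                      + (lam2/2) * sum_j w_j b_j^2
    n, p = X.shape
    b = np.zeros(p)
    # Lipschitz constant of the smooth part's gradient
    L = np.linalg.norm(X, 2) ** 2 / n + lam2 * w.max()
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n + lam2 * w * b
        z = b - grad / L
        # coordinate-wise soft-thresholding with score-weighted thresholds
        b = np.sign(z) * np.maximum(np.abs(z) - lam1 * w / L, 0.0)
    return b

# Toy usage: four covariates at tree depths 1, 2, 2, 3
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 4))
y = X @ np.array([2.0, 0.0, 0.0, 0.0])
w = hierarchical_scores([1, 2, 2, 3])
b = weighted_elastic_net(X, y, w, lam1=0.05, lam2=0.01)
```

Because the weights multiply both penalty components, covariates sharing a similar tree position face similar shrinkage, which is one informal way to see why nearby correlated covariates tend to enter or leave the model together.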

Article information

Electron. J. Statist., Volume 10, Number 2 (2016), 3775–3806.

Received: September 2015
First available in Project Euclid: 6 December 2016


Keywords: grouping property; hierarchical elastic net; hierarchical covariate tree; variable selection


Qiao, Wenqian; Lian, Heng; Xie, Min-ge. Model selection of hierarchically structured covariates using elastic net. Electron. J. Statist. 10 (2016), no. 2, 3775--3806. doi:10.1214/16-EJS1217. https://projecteuclid.org/euclid.ejs/1480993454



References

  • Beissbarth, T. and Speed, T. P. (2004), “GOstat: find statistically overrepresented Gene Ontologies within a group of genes,” Bioinformatics, 20, 1464–1465.
  • Bien, J., Taylor, J. and Tibshirani, R. (2013), “A LASSO for hierarchical interactions,” The Annals of Statistics, 41, 1111–1141.
  • Bondell, H. and Reich, B. (2008), “Simultaneous regression shrinkage, variable selection, and supervised clustering of predictors with OSCAR,” Biometrics, 64, 115–123.
  • Breiman, L., Friedman, J., Olshen, R. and Stone, C. (1984), Classification and Regression Trees, Wadsworth International Group.
  • Chaussabel, D., Quinn, C., Shen, J., Patel, P., Glaser, C., Baldwin, N., Stichweh, D., Blankenship, D., Li, L., Munagala, I. et al. (2008), “A modular analysis framework for blood genomic studies: application to systemic lupus erythematosus,” Immunity, 29, 150–164.
  • Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004), “Least angle regression,” The Annals of Statistics, 32, 407–499.
  • Fan, J. and Li, R. (2001), “Variable selection via nonconcave penalized likelihood and its oracle properties,” Journal of the American Statistical Association, 96, 1348–1360.
  • Fan, J. and Lv, J. (2008), “Sure independence screening for ultrahigh dimensional feature space,” Journal of the Royal Statistical Society: Series B, 70, 849–911.
  • Fan, J. and Lv, J. (2011), “Nonconcave penalized likelihood with NP-dimensionality,” IEEE Transactions on Information Theory, 57, 5467–5484.
  • Frank, I. E. and Friedman, J. H. (1993), “A statistical view of some chemometrics regression tools,” Technometrics, 35, 109–148.
  • Huang, J., Breheny, P., Ma, S. and Zhang, C.-H. (2010), “The Mnet method for variable selection,” Technical Report No. 402, Department of Statistics and Actuarial Science, University of Iowa.
  • Huang, J., Ma, S., Li, H. and Zhang, C. (2011), “The sparse Laplacian shrinkage estimator for high-dimensional regression,” The Annals of Statistics, 39, 2021–2046.
  • Huang, J., Ma, S., Xie, H. and Zhang, C. (2009), “A group bridge approach for variable selection,” Biometrika, 96, 339–355.
  • Huang, J., Zhang, T. and Metaxas, D. (2011), “Learning with structured sparsity,” Journal of Machine Learning Research, 12, 3371–3412.
  • Jenatton, R., Audibert, J. and Bach, F. (2011), “Structured variable selection with sparsity-inducing norms,” Journal of Machine Learning Research, 12, 2777–2824.
  • Jia, J. and Yu, B. (2010), “On model selection consistency of the elastic net when $p\gg n$,” Statistica Sinica, 20, 595–611.
  • Lv, J. and Fan, Y. (2009), “A unified approach to model selection and sparse recovery using regularized least squares,” The Annals of Statistics, 37, 3498–3528.
  • Nei, M. (1973), “Analysis of gene diversity in subdivided populations,” PNAS, 70, 3321–3323.
  • Obozinski, G., Wainwright, M. and Jordan, M. (2011), “Support union recovery in high-dimensional multivariate regression,” The Annals of Statistics, 39, 1–47.
  • Simon, N., Friedman, J., Hastie, T. and Tibshirani, R. (2013), “A sparse-group lasso,” Journal of Computational and Graphical Statistics, 22, 231–245.
  • Tibshirani, R. (1996), “Regression shrinkage and selection via the lasso,” Journal of the Royal Statistical Society: Series B, 58, 267–288.
  • Wang, S., Nan, B., Zhou, N. and Zhu, J. (2009), “Hierarchically penalized Cox regression with grouped variables,” Biometrika, 96, 307–322.
  • Yuan, M. and Lin, Y. (2006), “Model selection and estimation in regression with grouped variables,” Journal of the Royal Statistical Society: Series B, 68, 49–67.
  • Yuan, M., Joseph, V. and Zou, H. (2009), “Structured variable selection and estimation,” The Annals of Applied Statistics, 3, 1738–1757.
  • Zhang, C. (2010), “Nearly unbiased variable selection under minimax concave penalty,” The Annals of Statistics, 38, 894–942.
  • Zhao, P., Rocha, G. and Yu, B. (2009), “The composite absolute penalties family for grouped and hierarchical variable selection,” The Annals of Statistics, 37, 3468–3497.
  • Zou, H. and Hastie, T. (2005), “Regularization and variable selection via the elastic net,” Journal of the Royal Statistical Society: Series B, 67, 301–320.