## The Annals of Statistics

### Variable selection for general index models via sliced inverse regression

#### Abstract

Variable selection, also known as feature selection in machine learning, plays an important role in modeling high dimensional data and is key to data-driven scientific discoveries. We consider here the problem of detecting influential variables under the general index model, in which the response depends on predictors through an unknown function of one or more of their linear combinations. Instead of building a predictive model of the response given combinations of predictors, we model the conditional distribution of predictors given the response. This inverse modeling perspective motivates us to propose a stepwise procedure based on likelihood-ratio tests, which is effective and computationally efficient in identifying important variables without specifying a parametric relationship between predictors and the response. For example, the proposed procedure is able to detect variables with pairwise, three-way or even higher-order interactions among $p$ predictors with a computational time of $O(p)$ instead of $O(p^{k})$ (with $k$ being the highest order of interactions). Its excellent empirical performance in comparison with existing methods is demonstrated through simulation studies as well as real data examples. Consistency of the variable selection procedure is established when both the number of predictors and the sample size go to infinity.
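For context, the inverse-regression machinery underlying the procedure is that of sliced inverse regression (Li, 1991): standardize the predictors, partition observations into slices by the response, and extract the leading eigenvectors of the between-slice covariance of slice means. The sketch below is a minimal illustration of classical SIR only, not the authors' stepwise likelihood-ratio selection; the function name `sir_directions` and its parameters are illustrative.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Minimal sliced inverse regression: estimate index directions.

    Illustrative sketch of Li (1991), not the paper's selection procedure.
    """
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice observations by the order statistics of the response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    _, v = np.linalg.eigh(M)
    dirs = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return dirs / np.linalg.norm(dirs, axis=0)
```

With a single-index model such as $y = (\beta^{\top}x)^3 + \varepsilon$, the first estimated direction should align closely with the true $\beta$; eigenvalues of the slice-mean covariance indicate how many directions are informative.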

#### Article information

Source
Ann. Statist., Volume 42, Number 5 (2014), 1751–1786.

Dates
First available in Project Euclid: 11 September 2014

https://projecteuclid.org/euclid.aos/1410440624

Digital Object Identifier
doi:10.1214/14-AOS1233

Mathematical Reviews number (MathSciNet)
MR3262467

Zentralblatt MATH identifier
1305.62234

#### Citation

Jiang, Bo; Liu, Jun S. Variable selection for general index models via sliced inverse regression. Ann. Statist. 42 (2014), no. 5, 1751--1786. doi:10.1214/14-AOS1233. https://projecteuclid.org/euclid.aos/1410440624

#### References

• Bien, J., Taylor, J. and Tibshirani, R. (2013). A LASSO for hierarchical interactions. Ann. Statist. 41 1111–1141.
• Chen, C.-H. and Li, K.-C. (1998). Can SIR be as popular as multiple linear regression? Statist. Sinica 8 289–316.
• Chen, X., Xu, H., Yuan, P., Fang, F., Huss, M., Vega, V. B., Wong, E., Orlov, Y. L., Zhang, W., Jiang, J. et al. (2008). Integration of external signaling pathways with the core transcriptional network in embryonic stem cells. Cell 133 1106–1117.
• Cloonan, N., Forrest, A. R., Kolle, G., Gardiner, B. B., Faulkner, G. J., Brown, M. K., Taylor, D. F., Steptoe, A. L., Wani, S., Bethel, G. et al. (2008). Stem cell transcriptome profiling via massive-scale mRNA sequencing. Nature Methods 5 613–619.
• Cook, R. D. (2004). Testing predictor contributions in sufficient dimension reduction. Ann. Statist. 32 1062–1092.
• Cook, R. D. (2007). Fisher lecture: Dimension reduction in regression. Statist. Sci. 22 1–26.
• Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression. Ann. Statist. 32 407–499.
• Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348–1360.
• Fan, J. and Lv, J. (2008). Sure independence screening for ultrahigh dimensional feature space. J. R. Stat. Soc. Ser. B Stat. Methodol. 70 849–911.
• Friedman, J., Hastie, T., Höfling, H. and Tibshirani, R. (2007). Pathwise coordinate optimization. Ann. Appl. Stat. 1 302–332.
• Golub, T. R., Slonim, D. K., Tamayo, P., Huard, C., Gaasenbeek, M., Mesirov, J. P., Coller, H., Loh, M. L., Downing, J. R., Caligiuri, M. A. et al. (1999). Molecular classification of cancer: Class discovery and class prediction by gene expression monitoring. Science 286 531–537.
• Jiang, B. and Liu, J. S. (2014). Supplement to “Variable selection for general index models via sliced inverse regression.” DOI:10.1214/14-AOS1233SUPP.
• Li, K.-C. (1991). Sliced inverse regression for dimension reduction. J. Amer. Statist. Assoc. 86 316–342.
• Li, L. (2007). Sparse sufficient dimension reduction. Biometrika 94 603–613.
• Li, L., Cook, R. D. and Nachtsheim, C. J. (2005). Model-free variable selection. J. R. Stat. Soc. Ser. B Stat. Methodol. 67 285–299.
• Li, R., Zhong, W. and Zhu, L. (2012). Feature screening via distance correlation learning. J. Amer. Statist. Assoc. 107 1129–1139.
• Miller, A. J. (1984). Selection of subsets of regression variables. J. Roy. Statist. Soc. Ser. A 147 389–425.
• Murphy, T. B., Dean, N. and Raftery, A. E. (2010). Variable selection and updating in model-based discriminant analysis for high dimensional data with food authenticity applications. Ann. Appl. Stat. 4 396–421.
• Ouyang, Z., Zhou, Q. and Wong, W. H. (2009). ChIP-Seq of transcription factors predicts absolute and differential gene expression in embryonic stem cells. Proc. Natl. Acad. Sci. USA 106 21521–21526.
• Ravikumar, P., Lafferty, J., Liu, H. and Wasserman, L. (2009). Sparse additive models. J. R. Stat. Soc. Ser. B Stat. Methodol. 71 1009–1030.
• Simon, N. and Tibshirani, R. (2012). A permutation approach to testing interactions in many dimensions. Preprint. Available at arXiv:1206.6519.
• Szretter, M. E. and Yohai, V. J. (2009). The sliced inverse regression algorithm as a maximum likelihood procedure. J. Statist. Plann. Inference 139 3570–3578.
• Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. R. Stat. Soc. Ser. B Stat. Methodol. 58 267–288.
• Tibshirani, R., Hastie, T., Narasimhan, B. and Chu, G. (2002). Diagnosis of multiple cancer types by shrunken centroids of gene expression. Proc. Natl. Acad. Sci. USA 99 6567–6572.
• Zhang, Y. and Liu, J. S. (2007). Bayesian inference of epistatic interactions in case–control studies. Nat. Genet. 39 1167–1173.
• Zhong, W., Zeng, P., Ma, P., Liu, J. S. and Zhu, Y. (2005). RSIR: Regularized sliced inverse regression for motif discovery. Bioinformatics 21 4169–4175.
• Zhong, W., Zhang, T., Zhu, Y. and Liu, J. S. (2012). Correlation pursuit: Forward stepwise variable selection for index models. J. R. Stat. Soc. Ser. B Stat. Methodol. 74 849–870.
• Zou, H. (2006). The adaptive lasso and its oracle properties. J. Amer. Statist. Assoc. 101 1418–1429.

#### Supplemental materials

• Supplementary material: Supplement to “Variable selection for general index models via sliced inverse regression”. The supplement contains detailed proofs and additional simulation results.