## The Annals of Statistics

### QUADRO: A supervised dimension reduction method via Rayleigh quotient optimization

#### Abstract

We propose a novel Rayleigh quotient based sparse quadratic dimension reduction method—named QUADRO (Quadratic Dimension Reduction via Rayleigh Optimization)—for analyzing high-dimensional data. Unlike in the linear setting, where Rayleigh quotient optimization coincides with classification, the two problems are very different in nonlinear settings. In this paper, we clarify this difference and show that Rayleigh quotient optimization may be of independent scientific interest. One major challenge of Rayleigh quotient optimization is that the variance of quadratic statistics involves all fourth cross-moments of the predictors, which are infeasible to compute in high-dimensional applications and may accumulate too many stochastic errors. This issue is resolved by considering a family of elliptical models. Moreover, for heavy-tailed distributions, robust estimates of mean vectors and covariance matrices are employed to guarantee uniform convergence in estimating nonpolynomially many parameters, even though only the fourth moments are assumed. Methodologically, QUADRO is based on elliptical models, which allow us to formulate the Rayleigh quotient maximization as a convex optimization problem. Computationally, we propose an efficient linearized augmented Lagrangian method to solve the constrained optimization problem. Theoretically, we provide explicit rates of convergence in terms of the Rayleigh quotient under both Gaussian and general elliptical models. Thorough numerical results on both synthetic and real datasets are also provided to back up our theoretical results.
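To make the linear-setting remark in the abstract concrete, the following is a minimal numerical sketch (not the QUADRO algorithm itself, and all names below are illustrative): for a linear projection f(x) = wᵀx and two classes, the Rayleigh quotient Rq(w) = Var(E[f(X)|Y]) / E[Var(f(X)|Y)] reduces to the Fisher criterion, so its maximizer coincides with Fisher's LDA direction w ∝ Σ⁻¹(μ₁ − μ₂). It is only for quadratic f that Rayleigh quotient optimization and classification diverge, which is the regime QUADRO addresses.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
mu1, mu2 = np.zeros(d), 0.8 * np.ones(d)
Sigma = 0.5 * np.eye(d) + 0.2 * np.ones((d, d))  # shared class covariance

def rayleigh_quotient(w, mu1, mu2, Sigma, pi1=0.5):
    """Population Rayleigh quotient of the linear projection w'x under a
    two-class model with common covariance and mixing weight pi1."""
    pi2 = 1.0 - pi1
    between = pi1 * pi2 * (w @ (mu1 - mu2)) ** 2  # Var(E[w'X | Y])
    within = w @ Sigma @ w                        # E[Var(w'X | Y)]
    return between / within

w_fisher = np.linalg.solve(Sigma, mu1 - mu2)  # Fisher/LDA direction
w_random = rng.standard_normal(d)             # arbitrary competitor

rq_fisher = rayleigh_quotient(w_fisher, mu1, mu2, Sigma)
rq_random = rayleigh_quotient(w_random, mu1, mu2, Sigma)
assert rq_fisher >= rq_random  # the Fisher direction maximizes Rq over all w
```

The final assertion holds for any competitor w, by the Cauchy–Schwarz inequality in the Σ inner product; this is the sense in which linear Rayleigh quotient optimization "coincides with classification."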

#### Article information

Source
Ann. Statist. Volume 43, Number 4 (2015), 1498–1534.

Dates
Revised: December 2014
First available in Project Euclid: 17 June 2015

https://projecteuclid.org/euclid.aos/1434546213

Digital Object Identifier
doi:10.1214/14-AOS1307

Mathematical Reviews number (MathSciNet)
MR3357869

Zentralblatt MATH identifier
1317.62054

#### Citation

Fan, Jianqing; Ke, Zheng Tracy; Liu, Han; Xia, Lucy. QUADRO: A supervised dimension reduction method via Rayleigh quotient optimization. Ann. Statist. 43 (2015), no. 4, 1498--1534. doi:10.1214/14-AOS1307. https://projecteuclid.org/euclid.aos/1434546213

#### References

• Bickel, P. J., Ritov, Y. and Tsybakov, A. B. (2009). Simultaneous analysis of lasso and Dantzig selector. Ann. Statist. 37 1705–1732.
• Bindea, G., Mlecnik, B., Hackl, H., Charoentong, P., Tosolini, M., Kirilovsky, A., Fridman, W.-H., Pagès, F., Trajanoski, Z. and Galon, J. (2009). ClueGO: A cytoscape plug-in to decipher functionally grouped gene ontology and pathway annotation networks. Bioinformatics 25 1091–1093.
• Cai, T. and Liu, W. (2011). A direct estimation approach to sparse linear discriminant analysis. J. Amer. Statist. Assoc. 106 1566–1577.
• Cai, T., Liu, W. and Luo, X. (2011). A constrained $\ell_{1}$ minimization approach to sparse precision matrix estimation. J. Amer. Statist. Assoc. 106 594–607.
• Catoni, O. (2012). Challenging the empirical mean and empirical variance: A deviation study. Ann. Inst. Henri Poincaré Probab. Stat. 48 1148–1185.
• Chen, X., Zou, C. and Cook, R. D. (2010). Coordinate-independent sparse sufficient dimension reduction and variable selection. Ann. Statist. 38 3696–3723.
• Cook, R. D. and Weisberg, S. (1991). Comment on “Sliced inverse regression for dimension reduction.” J. Amer. Statist. Assoc. 86 328–332.
• Coudret, R., Liquet, B. and Saracco, J. (2014). Comparison of sliced inverse regression approaches for underdetermined cases. J. SFdS 155 72–96.
• Fan, J. and Fan, Y. (2008). High-dimensional classification using features annealed independence rules. Ann. Statist. 36 2605–2637.
• Fan, J., Feng, Y. and Tong, X. (2012). A road to classification in high dimensional space. J. R. Stat. Soc. Ser. B. Stat. Methodol. 74 745–771.
• Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348–1360.
• Fan, J., Xue, L. and Zou, H. (2014). Strong oracle optimality of folded concave penalized estimation. Ann. Statist. 42 819–849.
• Fan, J., Ke, Z. T., Liu, H. and Xia, L. (2015). Supplement to “QUADRO: A supervised dimension reduction method via Rayleigh quotient optimization.” DOI:10.1214/14-AOS1307SUPP.
• Feller, W. (1966). An Introduction to Probability Theory and Its Applications. Vol. II. Wiley, New York.
• Fisher, R. A. (1936). The use of multiple measurements in taxonomic problems. Annals of Eugenics 7 179–188.
• Friedman, J. H. (1989). Regularized discriminant analysis. J. Amer. Statist. Assoc. 84 165–175.
• Guo, Y., Hastie, T. and Tibshirani, R. (2005). Regularized discriminant analysis and its application in microarrays. Biostatistics 1 1–18.
• Han, F. and Liu, H. (2012). Transelliptical component analysis. Adv. Neural Inf. Process. Syst. 25 368–376.
• Han, F., Zhao, T. and Liu, H. (2013). CODA: High dimensional copula discriminant analysis. J. Mach. Learn. Res. 14 629–671.
• Jiang, B. and Liu, J. S. (2013). Sliced inverse regression with variable selection and interaction detection. Preprint. Available at arXiv:1304.4056.
• Kendall, M. G. (1938). A new measure of rank correlation. Biometrika 30 81–93.
• Kent, J. T. (1991). Discussion of Li (1991). J. Amer. Statist. Assoc. 86 336–337.
• Li, K.-C. (1991). Sliced inverse regression for dimension reduction. J. Amer. Statist. Assoc. 86 316–342.
• Li, K.-C. (2000). High dimensional data analysis via the SIR/PHD approach. Lecture notes, Dept. Statistics, UCLA, Los Angeles, CA. Available at http://www.stat.ucla.edu/~kcli/sir-PHD.pdf.
• Li, B. and Wang, S. (2007). On directional regression for dimension reduction. J. Amer. Statist. Assoc. 102 997–1008.
• Li, L. and Yin, X. (2008). Sliced inverse regression with regularizations. Biometrics 64 124–131.
• Liu, H., Han, F., Yuan, M., Lafferty, J. and Wasserman, L. (2012). High-dimensional semiparametric Gaussian copula graphical models. Ann. Statist. 40 2293–2326.
• Luparello, C. (2013). Aspects of collagen changes in breast cancer. J. Carcinogene Mutagene S13:007. DOI:10.4172/2157-2518.S13-007.
• Maruyama, Y. and Seo, T. (2003). Estimation of moment parameter in elliptical distributions. J. Japan Statist. Soc. 33 215–229.
• Shao, J., Wang, Y., Deng, X. and Wang, S. (2011). Sparse linear discriminant analysis by thresholding for high dimensional data. Ann. Statist. 39 1241–1265.
• Wei, Z. and Li, H. (2007). A Markov random field model for network-based analysis of genomic data. Bioinformatics 23 1537–1544.
• Witten, D. M. and Tibshirani, R. (2011). Penalized classification using Fisher’s linear discriminant. J. R. Stat. Soc. Ser. B. Stat. Methodol. 73 753–772.
• Wu, H.-M. (2008). Kernel sliced inverse regression with applications to classification. J. Comput. Graph. Statist. 17 590–610.
• Zhao, T., Roeder, K. and Liu, H. (2013). Positive semidefinite rank-based correlation matrix estimation with application to semiparametric graph estimation. Unpublished manuscript.
• Zhao, P. and Yu, B. (2006). On model selection consistency of Lasso. J. Mach. Learn. Res. 7 2541–2563.
• Zhong, W., Zeng, P., Ma, P., Liu, J. S. and Zhu, Y. (2005). RSIR: Regularized sliced inverse regression for motif discovery. Bioinformatics 21 4169–4175.
• Zou, H. (2006). The adaptive lasso and its oracle properties. J. Amer. Statist. Assoc. 101 1418–1429.
• Zou, H. and Li, R. (2008). One-step sparse estimates in nonconcave penalized likelihood models. Ann. Statist. 36 1509–1533.

#### Supplemental materials

• Supplement to “QUADRO: A supervised dimension reduction method via Rayleigh quotient optimization”. Owing to space constraints, the numerical tables for the simulations and some of the technical proofs are relegated to a supplementary document, which contains the proofs of Propositions 2.1, 5.1 and 6.2.