Open Access
Sparse CCA: Adaptive estimation and computational barriers
Chao Gao, Zongming Ma, Harrison H. Zhou
Ann. Statist. 45(5): 2074-2101 (October 2017). DOI: 10.1214/16-AOS1519

Abstract

Canonical correlation analysis is a classical technique for exploring the relationship between two sets of variables. It has important applications in analyzing high-dimensional datasets originating from genomics, imaging and other fields. This paper considers adaptive minimax and computationally tractable estimation of leading sparse canonical coefficient vectors in high dimensions. Under a Gaussian canonical pair model, we first establish separate minimax estimation rates for the canonical coefficient vectors of each set of random variables, with no structural assumption on the marginal covariance matrices. Second, we propose a computationally feasible estimator that attains the optimal rates adaptively under an additional sample size condition. Finally, we show that a sample size condition of this kind is needed for any randomized polynomial-time estimator to be consistent, assuming hardness of certain instances of the planted clique detection problem. As a byproduct, we obtain the first computational lower bounds for sparse PCA under the Gaussian single spiked covariance model.
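For readers unfamiliar with the underlying technique, the following is a minimal sketch of *classical* (non-sparse) CCA, which the paper's sparse estimator generalizes: whiten each block of variables by its sample covariance and take the leading singular pair of the whitened cross-covariance. This illustrates what "canonical coefficient vectors" are; it is not the paper's adaptive sparse estimator, and the simulated data and dimensions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 500, 5, 4  # sample size and dimensions (illustrative only)

# Simulate a correlated Gaussian pair (X, Y) sharing one latent factor.
z = rng.standard_normal((n, 1))
X = z @ rng.standard_normal((1, p)) + rng.standard_normal((n, p))
Y = z @ rng.standard_normal((1, q)) + rng.standard_normal((n, q))

# Center and form sample covariance blocks.
Xc, Yc = X - X.mean(0), Y - Y.mean(0)
Sxx = Xc.T @ Xc / n
Syy = Yc.T @ Yc / n
Sxy = Xc.T @ Yc / n

def inv_sqrt(S):
    """Inverse symmetric square root via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

# Classical CCA: SVD of the whitened cross-covariance matrix.
K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
U, s, Vt = np.linalg.svd(K)

# Leading canonical coefficient vectors and first canonical correlation.
a = inv_sqrt(Sxx) @ U[:, 0]
b = inv_sqrt(Syy) @ Vt[0, :]
rho = s[0]  # lies in [0, 1]
```

In the high-dimensional regime studied in the paper, the sample covariances above are no longer invertible, which is exactly why sparsity assumptions on `a` and `b` and a different, convex-programming-based estimator become necessary.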

Citation


Chao Gao, Zongming Ma, Harrison H. Zhou. "Sparse CCA: Adaptive estimation and computational barriers." Ann. Statist. 45(5): 2074–2101, October 2017. https://doi.org/10.1214/16-AOS1519

Information

Received: 1 August 2015; Revised: 1 September 2016; Published: October 2017
First available in Project Euclid: 31 October 2017

zbMATH: 06821119
MathSciNet: MR3718162
Digital Object Identifier: 10.1214/16-AOS1519

Subjects:
Primary: 62H12
Secondary: 62C20

Keywords: computational complexity, convex programming, group-Lasso, minimax rates, planted clique, sparse CCA (SCCA), sparse PCA (SPCA)

Rights: Copyright © 2017 Institute of Mathematical Statistics

Journal article, 28 pages
