Abstract
We study in this paper the computational and statistical boundaries for submatrix localization. Given one observation of a signal submatrix (or multiple nonoverlapping submatrices) of magnitude $\lambda$ and size $k_{m}\times k_{n}$ embedded in a large noise matrix of size $m\times n$, the goal is to optimally identify the support of the signal submatrix, both computationally and statistically.
Two transition thresholds for the signal-to-noise ratio $\lambda/\sigma$ are established in terms of $m$, $n$, $k_{m}$ and $k_{n}$. The first threshold, $\sf SNR_{c}$, corresponds to the computational boundary. We introduce a new linear-time spectral algorithm that identifies the submatrix with high probability when the signal strength is above the threshold $\sf SNR_{c}$. Below this threshold, it is shown that no polynomial-time algorithm can succeed in identifying the submatrix, under the hidden clique hypothesis. The second threshold, $\sf SNR_{s}$, captures the statistical boundary, below which no method can succeed in localization with probability going to one in the minimax sense. Above this threshold, the exhaustive search method successfully finds the submatrix. In marked contrast to submatrix detection and sparse PCA, the results exhibit an interesting phenomenon: $\sf SNR_{c}$ is always significantly larger than $\sf SNR_{s}$ under the sub-Gaussian error model, which implies an essential gap between statistical optimality and computational efficiency for submatrix localization.
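To make the spectral idea concrete, the following is a minimal sketch (in Python) of spectral submatrix localization: estimate the row and column support from the top singular vectors of the observation and keep the $k_{m}$ and $k_{n}$ largest-magnitude coordinates. This is an illustrative assumption-laden sketch, not the paper's actual procedure; in particular, the full SVD used here is not linear time, whereas the algorithm introduced in the paper is.

```python
import numpy as np

def spectral_localize(X, k_m, k_n):
    """Hypothetical sketch of spectral submatrix localization.

    Given an observed matrix X (a signal block of size k_m x k_n plus noise),
    estimate the row and column support of the block from the top left and
    right singular vectors, declaring the coordinates with the largest
    magnitudes to belong to the signal submatrix. Illustrative only; it is
    not the exact linear-time algorithm of Cai, Liang and Rakhlin (2017).
    """
    # Top singular vectors of the observation.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    u, v = U[:, 0], Vt[0, :]
    # Rows/columns with the largest loadings (in magnitude) form the estimate.
    rows = np.argsort(-np.abs(u))[:k_m]
    cols = np.argsort(-np.abs(v))[:k_n]
    return np.sort(rows), np.sort(cols)

# Toy example: a k_m x k_n block of mean lambda embedded in Gaussian noise.
rng = np.random.default_rng(0)
m, n, k_m, k_n, lam = 200, 300, 20, 30, 1.0
X = rng.standard_normal((m, n))
X[:k_m, :k_n] += lam
rows, cols = spectral_localize(X, k_m, k_n)
print(rows, cols)
```

In this toy setting the planted block occupies the first $k_{m}$ rows and $k_{n}$ columns, so successful localization corresponds to recovering exactly those index sets; how large $\lambda/\sigma$ must be for this to happen is precisely what the thresholds $\sf SNR_{c}$ and $\sf SNR_{s}$ quantify.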
Citation
T. Tony Cai, Tengyuan Liang, Alexander Rakhlin. "Computational and statistical boundaries for submatrix localization in a large noisy matrix." Ann. Statist. 45(4): 1403–1430, August 2017. https://doi.org/10.1214/16-AOS1488