Open Access
Eigen structure of a new class of covariance and inverse covariance matrices
Heather Battey
Bernoulli 23(4B): 3166-3177 (November 2017). DOI: 10.3150/16-BEJ840


There is a one-to-one mapping between a $p$-dimensional strictly positive definite covariance matrix $\Sigma$ and its matrix logarithm $L$. We exploit this relationship to study the structure induced on $\Sigma$ through a sparsity constraint on $L$. Consider $L$ as a random matrix generated through a basis expansion, with the support of the basis coefficients taken as a simple random sample of size $s=s^{*}$ from the index set $[p(p+1)/2]=\{1,\ldots,p(p+1)/2\}$. We find that the expected number of non-unit eigenvalues of $\Sigma$, denoted $\mathbb{E}[|\mathcal{A}|]$, is approximated with near perfect accuracy by the solution in $d$ of the equation

\[\frac{4p+p(p-1)}{2(p+1)}\left[\log\left(\frac{p}{p-d}\right)-\frac{d}{2p(p-d)}\right]-s^{*}=0.\] Furthermore, the corresponding eigenvectors are shown to possess only ${p-|\mathcal{A}^{c}|}$ non-zero entries. We use this result to elucidate the precise structure induced on $\Sigma$ and $\Sigma^{-1}$. We demonstrate that a positive definite symmetric matrix whose matrix logarithm is sparse is significantly less sparse in the original domain. This finding has important implications in high-dimensional statistics, where structure must be exploited in order to construct consistent estimators in non-trivial norms. An estimator exploiting the structure of the proposed class is presented.
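The approximation above can be evaluated numerically. The following sketch (not from the paper; the function name and the scan-then-bisect strategy are illustrative choices) solves the displayed equation for $d$ by scanning for a sign change on $(0, p)$ and bisecting, using only the standard library:

```python
import math


def expected_nonunit_eigencount(p, s_star, tol=1e-10):
    """Solve the abstract's equation in d, approximating E[|A|]:
    the expected number of non-unit eigenvalues of Sigma, given
    dimension p and matrix-logarithm sparsity level s*.
    Illustrative sketch; naming and root-finding method are assumptions."""
    c = (4 * p + p * (p - 1)) / (2 * (p + 1))

    def f(d):
        return c * (math.log(p / (p - d)) - d / (2 * p * (p - d))) - s_star

    # f(0) = -s* < 0, and f eventually turns negative again as d -> p,
    # so scan upward for the first sign change, then bisect on that bracket.
    lo, hi = 0.0, None
    for k in range(1, 1000):
        d = p * k / 1000
        if f(d) > 0:
            hi = d
            break
        lo = d
    if hi is None:
        raise ValueError("no root found in (0, p)")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

For example, with $p = 100$ and $s^{*} = 50$ the solution is roughly $d \approx 62.8$, consistent with the paper's point that sparsity in the matrix logarithm translates into considerably less sparsity (more non-unit eigenvalues) in the original domain.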




Received: 1 December 2015; Revised: 1 March 2016; Published: November 2017
First available in Project Euclid: 23 May 2017

zbMATH: 06778282
MathSciNet: MR3654802
Digital Object Identifier: 10.3150/16-BEJ840

Keywords: covariance matrix, matrix logarithm, precision matrix, spectral theory

Rights: Copyright © 2017 Bernoulli Society for Mathematical Statistics and Probability
