Open Access
December, 1968
On the Distribution of Some Statistics Useful in the Analysis of Jointly Stationary Time Series
Grace Wahba
Ann. Math. Statist. 39(6): 1849-1862 (December, 1968). DOI: 10.1214/aoms/1177698017

Abstract

Let $\{X(t), t = \cdots -1, 0, 1, \cdots\}$ be a $P$ dimensional zero mean stationary Gaussian time series, $X(t) = \begin{pmatrix}X_1(t)\\X_2(t)\\\vdots\\X_P(t)\end{pmatrix}$. We let $R(\tau) = EX(t)X'(t + \tau)$, where $R(\tau) = \{R_{ij}(\tau), i,j = 1, 2, \cdots P\}$, and $F(\omega) = (2\pi)^{-1} \sum^\infty_{\tau=-\infty}e^{-i\omega\tau}R(\tau)$. It is assumed that $\sum^P_{i,j=1} \sum^\infty_{\tau=-\infty} |\tau| |R_{ij}(\tau)| < \infty$, and hence $F(\omega)$ exists and its elements possess bounded derivatives. It is further assumed that $F(\omega)$ is strictly positive definite for all $\omega$. Knowledge of $F(\omega)$ serves to specify the process. $F(\omega)$, and $S$, the covariance matrix of $x = \begin{pmatrix}x_1\\x_2\\\vdots\\x_P\end{pmatrix}$, a Normal $(0, S)$ random vector, are known to enjoy many analogous properties. (See [7].) To cite two examples, the hypothesis that $X_i(s)$ is independent of $X_j(t)$ for $i \neq j = 1, 2, \cdots P$, any $s, t$, is equivalent to the hypothesis that $F(\omega)$ is diagonal for all $\omega$, while the hypothesis that $x_i$ is independent of $x_j$, for $i \neq j = 1, 2, \cdots P$, is equivalent to the hypothesis that $S$ is diagonal. The conditional expectation of $x_1$, given $x_2, \cdots x_P$, is \begin{equation*}E(x_1\mid x_2, \cdots x_P) = S_{12}S^{-1}_{22}\begin{pmatrix}x_2 \\ \vdots \\ x_P\end{pmatrix}, \quad S = \bigg(\begin{array}{c|c} S_{11} & S_{12} \\ \hline S_{21} & S_{22}\end{array} \bigg).\end{equation*} The corresponding regression problem for stationary Gaussian time series goes as follows. If \begin{equation*}E\{X_1(t)\mid X_2(s), \cdots X_P(s), s = \cdots -1, 0, 1, \cdots\} = \sum^P_{j=2} \sum^\infty_{s=-\infty} b_j(t - s)X_j(s),\end{equation*} then $B(\omega)$, defined by $B(\omega) = (B_2(\omega), \cdots B_P(\omega))$, $B_j(\omega) = \sum^\infty_{s=-\infty} b_j(s)e^{i\omega s}$, satisfies \begin{equation*}B(\omega) = F_{12}(\omega)F_{22}^{-1}(\omega), \quad F(\omega) = \bigg(\begin{array}{c|c}f_{11}(\omega) & F_{12}(\omega) \\ \hline F_{21}(\omega) & F_{22}(\omega)\end{array} \bigg).\end{equation*} It is interesting to ask how well these and similar analogies carry over to sampling theory and hypothesis testing. Goodman [3] gave a heuristic argument to support the conclusion that $\hat{F}_X(\omega_k)$, a suitably formed estimate of the spectral density matrix $F(\omega_k)$, has the complex Wishart distribution. The question is met here by the following results. Firstly, if $\hat{F}_X(\omega_l), l = 1, 2, \cdots M$, are estimates of the spectral density matrix, each consisting of averages of $(2n + 1)$ periodograms based on a record of length $T$, with the $\omega_l$ equally spaced and $(2n + 1)M \leqq \frac{1}{2} T$, then it is possible to construct, on the same sample space as $X(t)$, $M$ independent complex Wishart matrices $\hat{F}_{\bar{\bar{X}}}(\omega_l), l = 1, 2, \cdots M$, such that $\{\hat{F}_X(\omega_l), l = 1, 2, \cdots M\}$ converge simultaneously in mean square to $\{\hat{F}_{\bar{\bar{X}}}(\omega_l), l = 1, 2,\cdots M\}$ as $n, M$ get large. Secondly, it is legitimate to use the natural analogies from multivariate analysis to test hypotheses about time series. One example is presented, as follows. The likelihood ratio test statistic for testing $S$ diagonal is $|\hat{S}|/\prod^P_{i=1} \hat{s}_{ii}$, where $\hat{S} = \{\hat{s}_{ij}\}$ is the sample covariance matrix.
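To make the setup concrete, here is a minimal numerical sketch of a smoothed-periodogram spectral matrix estimate of the kind described above (averages of $2n+1$ periodogram ordinates at $M$ equally spaced frequencies, with $(2n + 1)M \leqq \frac{1}{2}T$). It is not the paper's exact estimator; the function name, band-center convention, and $2\pi$ normalization are assumptions made for illustration.

```python
import numpy as np

def spectral_matrix_estimates(X, n, M):
    """Smoothed-periodogram estimates F_hat(omega_l), l = 1, ..., M.

    X is a P x T real array, one row per component series. Each estimate
    averages 2n+1 periodogram matrices around equally spaced band centers;
    the constraint (2n+1)M <= T/2 keeps the M bands disjoint.
    """
    P, T = X.shape
    assert (2 * n + 1) * M <= T // 2, "need (2n+1)M <= T/2"
    d = np.fft.fft(X, axis=1)  # d[:, j] = sum_t X(t) exp(-2*pi*i*j*t/T)
    # Periodogram matrices I(omega_j) = d(omega_j) d(omega_j)^* / (2*pi*T).
    I = np.einsum('pj,qj->jpq', d, d.conj()) / (2 * np.pi * T)
    F_hat = np.empty((M, P, P), dtype=complex)
    for l in range(M):
        center = (l + 1) * (T // (2 * M))        # one equal-spacing convention
        band = np.arange(center - n, center + n + 1)
        F_hat[l] = I[band].mean(axis=0)          # average of 2n+1 ordinates
    return F_hat

# Example: P = 2, T = 1024, n = 7, M = 16, so (2n+1)M = 240 <= 512.
rng = np.random.default_rng(1)
X = rng.normal(size=(2, 1024))
F_hat = spectral_matrix_estimates(X, n=7, M=16)  # stack of M Hermitian 2x2 matrices
```

Each $\hat{F}_X(\omega_l)$ produced this way is Hermitian, and the approximation of the stack by independent complex Wishart matrices is exactly what the results above make precise.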
The analogous statistic $\psi$ for testing $X_i(s), X_j(t)$ independent, $i \neq j = 1, 2, \cdots P$, from a record of length $T$ is $\psi = \prod^M_{l=1} \lbrack|\hat{F}_X(\omega_l)|/\prod^P_{i=1} \hat{f}_{ii}(\omega_l)\rbrack$, where $\hat{F}_X(\omega_l) = \{\hat{f}_{ij}(\omega_l)\}$ are the sample spectral density matrices as above. Letting $\bar{\bar{\psi}} = \prod^M_{l=1} \lbrack|\hat{F}_{\bar{\bar{X}}}(\omega_l)|/\prod^P_{i=1} \hat{h}_{ii}(\omega_l)\rbrack$, where $\hat{F}_{\bar{\bar{X}}}(\omega_l) = \{\hat{h}_{ij}(\omega_l)\}$ are the independent complex Wishart matrices referred to above, we show $EC_{n,M} |\log \psi - \log \bar{\bar{\psi}}| \rightarrow 0$ for large $n, M$, where $C_{n,M}$ are chosen to make the result non-trivial. The method of proof applies to any statistic which is a product over $l$ of sufficiently smooth functions of the entries of $\hat{F}_X(\omega_l)$. Applications to estimation and testing in the regression problem will appear elsewhere [8]. The distribution theory of functions of complex Wishart matrices has been well investigated by a number of authors [3] [5] [6], and hence can be easily applied here to statistics like $\bar{\bar{\psi}}$. The results above are shown for $P = 2$; it is clear that the proofs extend to any (fixed) finite $P$. The proofs proceed as follows, via a theorem which has somewhat more general application. For each $T$, let $X$ be the $2 \times T$ random matrix $X = \binom{X_1}{X_2} = \begin{pmatrix}X_1(1), \cdots, X_1(T)\\X_2(1), \cdots, X_2(T)\end{pmatrix}$ and let the $2T \times 2T$ covariance matrix $\Sigma$ be given by $\Sigma = \begin{pmatrix}\Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22}\end{pmatrix}$ where $\Sigma_{ij} = EX_i'X_j$. $\{\hat{F}_X(\omega_l)\}$, the sample spectral density matrices described above based on a record of length $T$, are each of the form $\hat{F}_X(\omega_l) = T^{-1}XQX'$ where $Q$ is a $T \times T$ circulant matrix with largest eigenvalue $= T(2n + 1)^{-1}$ (so that $2M \leqq T(2n + 1)^{-1} \ll T$). We define circulant matrices $\bar{\Sigma}_{ij}$ which approximate $\Sigma_{ij}$, and a random matrix $\bar{X}$ on the sample space of $X$, $\bar{X} = \binom{\bar{X}_1}{\bar{X}_2} = \begin{pmatrix}\bar{X}_1(1), \cdots, \bar{X}_1(T)\\\bar{X}_2(1), \cdots, \bar{X}_2(T)\end{pmatrix}$, with $E\bar{X}_i'\bar{X}_j = \bar\Sigma_{ij}$. The $2T$ eigenvalues of the block circulant matrix $\bar\Sigma = \begin{pmatrix}\bar\Sigma_{11} & \bar\Sigma_{12} \\ \bar\Sigma_{21} & \bar\Sigma_{22}\end{pmatrix}$ will be the $2T$ eigenvalues of the $T$ matrices $\{F(2\pi j/T), j = 1, 2, \cdots T\}$. The distribution of random matrices of the form $T^{-1}\bar{X}Q\bar{X}'$, where $Q$ is any circulant matrix, is relatively simple to investigate, due to the fact that all circulant matrices commute and their eigenvalues may be exhibited as simple functions of their elements. Circulant quadratic forms in random vectors with circulant covariance matrices are well known in the literature (see [1] and the references cited there). Let $\hat{F}_{X,Q} = T^{-1}XQX'$ and $\hat{F}_{\bar{X},Q} = T^{-1}\bar{X}Q\bar{X}'$, where $Q$ is now any $T \times T$ (real or complex) quadratic form with largest absolute eigenvalue $\leqq q$.
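Continuing the sketch, the statistic $\psi$ is a product over frequencies of determinant ratios, so it is most stably computed on the log scale. The helper below is again an assumption-laden illustration, reusing the hypothetical `spectral_matrix_estimates` stack from above; it returns $\log \psi$.

```python
import numpy as np

def log_psi(F_hat):
    """log psi = sum over l of [ log|F_hat(omega_l)| - sum_i log f_hat_ii(omega_l) ].

    F_hat is an M x P x P stack of Hermitian spectral matrix estimates.
    Under independence of the component series each factor is near 1, so
    log psi is near 0; dependence drives the determinant ratio below 1.
    """
    _, logdet = np.linalg.slogdet(F_hat)              # per-frequency log|F_hat|
    diag = np.diagonal(F_hat, axis1=1, axis2=2).real  # f_hat_ii are real and positive
    return float(np.sum(logdet - np.log(diag).sum(axis=1)))
```

By Hadamard's inequality, $|\hat{F}_X(\omega_l)| \leqq \prod_i \hat{f}_{ii}(\omega_l)$ for Hermitian nonnegative definite matrices, so every factor lies in $(0, 1]$ and $\log \psi \leqq 0$; large negative values are evidence against independence.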
The main theorem allows the replacement of $X$ by $\bar{X}$ in the analysis, and is that, under the assumptions on $F(\omega)$ and $R(\tau)$, for any $T$, \begin{equation*}\tag{1.1} E \operatorname{tr} (\hat{F}_{X,Q} - \hat{F}_{\bar{X},Q})(\hat{F}_{X,Q} - \hat{F}_{\bar{X}, Q})^{\ast'} \leqq cq^2/T^2\end{equation*} where $c$ is a constant depending only on $F(\omega)$ and $R(\tau)$. A lemma, essentially allowing the replacement of $F(\omega)$ by a suitably chosen step function, together with the application of (1.1), gives the results concerning the $\{\hat{F}_X(\omega_l)\}$ and $\psi$. Since $\hat{R}(\tau)$, the sample (circularized) autocorrelation function, is also of the form $T^{-1}XQX'$ with $Q$ circulant, we obtain an easy corollary on the distribution of $\{\hat{R}(\tau)\}$.
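The circulant facts the proof leans on (all circulants commute; they are diagonalized by the DFT, so their eigenvalues are simple functions of their entries; and the circularized $\hat{R}(\tau)$ is $T^{-1}XQX'$ with $Q$ a circulant permutation) can be checked numerically. The snippet below is an illustration only; the `circulant` helper and the shift convention for $\hat{R}(\tau)$ are assumptions.

```python
import numpy as np

def circulant(c):
    """T x T circulant matrix with first column c: C[i, j] = c[(i - j) % T]."""
    c = np.asarray(c)
    idx = (np.arange(len(c))[:, None] - np.arange(len(c))[None, :]) % len(c)
    return c[idx]

T = 8
rng = np.random.default_rng(0)
Q1 = circulant(rng.normal(size=T))
Q2 = circulant(rng.normal(size=T))

# All circulant matrices commute ...
assert np.allclose(Q1 @ Q2, Q2 @ Q1)

# ... and are diagonalized by the DFT: the eigenvalues of Q1 are the DFT of
# its first column, with the Fourier vectors as common eigenvectors.
lam = np.fft.fft(Q1[:, 0])
Fmat = np.exp(2j * np.pi * np.outer(np.arange(T), np.arange(T)) / T)
assert np.allclose(Q1 @ Fmat, Fmat * lam)

# The circularized sample autocovariance R_hat(tau) = T^{-1} sum_t X(t) X'(t+tau mod T)
# equals T^{-1} X Q X' with Q a cyclic-shift circulant.
tau = 3
e = np.zeros(T)
e[(T - tau) % T] = 1.0
Q_tau = circulant(e)                      # Q_tau[i, j] = 1 iff j = (i + tau) mod T
X = rng.normal(size=(2, T))
R_hat = X @ Q_tau @ X.T / T
R_direct = sum(np.outer(X[:, t], X[:, (t + tau) % T]) for t in range(T)) / T
assert np.allclose(R_hat, R_direct)
```

Because every such $Q$ shares the Fourier eigenvectors, quadratic forms $T^{-1}\bar{X}Q\bar{X}'$ in a process with block circulant covariance reduce, after the DFT, to weighted sums over independent complex quantities, which is what makes their distributions tractable.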

Citation

Grace Wahba. "On the Distribution of Some Statistics Useful in the Analysis of Jointly Stationary Time Series." Ann. Math. Statist. 39 (6) 1849 - 1862, December, 1968. https://doi.org/10.1214/aoms/1177698017

Information

Published: December, 1968
First available in Project Euclid: 27 April 2007

zbMATH: 0279.62027
MathSciNet: MR237070
Digital Object Identifier: 10.1214/aoms/1177698017

Rights: Copyright © 1968 Institute of Mathematical Statistics
