## Abstract

The first-order Gaussian auto-regressive process $(x_t)$ may be defined by the stochastic difference equation \begin{equation*}\tag{1}x_t = \rho x_{t-1} + u_t,\end{equation*} where the $u$'s are NID(0, 1) and $\rho$ is an unknown parameter. The choice of a statistic as an estimator for $\rho$ depends on the initial conditions imposed on the difference equation (1). The so-called "circular" model is obtained by considering a sample of size $N$ and then assuming that $x_{N + 1} = x_1$. An appropriate estimator for $\rho$ in this case is the circular serial correlation coefficient \begin{equation*}\tag{2} r = \frac{\sum^N_{t = 1} x_tx_{t + 1}}{\sum^N_{t = 1} x^2_t}\quad (x_{N + 1} = x_1).\end{equation*} Leipnik [1] has derived an approximate density function \begin{equation*}\tag{3} f(t) = \frac{\Gamma\big(\frac{N + 2}{2}\big)}{\Gamma\big(\frac{N + 1}{2}\big)\Gamma\big(\frac{1}{2}\big)} (1 - 2t\rho + \rho^2)^{-N/2}(1 - t^2)^{(N - 1)/2}\end{equation*} for the estimator $r$. Leipnik also evaluated the first two moments of this distribution. In this paper a formula is obtained which gives $E(r^k)$ as a polynomial of degree $k$ in $\rho$.
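The circular serial correlation coefficient in (2) is straightforward to compute on a simulated path of the process (1). The sketch below is illustrative only (the function name, seed, and parameter values are my own choices, not from the paper); it simulates an AR(1) path with NID(0, 1) errors and evaluates $r$ with the wrap-around convention $x_{N+1} = x_1$:

```python
import numpy as np

def circular_serial_correlation(x):
    """Compute r = sum_t x_t x_{t+1} / sum_t x_t^2 with x_{N+1} = x_1.

    np.roll(x, -1) pairs each x_t with its circular successor x_{t+1}.
    """
    x = np.asarray(x, dtype=float)
    return np.dot(x, np.roll(x, -1)) / np.dot(x, x)

# Simulate x_t = rho * x_{t-1} + u_t with u_t ~ NID(0, 1).
# rho, N, and the seed are arbitrary illustrative values.
rng = np.random.default_rng(0)
rho, N = 0.5, 500
u = rng.standard_normal(N)
x = np.empty(N)
x[0] = u[0] / np.sqrt(1 - rho**2)  # draw x_1 from the stationary distribution
for t in range(1, N):
    x[t] = rho * x[t - 1] + u[t]

r = circular_serial_correlation(x)
```

By the Cauchy–Schwarz inequality the circular sum $\sum x_t x_{t+1}$ is bounded in absolute value by $\sum x_t^2$, so $r$ always lies in $[-1, 1]$, consistent with the support of the density (3).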

## Citation

John S. White. "Approximate Moments for the Serial Correlation Coefficient." Ann. Math. Statist. 28 (3) 798–802, September 1957. https://doi.org/10.1214/aoms/1177706896
