Abstract
The autoregressive moving average model is a stationary stochastic process $\{y_t\}$ satisfying $\sum^p_{k=0} \beta_k y_{t-k} = \sum^q_{g=0} \alpha_g v_{t-g}$, where the (unobservable) process $\{v_t\}$ consists of independently identically distributed random variables. The coefficients in this equation and the variance of $v_t$ are to be estimated from an observed sequence $y_1, \cdots, y_T$. To apply the method of maximum likelihood under normality the model is modified (i) by setting $y_0 = y_{-1} = \cdots = y_{1-p} = 0$ and $v_0 = v_{-1} = \cdots = v_{1-q} = 0$ and alternatively (ii) by setting $y_0 \equiv y_T, \cdots, y_{1-p} \equiv y_{T+1-p}$ and $v_0 \equiv v_T, \cdots, v_{1-q} \equiv v_{T+1-q}$; the former leads to procedures in the time domain and the latter to procedures in the frequency domain. Matrix methods are used for a unified development of the Newton-Raphson and scoring iterative procedures; most of the procedures have been obtained previously by different methods. Estimation of the covariances of the moving average part is also treated.
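To make modification (i) concrete, the following is a minimal sketch (not the paper's algorithm) of how the Gaussian likelihood is evaluated once the pre-sample values $y_0, \cdots, y_{1-p}$ and $v_0, \cdots, v_{1-q}$ are set to zero: the innovations $v_t$ can then be recovered recursively from the observations and the coefficients, and the log-likelihood is that of independent normal variables. The function names and the normalization $\beta_0 = \alpha_0 = 1$ are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def arma_residuals_zero_init(y, beta, alpha):
    """Recover v_1, ..., v_T under modification (i):
    y_0 = ... = y_{1-p} = 0 and v_0 = ... = v_{1-q} = 0.

    beta  = (beta_1, ..., beta_p), alpha = (alpha_1, ..., alpha_q);
    the leading coefficients beta_0 = alpha_0 = 1 are imposed as a
    normalization (an assumption of this sketch).
    """
    y = np.asarray(y, dtype=float)
    T, p, q = len(y), len(beta), len(alpha)
    y_ext = np.concatenate([np.zeros(p), y])   # prepend the p zero initial y's
    v = np.zeros(q + T)                        # first q entries are the zero initial v's
    for i in range(T):                         # i = t - 1 for t = 1, ..., T
        ar_part = y_ext[i + p] + sum(beta[k] * y_ext[i + p - 1 - k] for k in range(p))
        ma_part = sum(alpha[g] * v[i + q - 1 - g] for g in range(q))
        v[i + q] = ar_part - ma_part           # v_t = AR side minus lagged MA terms
    return v[q:]

def gaussian_loglik(y, beta, alpha, sigma2):
    """Gaussian log-likelihood of y_1, ..., y_T under modification (i)."""
    v = arma_residuals_zero_init(y, beta, alpha)
    T = len(v)
    return -0.5 * T * np.log(2 * np.pi * sigma2) - 0.5 * np.sum(v ** 2) / sigma2
```

In this setup the maximum likelihood estimates would be obtained by maximizing `gaussian_loglik` over the coefficients and $\sigma^2$, which is where iterative schemes such as Newton-Raphson and scoring, developed in the paper via matrix methods, enter; modification (ii) instead wraps the data circularly and leads to the frequency-domain procedures.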
Citation
T. W. Anderson. "Estimation for Autoregressive Moving Average Models in the Time and Frequency Domains." Ann. Statist. 5 (5) 842 - 865, September, 1977. https://doi.org/10.1214/aos/1176343942