2017 Sparse transition matrix estimation for high-dimensional and locally stationary vector autoregressive models
Xin Ding, Ziyi Qiu, Xiaohui Chen
Electron. J. Statist. 11(2): 3871-3902 (2017). DOI: 10.1214/17-EJS1325

Abstract

We consider the estimation of the transition matrix in high-dimensional time-varying vector autoregression (TV-VAR) models. Our model builds on a general class of locally stationary VAR processes that evolve smoothly in time. We propose a hybridized kernel smoothing and $\ell^{1}$-regularized method to directly estimate the sequence of time-varying transition matrices. Under a sparsity assumption on the transition matrix, we establish the rate of convergence of the proposed estimator and show that the convergence rate depends on the smoothness of the locally stationary VAR processes only through the smoothness of the transition matrix function. In addition, for our estimator followed by thresholding, we prove that the false positive rate (type I error) and false negative rate (type II error) in pattern recovery can asymptotically vanish in the presence of weak signals, without assuming a minimum nonzero signal strength condition. Favorable finite-sample performance over the $\ell^{2}$-penalized least-squares estimator and the unstructured maximum likelihood estimator is demonstrated on simulated data. We also provide two real-data examples, estimating dependence structures in financial stock prices and economic exchange rates.
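To illustrate the kind of estimator the abstract describes, the sketch below combines kernel smoothing in time with a row-wise $\ell^{1}$-penalized (lasso) regression to estimate the transition matrix $A(t_0)$ of a VAR(1) process at a fixed rescaled time point $t_0 \in [0,1]$. This is a minimal illustration under assumed defaults (Epanechnikov kernel, cyclic coordinate descent as the lasso solver, hand-picked bandwidth and penalty), not the authors' exact procedure; all function names and parameters here are our own.

```python
import numpy as np

def epanechnikov(u):
    # Epanechnikov kernel, a common choice for local smoothing.
    return np.maximum(0.75 * (1.0 - u**2), 0.0)

def tv_var_lasso(X, t0, bandwidth, lam, n_iter=200):
    """Kernel-weighted lasso estimate of the VAR(1) transition matrix A(t0).

    X: (T, p) array of observations, with time rescaled to [0, 1].
    For each coordinate j, solves the weighted lasso
        min_a  sum_t w_t (x_{t,j} - a' x_{t-1})^2  +  lam * ||a||_1
    (weights normalized to sum to 1) by cyclic coordinate descent.
    Assumes bandwidth is large enough that some weights are nonzero.
    """
    T, p = X.shape
    times = np.arange(1, T) / T              # rescaled time of each response
    w = epanechnikov((times - t0) / bandwidth)
    w /= w.sum()                             # normalize local weights
    Y, Z = X[1:], X[:-1]                     # responses and lagged predictors
    G = Z.T @ (w[:, None] * Z)               # weighted Gram matrix
    C = Z.T @ (w[:, None] * Y)               # weighted cross-covariance
    d = np.diag(G)
    A = np.zeros((p, p))
    for j in range(p):                       # j-th row of A(t0)
        a = A[j]
        for _ in range(n_iter):
            for k in range(p):
                # Partial residual correlation for coordinate k,
                # followed by soft-thresholding (the lasso update).
                r = C[k, j] - G[k] @ a + d[k] * a[k]
                a[k] = np.sign(r) * max(abs(r) - lam, 0.0) / d[k]
    return A
```

The thresholding step mentioned in the abstract would then be applied entry-wise, e.g. `A_hat * (np.abs(A_hat) > tau)` for a threshold level `tau`, to recover the sparsity pattern.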

Citation


Xin Ding, Ziyi Qiu, Xiaohui Chen. "Sparse transition matrix estimation for high-dimensional and locally stationary vector autoregressive models." Electron. J. Statist. 11(2): 3871-3902, 2017. https://doi.org/10.1214/17-EJS1325

Information

Received: 1 November 2016; Published: 2017
First available in Project Euclid: 18 October 2017

zbMATH: 06796558
MathSciNet: MR3714301
Digital Object Identifier: 10.1214/17-EJS1325

Subjects:
Primary: 62H12, 62M10
Secondary: 91B84

JOURNAL ARTICLE
32 PAGES

