Electronic Journal of Statistics

Change-point detection in high-dimensional covariance structure

Valeriy Avanesov and Nazar Buzun


Abstract

In this paper we introduce a novel approach to the important problem of break detection. Specifically, we are interested in detecting an abrupt change in the covariance structure of a high-dimensional random process, a problem with applications in many areas, e.g., neuroimaging and finance. The developed approach is essentially a testing procedure involving the choice of a critical level. To that end, a non-standard bootstrap scheme is proposed and theoretically justified under mild assumptions. The theoretical study features a result providing guarantees for break detection. All theoretical results are established in a high-dimensional setting (dimensionality $p\gg n$). The multiscale nature of the approach allows for a trade-off between the sensitivity of break detection and the accuracy of change-point localization. The approach can be naturally employed in an online setting. A simulation study demonstrates that the approach matches the nominal false alarm probability and exhibits high power, outperforming a recent approach.
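To make the mechanics concrete, here is a minimal Python sketch of the general recipe the abstract describes: scan candidate break points at several window sizes (scales), compare covariance estimates computed before and after each candidate, and calibrate the critical value of the resulting scan statistic by resampling under the no-change null. The names used here (detect_break, max_cov_diff) are hypothetical, and the plain sample covariances and permutation-based calibration are simplifying assumptions for illustration only; the paper's own procedure builds on precision-matrix estimates and the non-standard bootstrap scheme justified in its theory.

    # Illustrative sketch only (not the authors' exact procedure): a multiscale,
    # resampling-calibrated test for a break in covariance structure.
    import numpy as np

    def max_cov_diff(X, t, w):
        """Sup-norm distance between sample covariances of the w observations
        before and after candidate break point t."""
        left, right = X[t - w:t], X[t:t + w]
        return np.max(np.abs(np.cov(left, rowvar=False) - np.cov(right, rowvar=False)))

    def detect_break(X, windows=(20, 40), alpha=0.05, n_boot=200, seed=0):
        """Scan candidate break points at several window sizes (scales).
        Critical values are calibrated by permuting the sample, which mimics
        a no-change null; the paper uses a more refined bootstrap scheme."""
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        stats = {w: [max_cov_diff(X, t, w) for t in range(w, n - w + 1)]
                 for w in windows}

        # Per-scale critical value: (1 - alpha)-quantile of the scan maximum
        # computed on row-permuted (approximately change-free) data.
        crit = {}
        for w in windows:
            boot_max = []
            for _ in range(n_boot):
                Xb = X[rng.permutation(n)]
                boot_max.append(max(max_cov_diff(Xb, t, w)
                                    for t in range(w, n - w + 1)))
            crit[w] = np.quantile(boot_max, 1 - alpha)

        # Flag a break if any scale exceeds its critical value; report the
        # arg-max location at the smallest rejecting scale.
        for w in sorted(windows):
            s = np.asarray(stats[w])
            if s.max() > crit[w]:
                return w + int(s.argmax())   # estimated change-point index
        return None

    # Toy usage: the covariance changes halfway through the sample.
    rng = np.random.default_rng(1)
    p = 5
    A = rng.standard_normal((p, p))
    Sigma2 = A @ A.T / p + np.eye(p)
    X = np.vstack([rng.standard_normal((100, p)),
                   rng.multivariate_normal(np.zeros(p), Sigma2, size=100)])
    print(detect_break(X))

In this sketch, smaller windows localize a detected break more precisely while larger windows accumulate more evidence and are more sensitive, which is the trade-off between sensitivity and localization that the abstract refers to.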

Article information

Source
Electron. J. Statist., Volume 12, Number 2 (2018), 3254-3294.

Dates
Received: May 2017
First available in Project Euclid: 5 October 2018

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1538705038

Digital Object Identifier
doi:10.1214/18-EJS1484

Subjects
Primary: 62M10: Time series, auto-correlation, regression, etc. [See also 91B84]; 62H15: Hypothesis testing
Secondary: 91B84: Economic time series analysis [See also 62M10]; 62P10: Applications to biology and medical sciences

Keywords
Multiscale, bootstrap, structural change, critical value, precision matrix

Rights
Creative Commons Attribution 4.0 International License.

Citation

Avanesov, Valeriy; Buzun, Nazar. Change-point detection in high-dimensional covariance structure. Electron. J. Statist. 12 (2018), no. 2, 3254--3294. doi:10.1214/18-EJS1484. https://projecteuclid.org/euclid.ejs/1538705038



References

  • [1] Alexander Aue, Siegfried Hörmann, Lajos Horváth, and Matthew Reimherr. Break detection in the covariance structure of multivariate time series models. Ann. Statist., 37(6B):4046–4087, 2009.
  • [2] Alexander Aue and Lajos Horváth. Structural breaks in time series. Journal of Time Series Analysis, 34(1):1–16, 2013.
  • [3] Valeriy Avanesov, Jörg Polzehl, and Karsten Tabelow. Consistency results and confidence intervals for adaptive l1-penalized estimators of the high-dimensional sparse precision matrix. Technical Report 2229, WIAS, 2016.
  • [4] Danielle S. Bassett, Nicholas F. Wymbs, Mason A. Porter, Peter J. Mucha, Jean M. Carlson, and Scott T. Grafton. Dynamic reconfiguration of human brain networks during learning. Proceedings of the National Academy of Sciences, 108(18):7641, 2010.
  • [5] Peter Bauer and Peter Hackl. An extension of the MOSUM technique for quality control. 22:1–7, 1980.
  • [6] Luc Bauwens, Sébastien Laurent, and Jeroen V. K. Rombouts. Multivariate GARCH models: a survey. Journal of Applied Econometrics, 21(1):79–109, 2006.
  • [7] Gérard Biau, Kevin Bleakley, and David M. Mason. Long signal change-point detection. Electron. J. Statist., 10(2):2097–2123, 2016.
  • [8] Victor Chernozhukov, Denis Chetverikov, and Kengo Kato. Comparison and anti-concentration bounds for maxima of Gaussian random vectors, 2013.
  • [9] Victor Chernozhukov, Denis Chetverikov, and Kengo Kato. Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors. Ann. Statist., 41(6):2786–2819, 2013.
  • [10] Victor Chernozhukov, Denis Chetverikov, and Kengo Kato. Central limit theorems and bootstrap in high dimensions, 2014.
  • [11] Haeran Cho. Change-point detection in panel data via double CUSUM statistic. Electron. J. Statist., 10(2):2000–2038, 2016.
  • [12] Haeran Cho and Piotr Fryzlewicz. Multiple-change-point detection for high dimensional time series via sparsified binary segmentation. Journal of the Royal Statistical Society Series B, 77(2):475–507, 2015.
  • [13] Mihaela Şerban, Anthony Brockwell, John Lehoczky, and Sanjay Srivastava. Modelling the dynamic dependence structure in multivariate financial time series. Journal of Time Series Analysis, 28(5):763–782, 2007.
  • [14] M. Csörgö and L. Horváth. Limit theorems in change-point analysis. Wiley Series in Probability and Statistics. J. Wiley & Sons, Chichester, New York, 1997.
  • [15] Birte Eichinger and Claudia Kirch. A MOSUM procedure for the estimation of multiple random change points. Bernoulli, 24(1):526–564, 2018.
  • [16] Robert F. Engle, Victor K. Ng, and Michael Rothschild. Asset pricing with a factor-ARCH covariance structure: Empirical estimates for treasury bills. Journal of Econometrics, 45(1–2):213–237, 1990.
  • [17] Jianqing Fan, Yang Feng, and Yichao Wu. Network exploration via the adaptive lasso and SCAD penalties. Ann. Appl. Stat., 3(2):521–541, 2009.
  • [18] Jianqing Fan and Runze Li. Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, 96:1348–1360, 2001.
  • [19] Emily S. Finn, Xilin Shen, Dustin Scheinost, Monica D. Rosenberg, Jessica Huang, Marvin M. Chun, Xenophon Papademetris, and R. Todd Constable. Functional connectome fingerprinting: identifying individuals using patterns of brain connectivity. Nature Neuroscience, 18:1664–1671, 2015.
  • [20] Jerome Friedman, Trevor Hastie, and Robert Tibshirani. Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9(3):432–441, 2008.
  • [21] Jerome Friedman, Trevor Hastie, and Robert Tibshirani. Sparse inverse covariance estimation with the graphical lasso, pages 1–14, 2007.
  • [22] Karl J. Friston. Functional and effective connectivity: A review. Brain Connectivity, 1(1):13–36, 2011.
  • [23] S. Holm. A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6:65–70, 1979.
  • [24] Daniel Hsu, Sham M. Kakade, and Tong Zhang. A tail inequality for quadratic forms of subgaussian random vectors. Electronic Communications in Probability, 17, 2012.
  • [25] Nicholas A. James and David S. Matteson. ecp: An R package for nonparametric multiple change point analysis of multivariate data. Journal of Statistical Software, 62(7):1–25, 2014.
  • [26] Jana Janková and Sara van de Geer. Confidence intervals for high-dimensional inverse covariance estimation. Electron. J. Statist., 9(1):1205–1229, 2015.
  • [27] Jana Janková and Sara van de Geer. Honest confidence regions and optimality in high-dimensional precision matrix estimation. TEST, 26(1):143–162, 2017.
  • [28] Moritz Jirak. Uniform change point tests in high dimension. Ann. Statist., 43(6):2451–2483, 2015.
  • [29] M. Lavielle and G. Teyssière. Detection of multiple change-points in multivariate time series. Lithuanian Mathematical Journal, 46(3):287–306, 2006.
  • [30] Jun Li and Song Xi Chen. Two sample tests for high-dimensional covariance matrices. Ann. Statist., 40(2):908–940, 2012.
  • [31] David S. Matteson and Nicholas A. James. A nonparametric approach for multiple change point analysis of multivariate data. Journal of the American Statistical Association, 109(505):334–345, 2014.
  • [32] Nicolai Meinshausen and Peter Bühlmann. High-dimensional graphs and variable selection with the lasso. Ann. Statist., 34(3):1436–1462, 2006.
  • [33] Thomas Mikosch, Søren Johansen, and Eric Zivot. Handbook of Financial Time Series, 2009.
  • [34] Fedor Nazarov. On the maximal perimeter of a convex set in $\mathbb{R}^n$ with respect to a Gaussian measure, pages 169–187. Springer Berlin Heidelberg, Berlin, Heidelberg, 2003.
  • [35] Russell A. Poldrack, Jeanette A. Mumford, and Thomas E. Nichols. Handbook of functional MRI data analysis. Cambridge University Press, 2011.
  • [36] Sebastian Puschmann, André Brechmann, and Christiane M. Thiel. Learning-dependent plasticity in human auditory cortex during appetitive operant conditioning. Human Brain Mapping, 34(11):2841–2851, 2013.
  • [37] Pradeep Ravikumar, Martin J. Wainwright, Garvesh Raskutti, and Bin Yu. High-dimensional covariance estimation by minimizing l1-penalized log-determinant divergence. Electron. J. Statist., 5:935–980, 2011.
  • [38] A. N. Shiryaev. Optimal Stopping Rules. Stochastic Modelling and Applied Probability. Springer Berlin Heidelberg, 2007.
  • [39] V. Spokoiny and N. Willrich. Bootstrap tuning in ordered model selection. ArXiv e-prints, July 2015.
  • [40] O. Sporns. Networks of the brain. The MIT Press, 2011.
  • [41] Yao Xie and David Siegmund. Sequential multi-sensor change-point detection. Ann. Statist., 41(2):670–692, 2013.
  • [42] Changliang Zou, Guosheng Yin, Long Feng, and Zhaojun Wang. Nonparametric maximum likelihood approach to multiple change-point problems. Ann. Statist., 42(3):970–1002, 2014.
  • [43] H. Zou. The adaptive lasso and its oracle properties. Journal of the American Statistical Association, 101(476):1418–1429, 2006.
  • [44] Hui Zou and Runze Li. One-step sparse estimates in nonconcave penalized likelihood models. Ann. Statist., 36(4):1509–1533, 2008.