Open Access
June 2017
Generalized Mahalanobis depth in point process and its application in neural coding
Shuyi Liu, Wei Wu
Ann. Appl. Stat. 11(2): 992-1010 (June 2017). DOI: 10.1214/17-AOAS1030

Abstract

In this paper, we propose to generalize the notion of depth to temporal point process observations. The new depth is defined as a weighted product of two probability terms: (1) one for the number of events in each process, and (2) one for the center-outward ranking of the event times conditioned on the number of events. In this study, we adopt the Poisson distribution for the first term and the Mahalanobis depth for the second term. We propose an efficient bootstrapping approach to estimate the parameters in the defined depth. In the case of a Poisson process, the observed event times are order statistics, and the parameters can be estimated robustly with respect to sample size. We demonstrate the use of the new depth by ranking realizations from a Poisson process. We also test the new method in classification problems using simulations as well as real neural spike train data. The new framework is found to provide more accurate and robust classifications compared to commonly used likelihood methods.
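The abstract describes the depth as a weighted product of a count term and a conditional Mahalanobis depth on the event times. The sketch below is not the authors' implementation; it only illustrates that structure in Python, with an assumed geometric weighting w, the classical Mahalanobis depth formula, and plain moment estimates of the conditional mean and covariance standing in for the paper's bootstrapping procedure.

    # Minimal sketch (assumptions, not the paper's code): combine a Poisson
    # probability on the event count with a Mahalanobis depth on the event
    # times given that count, using an assumed geometric weight w.
    import numpy as np
    from scipy.stats import poisson

    def mahalanobis_depth(x, mean, cov):
        """Classical Mahalanobis depth: 1 / (1 + (x - mu)' Sigma^{-1} (x - mu))."""
        diff = np.asarray(x) - np.asarray(mean)
        d2 = diff @ np.linalg.solve(cov, diff)
        return 1.0 / (1.0 + d2)

    def point_process_depth(event_times, rate, T, mean_times, cov_times, w=0.5):
        """Hypothetical combined depth for one realization on [0, T]."""
        n = len(event_times)
        count_term = poisson.pmf(n, rate * T)              # term (1): event count
        time_term = mahalanobis_depth(event_times, mean_times, cov_times)  # term (2)
        return count_term ** w * time_term ** (1.0 - w)

    # Toy usage: rank realizations of a homogeneous Poisson process with 3 events.
    rng = np.random.default_rng(0)
    T, lam, n = 1.0, 3.0, 3
    samples = np.sort(rng.uniform(0, T, size=(500, n)), axis=1)  # order statistics
    mu, Sigma = samples.mean(axis=0), np.cov(samples, rowvar=False)
    depths = [point_process_depth(s, lam, T, mu, Sigma) for s in samples]
    print(samples[np.argmax(depths)])  # the "deepest" (most central) realization

In this toy example, the realization with event times closest to the conditional mean of the order statistics receives the highest depth; the count term is constant here because all simulated realizations have the same number of events.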

Citation


Shuyi Liu, Wei Wu. "Generalized Mahalanobis depth in point process and its application in neural coding." Ann. Appl. Stat. 11 (2): 992-1010, June 2017. https://doi.org/10.1214/17-AOAS1030

Information

Received: 1 December 2016; Revised: 1 February 2017; Published: June 2017
First available in Project Euclid: 20 July 2017

zbMATH: 06775901
MathSciNet: MR3693555
Digital Object Identifier: 10.1214/17-AOAS1030

Keywords: Mahalanobis depth, neural coding, point process, Poisson process, spike train

Rights: Copyright © 2017 Institute of Mathematical Statistics

Vol. 11 • No. 2 • June 2017