Institute of Mathematical Statistics Lecture Notes - Monograph Series

Characterizations of joint distributions, copulas, information, dependence and decoupling, with applications to time series

Victor H. de la Peña, Rustam Ibragimov, and Shaturgun Sharakhmetov

Full-text: Open access

Abstract

In this paper, we obtain general representations for the joint distributions and copulas of arbitrary dependent random variables whose joint distribution is absolutely continuous with respect to the product of given one-dimensional marginal distributions. The characterizations obtained in the paper represent joint distributions of dependent random variables and their copulas as sums of U-statistics in independent random variables. We show that similar results also hold for expectations of arbitrary statistics in dependent random variables. As a corollary of the results, we obtain new representations for multivariate divergence measures as well as complete characterizations of important classes of dependent random variables that give, in particular, methods for constructing new copulas and modeling different dependence structures.
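As an illustration of this style of construction (a standard example, not taken from the paper itself), the Farlie--Gumbel--Morgenstern family builds a copula from the independence copula $uv$ plus a single product-type correction term, which is the simplest instance of the multiplicative representations the abstract describes. A minimal sketch:

```python
import numpy as np

def fgm_copula(u, v, theta=0.5):
    """Farlie-Gumbel-Morgenstern copula:
    C(u, v) = u*v*(1 + theta*(1-u)*(1-v)), a valid copula for theta in [-1, 1].
    The bracketed factor plays the role of the correction term added to the
    independence copula u*v."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    return u * v * (1.0 + theta * (1.0 - u) * (1.0 - v))

# Boundary conditions every copula must satisfy:
assert np.isclose(fgm_copula(1.0, 0.7), 0.7)   # C(1, v) = v
assert np.isclose(fgm_copula(0.3, 1.0), 0.3)   # C(u, 1) = u
assert np.isclose(fgm_copula(0.0, 0.9), 0.0)   # C(0, v) = 0
assert np.isclose(fgm_copula(0.5, 0.5, theta=0.0), 0.25)  # theta = 0 gives independence
```

Setting `theta = 0` recovers the independence copula, so the parameter directly indexes the departure from independence, in the spirit of the dependence characterizations above.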

The results obtained in the paper provide a device for reducing the analysis of convergence in distribution of a sum of a double array of dependent random variables to the study of weak convergence for a double array of their independent copies. Weak convergence in the dependent case is implied by similar asymptotic results under independence, together with convergence to zero of one of a series of dependence measures, including the multivariate extension of Pearson's correlation, the relative entropy, or other multivariate divergence measures. A closely related result gives conditions for convergence in distribution of $m$-dimensional statistics $h(X_t, X_{t+1}, \ldots, X_{t+m-1})$ of time series $\{X_t\}$ in terms of weak convergence of $h(\xi_t, \xi_{t+1},\ldots, \xi_{t+m-1})$, where $\{\xi_t\}$ is a sequence of independent copies of the $X_t$'s, and convergence to zero of measures of intertemporal dependence in $\{X_t\}$. The tools used include new sharp estimates for the distance between the distribution function of an arbitrary statistic in dependent random variables and the distribution function of the statistic in independent copies of the random variables, in terms of the measures of dependence of the random variables. Furthermore, we obtain new sharp complete decoupling moment and probability inequalities for dependent random variables in terms of their dependence characteristics.
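The decoupling idea above can be illustrated numerically (a sketch under assumed settings, not the paper's construction): simulate a dependent series, form an "independent copy" with the same marginal, and check that a plug-in estimate of Shannon mutual information, a discretized relative-entropy dependence measure, separates the two cases for the pairs $(X_t, X_{t+1})$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A dependent time series (AR(1) with coefficient phi, an assumed example model)
# and an "independent copy" obtained by permuting it: same marginal distribution,
# serial dependence destroyed.
n, phi = 50_000, 0.6
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]
xi = rng.permutation(x)

def mutual_information(a, b, bins=20):
    """Plug-in estimate of Shannon mutual information from a 2D histogram:
    the relative entropy between the joint cell probabilities and the
    product of the marginal cell probabilities."""
    p_ab, _, _ = np.histogram2d(a, b, bins=bins)
    p_ab = p_ab / p_ab.sum()
    p_a = p_ab.sum(axis=1, keepdims=True)   # marginal of a
    p_b = p_ab.sum(axis=0, keepdims=True)   # marginal of b
    mask = p_ab > 0
    return float((p_ab[mask] * np.log(p_ab[mask] / (p_a @ p_b)[mask])).sum())

mi_dep = mutual_information(x[:-1], x[1:])    # dependent pairs (X_t, X_{t+1})
mi_ind = mutual_information(xi[:-1], xi[1:])  # independent-copy pairs
assert mi_dep > mi_ind  # the dependence measure vanishes only under independence
```

When the estimated dependence measure is close to zero, distributional statements about statistics of the series can be transferred from the independent-copy array, which is the reduction the paper makes precise.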

Chapter information

Source
Javier Rojo, ed., Optimality: The Second Erich L. Lehmann Symposium (Beachwood, Ohio, USA: Institute of Mathematical Statistics, 2006), 183--209

Dates
First available in Project Euclid: 28 November 2007

Permanent link to this document
https://projecteuclid.org/euclid.lnms/1196283961

Digital Object Identifier
doi:10.1214/074921706000000455

Subjects
Primary: 62E10: Characterization and structure theory; 62H05: Characterization and structure theory (multivariate analysis); 62H20: Measures of association (correlation, canonical correlation, etc.)
Secondary: 60E05: Distributions: general theory; 62B10: Information-theoretic topics [See also 94A17]; 62F12: Asymptotic properties of estimators; 62G20: Asymptotic properties

Keywords
joint distribution; copulas; information; dependence; decoupling; convergence; relative entropy; Kullback--Leibler and Shannon mutual information; Pearson coefficient; Hellinger distance; divergence measures

Rights
Copyright © 2006, Institute of Mathematical Statistics

Citation

de la Peña, Victor H.; Ibragimov, Rustam; Sharakhmetov, Shaturgun. Characterizations of joint distributions, copulas, information, dependence and decoupling, with applications to time series. Optimality, 183--209, Institute of Mathematical Statistics, Beachwood, Ohio, USA, 2006. doi:10.1214/074921706000000455. https://projecteuclid.org/euclid.lnms/1196283961
