Open Access
VOL. 49 | 2006
Characterizations of joint distributions, copulas, information, dependence and decoupling, with applications to time series
Victor H. de la Peña, Rustam Ibragimov, Shaturgun Sharakhmetov

Editor(s) Javier Rojo


In this paper, we obtain general representations for the joint distributions and copulas of arbitrary dependent random variables absolutely continuous with respect to the product of given one-dimensional marginal distributions. The characterizations obtained in the paper represent joint distributions of dependent random variables and their copulas as sums of U-statistics in independent random variables. We show that similar results also hold for expectations of arbitrary statistics in dependent random variables. As a corollary of the results, we obtain new representations for multivariate divergence measures as well as complete characterizations of important classes of dependent random variables that give, in particular, methods for constructing new copulas and modeling different dependence structures.
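As a concrete (and deliberately simple) illustration of the kind of representation described above — a joint density written as the product of the marginals times one plus a sum of products of mean-zero functions — consider the Farlie–Gumbel–Morgenstern copula. This example is not taken from the paper; it is a hedged sketch of the general idea, with all names and the parameter value chosen for illustration.

```python
import numpy as np

# Illustrative sketch: the FGM copula density
#   c(u, v) = 1 + theta * (1 - 2u)(1 - 2v),   |theta| <= 1,
# has the form "product of uniform marginals times (1 + g1(u) g2(v))"
# with E[g_i(U)] = 0, a simple instance of the expansions characterized
# for joint distributions absolutely continuous w.r.t. the product measure.

theta = 0.5  # illustrative dependence parameter

def fgm_density(u, v):
    return 1.0 + theta * (1.0 - 2.0 * u) * (1.0 - 2.0 * v)

# Riemann-sum checks on a uniform grid over the unit square:
# total mass is 1 and each marginal stays uniform, so c is a copula density.
grid = np.linspace(0.0, 1.0, 2001)
U, V = np.meshgrid(grid, grid, indexing="ij")
c = fgm_density(U, V)

total_mass = c.mean()          # average over [0,1]^2 approximates the integral
u_marginal = c.mean(axis=1)    # for each u, integrates the v-direction

print(round(float(total_mass), 6))       # 1.0
print(round(float(u_marginal.max()), 6)) # 1.0
```

The mean-zero factors `(1 - 2u)` and `(1 - 2v)` play the role of the building blocks in the U-statistic representation: setting `theta = 0` recovers independence, and varying it models different dependence structures.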

The results obtained in the paper provide a device for reducing the analysis of convergence in distribution of a sum of a double array of dependent random variables to the study of weak convergence for a double array of their independent copies. Weak convergence in the dependent case is implied by similar asymptotic results under independence together with convergence to zero of one of a series of dependence measures, including the multivariate extension of Pearson's correlation, the relative entropy, or other multivariate divergence measures. A closely related result gives conditions for convergence in distribution of $m$-dimensional statistics $h(X_t, X_{t+1}, \ldots, X_{t+m-1})$ of time series $\{X_t\}$ in terms of weak convergence of $h(\xi_t, \xi_{t+1},\ldots, \xi_{t+m-1})$, where $\{\xi_t\}$ is a sequence of independent copies of the $X_t$'s, together with convergence to zero of measures of intertemporal dependence in $\{X_t\}$. The tools used include new sharp estimates for the distance between the distribution function of an arbitrary statistic in dependent random variables and the distribution function of the statistic in independent copies of the random variables, in terms of the measures of dependence of the random variables. Furthermore, we obtain new sharp complete decoupling moment and probability inequalities for dependent random variables in terms of their dependence characteristics.
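Two of the dependence measures named above can be sketched in the discrete case. The toy joint pmf below is hypothetical (chosen only for illustration, not the paper's notation): the relative entropy (Kullback–Leibler mutual information) and Pearson's phi-squared coefficient both compare the joint distribution to the product of its marginals — i.e., to the distribution of independent copies — and both vanish exactly under independence, which is what lets asymptotics for the dependent array be reduced to the independent one.

```python
import numpy as np

# Hypothetical 2x2 joint pmf of a dependent pair (X, Y).
p = np.array([[0.20, 0.10],
              [0.10, 0.60]])

px = p.sum(axis=1, keepdims=True)   # marginal pmf of X
py = p.sum(axis=0, keepdims=True)   # marginal pmf of Y
q = px * py                         # pmf of independent copies (xi, eta)

# Relative entropy (KL mutual information): sum p * log(p / (p_X p_Y)).
mutual_info = float(np.sum(p * np.log(p / q)))

# Multivariate Pearson coefficient phi^2: sum (p - p_X p_Y)^2 / (p_X p_Y).
phi2 = float(np.sum((p - q) ** 2 / q))

print(mutual_info > 0.0, phi2 > 0.0)  # True True: (X, Y) are dependent
```

When `p` itself equals `q`, both measures are exactly zero; their convergence to zero along a double array is the kind of condition under which the dependent case inherits the limit theorem from the independent copies.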


Published: 1 January 2006
First available in Project Euclid: 28 November 2007

zbMATH: 1268.62050
MathSciNet: MR2337835

Digital Object Identifier: 10.1214/074921706000000455

Primary: 62E10 , 62H05 , 62H20
Secondary: 60E05 , 62B10 , 62F12 , 62G20

Keywords: convergence , copulas , decoupling , dependence , divergence measures , Hellinger distance , information , joint distribution , Kullback–Leibler and Shannon mutual information , Pearson coefficient , relative entropy

Rights: Copyright © 2006, Institute of Mathematical Statistics

