## Abstract

Among the various tests of independence, those based on the sample correlation coefficient and on $2 \times 2$ contingency tables seem to be foremost in applications. Although the first of these tests only the absence of a linear relation, its motivation stems from the fact that when the bivariate distribution is a member of the normal family, uncorrelatedness is equivalent to independence. A natural question arises: does there exist a wider family of bivariate distributions in which independence is characterized by uncorrelatedness? The answer to this question is given in a recent paper of Lehmann (1966). For multivariate distributions the analogous question is more involved, since pairwise independence is, in general, not enough for mutual independence. In the present paper a simple generalization of the notion of uncorrelatedness is shown to characterize independence in a family of multivariate distributions analogous to the bivariate family considered by Lehmann (1966).

When the data are available in the form of a $2 \times 2$ contingency table, one might regard them either as a simplified version of the data available on a pair of real random variables $(X_1, X_2)$, or as a situation in which the information available to the experimenter consists only of the occurrence or nonoccurrence of the events $X_1 \leqq a$ and $X_2 \leqq b$, where the pair $(a, b)$ is fixed. In both situations one tests the independence of the two events $\lbrack X_1 \leqq a\rbrack$ and $\lbrack X_2 \leqq b\rbrack$, although it may be desirable to test the independence of $X_1$ and $X_2$ themselves. Again, one might ask whether there exists a suitable family of bivariate distributions in which the independence of events of the above type characterizes the independence of the paired random variables. In the present paper such a family is given, and a multivariate analogue of it is shown to possess a similar characterization of independence.
In a recent paper, Esary, Proschan and Walkup (1967) introduced a notion of association which has several applications. Although disjoint from their study, the results of the present paper are in the same setup and supplement it. In order to give a precise summary, let $(X_1, X_2)$ be a pair of real valued random variables with finite second moments. The pair is said to be positively quadrant dependent if \begin{equation*}\tag{0.1}P\lbrack X_1 \leqq x_1, X_2 \leqq x_2\rbrack \geqq P\lbrack X_1 \leqq x_1\rbrack P\lbrack X_2 \leqq x_2\rbrack,\quad\text{for all} \quad x_1, x_2,\end{equation*} and negatively quadrant dependent if the inequality between the two sides of (0.1) is reversed. The bivariate distribution functions which satisfy these restrictions define families $\mathscr{F}_1$ and $\mathscr{G}_1$ respectively. Lehmann (1966) showed that in $\mathscr{F}_1 \cup \mathscr{G}_1$ independence is characterized by uncorrelatedness. In the same paper he also defined a subclass $\mathscr{F}_2 \subset \mathscr{F}_1$ which describes the regression dependence between $X_1$ and $X_2$, and gave several applications to tests of the hypothesis of independence. Recently, Jogdeo and Patil (1967) showed that if $\mathscr{F}_2$ is parametrized suitably, then the independence of $X_1, X_2$ is characterized by the independence of the two events $\lbrack X_1 \leqq a\rbrack$ and $\lbrack X_2 \leqq b\rbrack$, for some $a$ and $b$, with the sole condition that the probabilities of these events be bounded away from 0 and 1. In particular, it was shown that if the dependence between $X_1$ and $X_2$ is described by a linear model \begin{equation*}\tag{0.2}X_1 = \alpha + \beta X_2 + \sigma Z,\end{equation*} where $X_2$ and $Z$ are independent, then the above characterization applies. In the present paper the parametrization is replaced by making the condition of regression dependence symmetric in both variables.
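As a small numerical illustration of condition (0.1), the sketch below checks positive quadrant dependence for a hypothetical discrete bivariate distribution (the joint pmf is an assumption chosen for illustration, not taken from the paper): mass concentrated on the diagonal makes the joint cdf dominate the product of the marginal cdfs at every point.

```python
# Illustrative sketch (not from the paper): numerically verifying positive
# quadrant dependence, condition (0.1), for a hypothetical discrete pmf.
import itertools

# Hypothetical joint pmf for (X1, X2) with mass concentrated on the diagonal.
support = [0, 1, 2]
joint = {
    (0, 0): 0.25, (0, 1): 0.05, (0, 2): 0.00,
    (1, 0): 0.05, (1, 1): 0.20, (1, 2): 0.05,
    (2, 0): 0.00, (2, 1): 0.05, (2, 2): 0.35,
}

def joint_cdf(x1, x2):
    """P[X1 <= x1, X2 <= x2]."""
    return sum(p for (a, b), p in joint.items() if a <= x1 and b <= x2)

def marginal_cdf(axis, x):
    """P[X_axis <= x] for axis 0 (X1) or 1 (X2)."""
    return sum(p for key, p in joint.items() if key[axis] <= x)

# (0.1): joint cdf >= product of marginal cdfs at every support point
# (a small tolerance absorbs floating-point rounding).
pqd = all(
    joint_cdf(x1, x2) >= marginal_cdf(0, x1) * marginal_cdf(1, x2) - 1e-12
    for x1, x2 in itertools.product(support, support)
)
print(pqd)  # True: this pmf lies in the family F_1
```

Reversing the inequality in the `all(...)` check would test membership in $\mathscr{G}_1$ instead.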
In particular, the class $\mathscr{F}_3 \cup \mathscr{G}_3$ discussed by Lehmann (1966) is a subclass of the one considered here. In Section 2, the results stated above are generalized to multivariate distributions. Since it is well known that pairwise independence is not enough for mutual independence, the conditions which characterize mutual independence take various forms and interpretations. The basic characteristic of the class of $n$-variate distributions which yields simple characterizations may be described as follows: if $A_i$ denotes the event $X_i \leqq x_i$ (or $X_i \geqq x_i$), then $P(\cap A_i)$ is either $\geqq \prod P(A_i)$ or $\leqq \prod P(A_i)$, uniformly in the $x_i$. For example, in the family of trivariate distributions which satisfy \begin{equation*}\tag{0.3}P\lbrack X_1 \leqq x_1, X_2 \leqq x_2, X_3 \leqq x_3\rbrack \leqq \prod^3_{i = 1}P\lbrack X_i \leqq x_i\rbrack\quad\text{for all}\quad x_1, x_2, x_3,\end{equation*} independence is characterized by \begin{equation*}\tag{0.4}\begin{aligned} EX_iX_j &= EX_iEX_j,\quad i \neq j;\ i, j = 1, 2, 3, \\ EX_1X_2X_3 &= EX_1EX_2EX_3.\end{aligned}\end{equation*} If the condition (0.3) is strengthened by requiring \begin{equation*}\tag{0.5}h(x_k; x_i, x_j) = P\lbrack X_i \leqq x_i, X_j \leqq x_j | X_k = x_k\rbrack,\quad i \neq j \neq k \neq i,\end{equation*} to be monotone in $x_k$ for every fixed $x_i, x_j$, then the independence of $X_1, X_2$ and $X_3$ is equivalent to that of the three events $\lbrack X_1 \leqq a\rbrack, \lbrack X_2 \leqq b\rbrack$ and $\lbrack X_3 \leqq c\rbrack$, for some $a, b, c$ such that the probabilities of these events are bounded away from 0 and 1.
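The moment conditions (0.4) can be checked mechanically for a discrete distribution. The sketch below (an illustration under assumed pmfs, not the paper's construction) evaluates them for two hypothetical trivariate distributions on $\{0,1\}^3$: independent fair coins, and the classical pairwise-independent-but-not-mutually-independent example $X_3 = X_1 \oplus X_2$, which satisfies the pairwise conditions in (0.4) yet fails the triple one.

```python
# Illustrative sketch (not from the paper): checking the moment conditions
# (0.4) for two hypothetical trivariate pmfs on {0, 1}^3.
import itertools

def moments_factor(pmf):
    """Return True iff every condition in (0.4) holds for pmf, a dict
    mapping (x1, x2, x3) -> probability."""
    def E(f):
        return sum(p * f(x) for x, p in pmf.items())
    e = [E(lambda x, i=i: x[i]) for i in range(3)]
    # Pairwise conditions: E[Xi Xj] = E[Xi] E[Xj] for all i != j.
    pairwise = all(
        abs(E(lambda x, i=i, j=j: x[i] * x[j]) - e[i] * e[j]) < 1e-12
        for i, j in itertools.combinations(range(3), 2)
    )
    # Triple condition: E[X1 X2 X3] = E[X1] E[X2] E[X3].
    triple = abs(E(lambda x: x[0] * x[1] * x[2]) - e[0] * e[1] * e[2]) < 1e-12
    return pairwise and triple

cube = list(itertools.product([0, 1], repeat=3))

# Independent fair coins: all conditions in (0.4) hold.
indep = {x: 1 / 8 for x in cube}

# X3 = X1 XOR X2: pairwise products factor, but E[X1 X2 X3] = 0 != 1/8.
xor = {x: (1 / 4 if x[2] == x[0] ^ x[1] else 0.0) for x in cube}

print(moments_factor(indep), moments_factor(xor))  # True False
```

The XOR example shows concretely why the triple-product condition in (0.4) cannot be dropped: pairwise uncorrelatedness alone, like pairwise independence, does not capture mutual independence.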

## Citation

Kumar Jogdeo. "Characterizations of Independence in Certain Families of Bivariate and Multivariate Distributions." Ann. Math. Statist. 39 (2): 433–441, April 1968. https://doi.org/10.1214/aoms/1177698407
