## Abstract

Let $(R_1, \cdots, R_N)$ be a random vector which takes on each of the $N!$ permutations of $(1, 2, \cdots, N)$ with equal probability $1/N!$. Let $(a_{N1}, \cdots, a_{NN})$ and $(b_{N1}, \cdots, b_{NN})$ be two sets of real numbers given for every $N$. We assume throughout that for no $N$ are the $a_{Ni}$ all equal or the $b_{Ni}$ all equal. We also assume that the $a_{Ni}$ and $b_{Ni}$ have been normalized so that $\sum a_{Ni} = \sum b_{Ni} = 0$ and $\sum a^2_{Ni} = N^{-1} \sum b^2_{Ni} = 1$. Unless otherwise stated, $\sum$ will mean $\sum^N_{i=1}$. Define \begin{equation*}\tag{1.1} S_N = \sum a_{Ni}b_{NR_i}.\end{equation*} Let $\Phi(x)$ be the unit normal c.d.f. In Section 2, sufficient conditions are given for $\mathrm{Pr}\{S_N < x\}$ to approach $\Phi(x)$ as $N \rightarrow \infty$. (The first two moments of $S_N$ are $0$ and $N/(N - 1)$, respectively.) For every $N$, let $Y = (Y_{11}, \cdots, Y_{1N_1}, \cdots, Y_{m1}, \cdots, Y_{mN_m})$ be $N = N_1 + \cdots + N_m$ random variables which are mutually independent and independent of the $R_i$. We assume that all $Y_{ij}$ with the same first subscript are identically distributed. Define $\bar{Y} = N^{-1} \sum_{i,j} Y_{ij}$, $S^2 = N^{-1} \sum_{i,j} (Y_{ij} - \bar{Y})^2$, and $SY'_{ij} = Y_{ij} - \bar{Y}$, with $Y'_{ij} = 0$ if $S = 0$. Let $Y'$ denote the vector of the $Y'_{ij}$. Let $y = (y_1, \cdots, y_N)$ denote a point in $N$-space. By $F_N(x, y)$ we mean the c.d.f. of the random variable $S_N = \sum a_{Ni}y_{R_i}$. Section 3 gives sufficient conditions for convergence with probability one of the random c.d.f. $F_N(x, Y')$ to $\Phi(x)$.
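The moment claims above can be checked numerically. The sketch below (not from the paper; the score sets `a` and `b` are arbitrary illustrative choices) enumerates all $N!$ permutations for a small $N$, computes the exact permutation distribution of $S_N = \sum a_{Ni} b_{NR_i}$ under the stated normalization, and verifies that the first two moments are $0$ and $N/(N-1)$.

```python
import itertools
import math

def s_n(a, b, perm):
    """S_N = sum_i a[i] * b[perm[i]] for one permutation of indices."""
    return sum(ai * b[p] for ai, p in zip(a, perm))

def center_unit(x):
    """Center to sum 0 and scale to sum of squares 1."""
    n = len(x)
    m = sum(x) / n
    c = [xi - m for xi in x]
    ss = math.sqrt(sum(ci * ci for ci in c))
    return [ci / ss for ci in c]

N = 5
# arbitrary non-constant score sets, then normalized as in the abstract:
a = center_unit([1, 2, 3, 4, 5])                       # sum a = 0, sum a^2 = 1
b = [bi * math.sqrt(N) for bi in center_unit([2, 1, 5, 3, 4])]
# now sum b = 0 and N^{-1} sum b^2 = 1

# exact permutation distribution: S_N over all N! equally likely permutations
values = [s_n(a, b, perm) for perm in itertools.permutations(range(N))]
mean = sum(values) / len(values)          # first moment, should be 0
var = sum(v * v for v in values) / len(values)  # second moment, should be N/(N-1)

print(mean, var, N / (N - 1))
```

The variance identity follows because $E[b_{NR_i}b_{NR_j}] = -1/(N-1)$ for $i \neq j$ under the zero-sum normalization, so the cross terms contribute $+1/(N-1)$.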

## Citation

Meyer Dwass. "On the Asymptotic Normality of Some Statistics Used in Non-Parametric Tests." Ann. Math. Statist. 26 (2): 334–339, June 1955. https://doi.org/10.1214/aoms/1177728552