## Abstract

Let $I$ be a countable set with elements $i, j, k, \cdots$. Elements of $I$ are called "types." Let $T$ be the collection of all sequences $z = \{z(i) : i \varepsilon I\}$ of non-negative integers, all but at most finitely many of which are 0. We shall identify elements of $T$ with real-valued functions defined on $I$. The sum of two elements of $T$ is again an element of $T$. The zero function shall be denoted by 0, and the function which takes the value 1 at $i$ and 0 elsewhere shall be designated by $e_i$. Our branching process is a Markov chain $Z_0, Z_1, Z_2, \cdots$ with state space $T$ and stationary transition probabilities described as follows. The conditional distribution of $Z_{n + 1}$ given $Z_n = e_i$ is $p_i$, where $p_i$ is a probability distribution on $T$. Let $z \varepsilon T$ with $z(i_1) = n_1, \cdots, z(i_k) = n_k$, and $z(i) = 0$ if $i$ is not one of $i_1, \cdots, i_k$. Then the conditional distribution of $Z_{n + 1}$ given $Z_n = z$ is the distribution of a sum of $n_1 + \cdots + n_k$ independent random variables taking values in $T$, of which $n_1$ have distribution $p_{i_1}, \cdots$, and $n_k$ have distribution $p_{i_k}$. Finally, $\mathrm{Pr}\{Z_{n + 1} = 0 \mid Z_n = 0\} = 1$ completes the description of the transition probabilities of the process. Let $Z_n(i)$ be the $i$th component of $Z_n$, i.e., $Z_n(i) = z(i)$ if $Z_n = z$. Then $Z_n(i)$ represents the size of the population of type $i$ in the $n$th generation, and $\sum_i Z_n(i)$ represents the size of the total population of the $n$th generation. Let $(m_{ij})$ be the expectation matrix of the process, i.e., $m_{ij} = E\{Z_{n + 1}(j) \mid Z_n = e_i\}$. Let $m^{(n)}_{ij}$ be defined inductively by $m^{(1)}_{ij} = m_{ij},\quad m^{(n + 1)}_{ij} = \sum_k m^{(n)}_{ik} m_{kj}.$ We shall assume that all $m^{(n)}_{ij}$ are finite and that the matrix $(m_{ij})$ is irreducible.
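The transition mechanism above can be sketched in a few lines of Python. The offspring sampler below is a hypothetical stand-in for the distributions $p_i$ (each type-$i$ particle producing 0, 1, or 2 children of type $i$ or $i+1$), chosen purely for illustration and not taken from the paper; states $z \varepsilon T$ are represented as `Counter` objects with finitely many nonzero entries.

```python
import random
from collections import Counter

def sample_offspring(i):
    """Draw one offspring state from a hypothetical distribution p_i:
    0, 1, or 2 children, all of type i or of type i + 1."""
    n = random.choice([0, 1, 2])
    if n == 0:
        return Counter()
    child_type = random.choice([i, i + 1])
    return Counter({child_type: n})

def step(z):
    """One transition Z_n -> Z_{n+1}: each of the z(i) particles of each
    type i reproduces independently, and the offspring states are summed.
    The empty state is absorbing: Pr{Z_{n+1} = 0 | Z_n = 0} = 1."""
    nxt = Counter()
    for i, count in z.items():
        for _ in range(count):
            nxt += sample_offspring(i)
    return nxt

z = Counter({0: 3})        # Z_0 = 3 e_0: three particles of type 0
for _ in range(5):
    z = step(z)
```

Because a `Counter` sum of `Counter`s is again a `Counter` with finitely many nonzero entries, the state space $T$ is preserved by `step`.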
For an extended real-valued function $f$ defined on $I$, we define functions $Mf, fM$ by $Mf(i) = \sum_j m_{ij}f(j);\quad fM(j) = \sum_i f(i)m_{ij},$ whenever the right sides of the equalities are well defined. Note that $Mf, fM$ are always well defined if $f$ is non-negative, since the $m_{ij}$ are non-negative. (We shall always adopt the following conventions concerning addition and multiplication involving $\infty$: \begin{align*}a + \infty &= \infty + a = \infty\quad\text{if}\quad a > -\infty,\\ a \cdot \infty &= \infty \cdot a = \infty\quad\text{if}\quad a > 0,\\ a \cdot \infty &= \infty \cdot a = 0\quad\text{if}\quad a = 0.\end{align*}) Therefore, we have \begin{equation*}\tag{1}E\{Z_{k + n}(j) \mid Z_k = e_i\} = m^{(n)}_{ij}\end{equation*} and \begin{equation*}\tag{2}E\{Z_{k + n}(j) \mid Z_k = z\} = zM^n(j).\end{equation*} If $f$ is a non-negative function, then \begin{equation*}\tag{3}E\{\sum_j f(j)Z_{k + n}(j) \mid Z_k = z\} = \sum_j zM^n(j)f(j) = \sum_i z(i)M^nf(i).\end{equation*} In [6] it is shown that there is a unique non-negative number $r$ which is the common radius of convergence of the power series $\sum^\infty_{n = 1} m^{(n)}_{ij} s^n$. We shall always assume $r > 0$. In this paper we extend the far-reaching theorem of Everett, Ulam and Harris for a branching process with finitely many types to a branching process with countably many types. Let us first describe the theorem just mentioned. For a branching process with $k$ types, the expectation matrix $(m_{ij})$ is a $k \times k$ matrix, assumed to be positive regular. It has a largest positive eigenvalue $\rho$ with corresponding right and left eigenvectors $u, v$. If we normalize $Z_n$ by dividing by $\rho^n$, then, for $\rho > 1$, the sequence $\{Z_n\rho^{-n}\}$ of vector-valued random variables converges with probability 1 to a vector-valued random variable $W$. Furthermore, the direction of $W$, if $W \neq 0$, coincides with that of $v$.
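For a finite (hence truncated) expectation matrix, the iterates $m^{(n)}_{ij}$ and the row action $zM^n(j)$ appearing in (2) and (3) can be computed directly from their inductive definitions; the $2 \times 2$ matrix below is an arbitrary illustration, not taken from the paper.

```python
# Hypothetical finite truncation of the expectation matrix (m_ij).
M = [[0.5, 0.3],
     [0.2, 0.6]]

def mat_mul(A, B):
    """Ordinary matrix product, written out over the index k."""
    k = len(B)
    return [[sum(A[i][t] * B[t][j] for t in range(k))
             for j in range(len(B[0]))] for i in range(len(A))]

def m_power(M, n):
    """m^{(n)} defined inductively: m^{(1)} = M, m^{(n+1)} = m^{(n)} M."""
    P = M
    for _ in range(n - 1):
        P = mat_mul(P, M)
    return P

def zMn(z, M, n):
    """Row action of (2): (zM^n)(j) = sum_i z(i) m^{(n)}_{ij},
    i.e. E{Z_{k+n}(j) | Z_k = z} when z lists the initial counts."""
    P = m_power(M, n)
    k = len(z)
    return [sum(z[i] * P[i][j] for i in range(k)) for j in range(k)]
```

With $z = e_i$ the function `zMn` reduces to row $i$ of $m^{(n)}$, which is identity (1).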
The theorem has been proven under some conditions on the existence of moments of $Z_1$ in [1], [2], [3], and recently Kesten and Stigum proved it under less stringent conditions [4]; indeed, the theorem attained its best possible form in their work. All these works are based on the ergodic behavior of $(m^{(n)}_{ij})$, which is described completely by the classical Perron–Frobenius theory of positive matrices. For an infinite, irreducible, non-negative $(m_{ij})$, the number $r^{-1}$ takes the place of the $\rho$ of the finite case, in view of the fact that for the finite case the radii of convergence of the series $\sum^\infty_{n = 1} m^{(n)}_{ij}s^n$ are all equal to $\rho^{-1}$. The ergodic behavior of $(m^{(n)}_{ij})$ for an infinite matrix is far more complicated than its counterpart for a finite matrix; however, it has been worked out in great detail in [5]. There are two mutually exclusive and exhaustive cases: Case I, $\sum^\infty_{n = 1} m^{(n)}_{ij} r^n < \infty$ for all $i, j$; Case II, $\sum^\infty_{n = 1} m^{(n)}_{ij} r^n = \infty$ for all $i, j$. In Case II there are two strictly positive functions $u$ and $v$ such that \begin{equation*}\tag{4}rvM(i) = v(i)\quad\text{for all}\quad i;\end{equation*} \begin{equation*}\tag{5}rMu(i) = u(i)\quad\text{for all}\quad i.\end{equation*} They are the unique (up to constant multiples) non-negative functions satisfying (4) and (5), respectively. The sum $\sum_i u(i)v(i)$ may be finite or infinite. In Case I, and in Case II with $\sum_i u(i)v(i) = \infty$, we have $\lim_{n \rightarrow \infty} m^{(n)}_{ij} r^n = 0$. This being the case, if the process is initiated by finitely many particles ($\mathrm{Pr}\{Z_0 = z\} = 1$ for some $z \varepsilon T$), then $E\{r^nZ_n(j)\} = r^n zM^n(j) = r^n \sum_i z(i)m^{(n)}_{ij} \rightarrow 0$ as $n \rightarrow \infty$. Hence all sequences $\{r^nZ_n(j)\}$ converge in mean to 0. One might say that the conclusion of the theorem of Everett, Ulam and Harris remains valid with $W = 0$ and mean convergence in place of convergence with probability 1. The most interesting case, which we shall analyze in detail, is Case II with $\sum_i u(i)v(i) < \infty$. In this case the infinite matrix behaves strikingly like a finite matrix. We are able to use an approach similar to that of Harris, which may be described as the mean square approach. A condition is imposed on the distributions $p_i$ so that all $Z_n(i)$ have finite second moments. We remark that other varieties of conditions may serve the same purpose, but the one we have chosen has the appeal of being simple in appearance. The main result is Theorem 1, in which we prove the mean square convergence of $\{r^nZ_n(j)\}$.
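In the finite-type setting, the quantities $\rho$, $u$, $v$ of the Everett–Ulam–Harris theorem can be recovered numerically by power iteration (applying $M$ on the right for $u$ and on the left for $v$), with $r = \rho^{-1}$ as the common radius of convergence. A minimal sketch for an arbitrary illustrative $2 \times 2$ matrix, not taken from the paper:

```python
# Hypothetical positive matrix standing in for a finite-type (m_ij).
M = [[0.5, 0.3],
     [0.2, 0.6]]

def power_iteration(M, left=False, iters=200):
    """Return (rho, x): the Perron root and the right eigenvector
    (Mx = rho x) or, with left=True, the left eigenvector (xM = rho x),
    normalized so its entries sum to 1."""
    k = len(M)

    def apply(x):
        if left:
            return [sum(x[i] * M[i][j] for i in range(k)) for j in range(k)]
        return [sum(M[i][j] * x[j] for j in range(k)) for i in range(k)]

    x = [1.0] * k
    for _ in range(iters):
        y = apply(x)
        s = sum(y)
        x = [c / s for c in y]
    # Estimate rho from one more application of M.
    y = apply(x)
    rho = sum(y) / sum(x)
    return rho, x

rho, u = power_iteration(M)              # rMu = u with r = 1/rho, as in (5)
_, v = power_iteration(M, left=True)     # rvM = v, as in (4)
r = 1.0 / rho
```

For this matrix the eigenvalues are $0.8$ and $0.3$, so the iteration converges to $\rho = 0.8$ and $r = 1.25$; scaling by $r^n = \rho^{-n}$ is exactly the normalization in the theorem.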

## Citation

Shu-Teh C. Moy. "Extensions of a Limit Theorem of Everett, Ulam and Harris on Multitype Branching Processes to a Branching Process with Countably Many Types." Ann. Math. Statist. 38(4): 992–999, August 1967. https://doi.org/10.1214/aoms/1177698767