A remarkable characterization of the normal law is that if $x$ and $y$ are two independent chance variables such that two linear functions, $ax + by$ $(ab \neq 0)$ and $cx + dy$ $(cd \neq 0)$, are distributed independently of each other, then both $x$ and $y$ are normally distributed. This theorem has been proved without any assumption about the existence of moments by Darmois, extending earlier results of Gnedenko and Kac. The question that naturally arises is how far the condition of stochastic independence is necessary, or, in other words, whether the above theorem can be generalised after relaxing the condition of stochastic independence of the linear functions of two independent chance variables. It is evident that for non-normal mutually independent chance variables any two such linear functions fail to be independent in the probability sense. In the present paper we shall investigate the nature of the distribution law that may be obtained by imposing the milder restriction of linearity of the regression of one linear function on the other, which is, of course, weaker than the assumption of stochastic independence. We shall prove a general theorem from which a number of results will follow as special cases. It should be noted, however, that statements regarding regression or conditional expectation require the assumption that the conditional distribution function exists; in what follows, this assumption will be made tacitly wherever needed.
R. G. Laha, "On a Characterization of the Stable Law with Finite Expectation," Ann. Math. Statist. 27 (1), 187–195, March 1956. https://doi.org/10.1214/aoms/1177728357