Cramér theorem for Gamma random variables

In this paper we discuss the following problem: given a random variable Z = X + Y with a Gamma law such that X and Y are independent, must X and Y each follow a Gamma law? This is related to Cramér's theorem, which states that if X and Y are independent, then Z = X + Y follows a Gaussian law if and only if X and Y follow a Gaussian law. We prove that the analogue of Cramér's theorem holds for the Gamma distribution when the random variables live in a Wiener chaos of fixed order, but that the result fails in general. We also give an asymptotic variant of our result.


Introduction
Cramér's theorem (see [1]) says that the sum of two independent random variables is Gaussian if and only if each summand is Gaussian. One direction is elementary to prove: given two independent random variables with Gaussian distribution, their sum follows a Gaussian distribution. The converse is less trivial and its proof requires powerful results from complex analysis (see [1]). In this paper, we treat the same problem for Gamma distributed random variables. A Gamma random variable, usually denoted by $\Gamma(a,\lambda)$, is a random variable with probability density function
$$f_{a,\lambda}(x) = \frac{\lambda^{a}}{\Gamma(a)}\, x^{a-1} e^{-\lambda x} \quad \text{if } x > 0,$$
and $f_{a,\lambda}(x) = 0$ otherwise. The parameters $a$ and $\lambda$ are strictly positive and $\Gamma$ denotes the usual Gamma function. It is well known that if $X \sim \Gamma(a,\lambda)$ and $Y \sim \Gamma(b,\lambda)$ and $X$ is independent of $Y$, then $X + Y$ follows the law $\Gamma(a+b,\lambda)$. The purpose of this paper is to understand the converse implication, i.e. whether or not (or under what conditions), if $X$ and $Y$ are two independent random variables such that $X + Y \sim \Gamma(a+b,\lambda)$, $E(X) = E(\Gamma(a,\lambda))$, $E(X^{2}) = E(\Gamma(a,\lambda)^{2})$, $E(Y) = E(\Gamma(b,\lambda))$ and $E(Y^{2}) = E(\Gamma(b,\lambda)^{2})$, it holds that $X \sim \Gamma(a,\lambda)$ and $Y \sim \Gamma(b,\lambda)$. We will actually focus our attention on the so-called centered Gamma distribution $F(\nu)$. We will call 'centered Gamma' the random variables of the form
$$F(\nu) = 2G(\nu/2) - \nu, \qquad \nu > 0,$$
where $G(\nu/2) := \Gamma(\nu/2, 1)$ has a Gamma law with parameters $\nu/2$ and $1$. This means that $G(\nu/2)$ is an (a.s. strictly positive) random variable with density $g(x) = \frac{x^{\nu/2-1} e^{-x}}{\Gamma(\nu/2)} \mathbf{1}_{(0,\infty)}(x)$. The characteristic function of the law $F(\nu)$ is given by
$$E\left(e^{itF(\nu)}\right) = \left(\frac{e^{-it}}{\sqrt{1-2it}}\right)^{\nu}, \qquad t \in \mathbb{R}. \tag{1}$$
We will find the following answer: if $X$ and $Y$ are two independent random variables, each living in a Wiener chaos of fixed order (and these orders are allowed to be different), then the fact that the sum $X + Y$ follows a centered Gamma distribution implies that $X$ and $Y$ each follow a Gamma distribution.
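The elementary direction recalled above (the convolution property of the Gamma law) can be checked numerically. The following is a minimal Monte Carlo sketch, not part of the argument; it uses only the Python standard library, and the parameters, seed, sample size and tolerances are ad hoc choices:

```python
import random

random.seed(42)

a, b, lam = 1.5, 2.5, 3.0   # arbitrary shape/rate parameters for the illustration
n = 200_000

# Gamma(a, lam) with rate lam has scale 1/lam; random.gammavariate takes (shape, scale).
z = [random.gammavariate(a, 1 / lam) + random.gammavariate(b, 1 / lam) for _ in range(n)]

mean = sum(z) / n
var = sum((x - mean) ** 2 for x in z) / n

# Gamma(a+b, lam) has mean (a+b)/lam and variance (a+b)/lam**2.
print(mean, (a + b) / lam)        # both close to 1.333...
print(var, (a + b) / lam ** 2)    # both close to 0.444...
```

The sketch only compares the first two moments; matching the full law is exactly the content of the convolution property stated above.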
On the other hand, for random variables having an infinite Wiener-Itô chaos decomposition, the result is not true even in very particular cases (for so-called strongly independent random variables). We construct a counter-example to illustrate this fact. Our tools are based on a criterion given in [6] to characterize the random variables with Gamma distribution in terms of Malliavin calculus. Our paper is structured as follows. Section 2 contains some notations and preliminaries. In Section 3 we prove the Cramér theorem for Gamma distributed random variables in Wiener chaos of finite orders and we also give an asymptotic version of this result. In Section 4 we show that the result does not hold in the general case.

Some notations and definitions
Let $(W_t)_{t \in T}$ be a classical Wiener process on a standard Wiener space $(\Omega, \mathcal{F}, P)$, where $T = [0,1]$. If $f \in L^{2}(T^{n})$ with $n \geq 1$ an integer, we introduce the multiple Wiener-Itô integral of $f$ with respect to $W$. The basic references are the monographs [3] or [4]. Let $f \in \mathcal{E}_n$ be an elementary function with $n$ variables that can be written as
$$f = \sum_{i_1, \dots, i_n} c_{i_1, \dots, i_n} \mathbf{1}_{A_{i_1} \times \dots \times A_{i_n}},$$
where the coefficients satisfy $c_{i_1,\dots,i_n} = 0$ if two indices $i_k$ and $i_l$ are equal, and the sets $A_i \in \mathcal{B}(T)$ are pairwise disjoint. For such a step function $f$ we define
$$I_n(f) = \sum_{i_1,\dots,i_n} c_{i_1,\dots,i_n} W(A_{i_1}) \cdots W(A_{i_n}),$$
where we put $W(A) = \int_0^1 \mathbf{1}_A(s)\, dW_s$. It can be seen that the application $I_n$ constructed above from $\mathcal{E}_n$ to $L^2(\Omega)$ is an isometry on $\mathcal{E}_n$, in the sense that
$$E\left(I_n(f) I_m(g)\right) = n!\, \langle \tilde{f}, \tilde{g} \rangle_{L^2(T^n)} \quad \text{if } m = n$$
and
$$E\left(I_n(f) I_m(g)\right) = 0 \quad \text{if } m \neq n.$$
Since the set $\mathcal{E}_n$ is dense in $L^2(T^n)$ for every $n \geq 1$, the mapping $I_n$ can be extended to an isometry from $L^2(T^n)$ to $L^2(\Omega)$, and the above properties hold true for this extension.
It also holds that $I_n(f) = I_n(\tilde{f})$, where $\tilde{f}$ denotes the symmetrization of $f$, defined by
$$\tilde{f}(t_1, \dots, t_n) = \frac{1}{n!} \sum_{\sigma} f\left(t_{\sigma(1)}, \dots, t_{\sigma(n)}\right),$$
$\sigma$ running over all permutations of $\{1, \dots, n\}$. We will need the general formula for calculating products of Wiener chaos integrals of any orders $m, n$ for any symmetric integrands $f \in L^2(T^m)$ and $g \in L^2(T^n)$, which is
$$I_m(f) I_n(g) = \sum_{r=0}^{m \wedge n} r! \binom{m}{r} \binom{n}{r} I_{m+n-2r}\left(f \otimes_r g\right),$$
where the contraction $f \otimes_r g$ is defined by
$$(f \otimes_r g)(s_1, \dots, s_{m-r}, t_1, \dots, t_{n-r}) = \int_{T^r} f(s_1, \dots, s_{m-r}, u_1, \dots, u_r)\, g(t_1, \dots, t_{n-r}, u_1, \dots, u_r)\, du_1 \dots du_r.$$
Note that the contraction $(f \otimes_r g)$ is an element of $L^2(T^{m+n-2r})$ but it is not necessarily symmetric. We will denote its symmetrization by $(f \tilde{\otimes}_r g)$. We recall that any square integrable random variable $F$ which is measurable with respect to the $\sigma$-algebra generated by $W$ can be expanded into an orthogonal sum of multiple stochastic integrals
$$F = \sum_{n \geq 0} I_n(f_n), \tag{5}$$
where $f_n \in L^2(T^n)$ are (uniquely determined) symmetric functions and $I_0(f_0) = E(F)$. We denote by $D$ the Malliavin derivative operator that acts on smooth functionals of the form $F = g(W(\varphi_1), \dots, W(\varphi_n))$ (here $g$ is a smooth function with compact support and $\varphi_i \in L^2(T)$ for $i = 1, \dots, n$) as
$$D_t F = \sum_{i=1}^{n} \frac{\partial g}{\partial x_i}\left(W(\varphi_1), \dots, W(\varphi_n)\right) \varphi_i(t).$$
We can define the $i$-th Malliavin derivative $D^{(i)}$ iteratively. The operator $D^{(i)}$ can be extended to the closure $\mathbb{D}^{p,2}$ of smooth functionals with respect to the norm
$$\|F\|_{p,2}^2 = E\left(F^2\right) + \sum_{i=1}^{p} E\left(\|D^{(i)} F\|_{L^2(T^i)}^2\right).$$
The adjoint of $D$ is denoted by $\delta$ and is called the divergence (or Skorohod) integral. Its domain $\mathrm{Dom}(\delta)$ coincides with the class of stochastic processes $u \in L^2(\Omega \times T)$ such that $|E(\langle DF, u \rangle)| \leq c \|F\|_2$ for all $F \in \mathbb{D}^{1,2}$, and $\delta(u)$ is the element of $L^2(\Omega)$ characterized by the duality relationship
$$E(F \delta(u)) = E\left(\langle DF, u \rangle_{L^2(T)}\right).$$
For adapted integrands, the divergence integral coincides with the classical Itô integral. Let $L$ be the Ornstein-Uhlenbeck operator defined on $\mathrm{Dom}(L) = \mathbb{D}^{2,2}$. We have
$$LF = -\sum_{n \geq 0} n I_n(f_n)$$
if $F$ is given by (5). There exists a connection between $\delta$, $D$ and $L$: a random variable $F$ belongs to the domain of $L$ if and only if $F \in \mathbb{D}^{1,2}$ and $DF \in \mathrm{Dom}(\delta)$, and then $\delta DF = -LF$. Let us consider a multiple stochastic integral $I_q(f)$ with symmetric kernel $f \in L^2(T^q)$. We denote the Malliavin derivative of $I_q(f)$ by $DI_q(f)$.
We have
$$D_\theta I_q(f) = q\, I_{q-1}\left(f^{(\theta)}\right),$$
where $f^{(\theta)} = f(t_1, \dots, t_{q-1}, \theta)$ is the $(q-1)$th order kernel obtained by parametrizing the $q$th order kernel $f$ by one of the variables. For any random variables $X, Y \in \mathbb{D}^{1,2}$ we use the notations
$$G_X := \langle DX, -DL^{-1}X \rangle_{L^2(T)}$$
and
$$G_Y := \langle DY, -DL^{-1}Y \rangle_{L^2(T)}.$$
Note that for $X = I_q(f)$ one has $-L^{-1}X = q^{-1}X$, so that $G_X = q^{-1} \|DX\|^2_{L^2(T)}$. Finally, we will use the notation $X \perp Y$ to denote that the two random variables $X$ and $Y$ are independent.
The following facts are key points in our proofs:

Fact 1. Let $X = I_m(f)$ and $Y = I_n(g)$, where $f \in L^2(T^m)$ and $g \in L^2(T^n)$ are symmetric functions. Then $X$ and $Y$ are independent if and only if (see [8])
$$f \otimes_1 g = 0 \quad \text{a.e. on } T^{m+n-2}.$$

Fact 2. Let $X = I_q(f)$ with $q \geq 2$ even and $E(X^2) = 2\nu$. Then $X$ follows a centered Gamma law $F(\nu)$ with $\nu > 0$ if and only if (see [5])
$$E\left(\left(2\nu + 2X - G_X\right)^2\right) = 0.$$

Fact 3. Let $f_k \in L^2(T^q)$, $k \geq 1$, be symmetric functions with $q \geq 2$ even and $E\left(I_q(f_k)^2\right) \to_{k \to \infty} 2\nu$. Then the sequence $X_k = I_q(f_k)$ converges in distribution, as $k \to \infty$, to the Gamma law $F(\nu)$ if and only if (see [5])
$$E\left(\left(2\nu + 2X_k - G_{X_k}\right)^2\right) \to_{k \to \infty} 0.$$

Remark: In this paper we restrict ourselves, for the sake of simplicity, to an underlying Hilbert space of the form $H = L^2(T)$ (the one associated with the Wiener process of the upcoming sections). However, all the results presented in the upcoming sections remain valid for a more general separable Hilbert space as the underlying space.
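For later reference, the first moments of the centered Gamma law follow from $F(\nu) = 2G(\nu/2) - \nu$: $E(F(\nu)) = 0$, $E(F(\nu)^2) = 2\nu$, $E(F(\nu)^3) = 8\nu$ and $E(F(\nu)^4) = 12\nu^2 + 48\nu$. These values can be sanity-checked numerically; the sketch below is our own addition (Python standard library only; $\nu = 2$, seed, sample size and tolerances are ad hoc):

```python
import random

random.seed(0)

nu = 2.0
n = 500_000

# F(nu) = 2*G(nu/2) - nu, where G(nu/2) ~ Gamma(nu/2, 1).
samples = [2 * random.gammavariate(nu / 2, 1.0) - nu for _ in range(n)]

def moment(k):
    """Empirical k-th moment of the sample."""
    return sum(x ** k for x in samples) / n

# Exact values: 0, 2*nu = 4, 8*nu = 16, 12*nu**2 + 48*nu = 144.
print(moment(1))  # ~ 0
print(moment(2))  # ~ 4
print(moment(3))  # ~ 16
print(moment(4))  # ~ 144
```

These moment values reappear in the proof of Theorem 1 below.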

(Asymptotic) Cramér theorem for multiple integrals
In this section, we will prove Cramér's theorem for random variables living in fixed Wiener chaoses. More precisely, our context is as follows: we assume that $X = I_{q_1}(f)$ and $Y = I_{q_2}(h)$ and that $X$, $Y$ are independent. We also assume that $E(X^2) = E(F(\nu_1)^2) = 2\nu_1$ and $E(Y^2) = E(F(\nu_2)^2) = 2\nu_2$. Here $\nu, \nu_1, \nu_2$ denote three strictly positive numbers such that $\nu_1 + \nu_2 = \nu$. We assume that $X + Y$ follows a Gamma law $F(\nu)$ and we will prove that $X \sim F(\nu_1)$ and $Y \sim F(\nu_2)$. Let us first give the following two auxiliary lemmas that will be useful throughout the paper.

Lemma 1. Let $q_1, q_2 \geq 1$ be integers, and let $X = I_{q_1}(f)$ and $Y = I_{q_2}(h)$, where $f \in L^2(T^{q_1})$ and $h \in L^2(T^{q_2})$ are symmetric functions. Assume moreover that $X$ and $Y$ are independent. Then we have $DX \perp DY$, $X \perp DY$ and $Y \perp DX$.
Proof: Since $X$ and $Y$ are independent, the criterion of [8] (Fact 1 in Section 2) gives
$$f \otimes_1 h = 0 \quad \text{a.e. on } T^{q_1+q_2-2}. \tag{6}$$
Let us now prove that $DX \perp DY$. Since for every $\theta, \psi \in T$, $D_\theta X = q_1 I_{q_1-1}(f^{(\theta)})$ and $D_\psi Y = q_2 I_{q_2-1}(h^{(\psi)})$, it suffices to show that the random variables $I_{q_1-1}(f^{(\theta)})$ and $I_{q_2-1}(h^{(\psi)})$ are independent. To do this, we will use the criterion for the independence of multiple integrals given in [8].
We need to check that $f^{(\theta)} \otimes_1 h^{(\psi)} = 0$ a.e. on $T^{q_1+q_2-4}$, and this follows from the above. It remains to prove that $X \perp DY$ and $DX \perp Y$. Given the symmetric roles played by $X$ and $Y$, we will only prove that $X \perp DY$. That is equivalent to the independence of the random variables $I_{q_1}(f)$ and $I_{q_2-1}(h^{(\psi)})$ for every $\psi \in T$, which follows from [8] (see Fact 1 in Section 2) and (6). Thus, we have $X \perp DY$ and $DX \perp Y$.
Let us recall the following definition (see [7]).

Definition 1.
Two random variables $X = \sum_{n \geq 0} I_n(f_n)$ and $Y = \sum_{m \geq 0} I_m(h_m)$ are called strongly independent if, for every $m, n \geq 0$, the random variables $I_n(f_n)$ and $I_m(h_m)$ are independent.
We have the following lemma about strongly independent random variables.
Lemma 2. Let $X = \sum_{n \geq 0} I_n(f_n)$ and $Y = \sum_{m \geq 0} I_m(h_m)$ be strongly independent random variables in $\mathbb{D}^{1,2}$. Then $\langle DX, DY \rangle_{L^2(T)} = 0$ almost surely.

Proof: We have, for every $\theta \in T$,
$$D_\theta X = \sum_{n \geq 1} n I_{n-1}\left(f_n^{(\theta)}\right) \quad \text{and} \quad D_\theta Y = \sum_{m \geq 1} m I_{m-1}\left(h_m^{(\theta)}\right).$$
Therefore, using the product formula, we can write
$$\langle DX, DY \rangle_{L^2(T)} = \sum_{n,m \geq 1} nm \int_T \sum_{r=0}^{(n-1)\wedge(m-1)} r! \binom{n-1}{r} \binom{m-1}{r} I_{n+m-2-2r}\left(f_n^{(\theta)} \otimes_r h_m^{(\theta)}\right) d\theta.$$
The strong independence of $X$ and $Y$ gives us that $f_n^{(\theta)} \otimes_r h_m^{(\theta)} = 0$ for every $1 \leq r \leq (n-1)\wedge(m-1)$. Thus, we obtain
$$\langle DX, DY \rangle_{L^2(T)} = \sum_{n,m \geq 1} nm \int_T I_{n+m-2}\left(f_n^{(\theta)} \otimes h_m^{(\theta)}\right) d\theta.$$
Using a Fubini type result, we can write
$$\langle DX, DY \rangle_{L^2(T)} = \sum_{n,m \geq 1} nm\, I_{n+m-2}\left(f_n \otimes_1 h_m\right).$$
Again, the strong independence of $X$ and $Y$ gives us that $f_n \otimes_1 h_m = 0$ a.e., and we finally obtain $\langle DX, DY \rangle_{L^2(T)} = 0$.

Let us first remark that the Cramér theorem holds for random variables in the same Wiener chaos of fixed order.

Proposition 1. Let $X = I_q(f)$ and $Y = I_q(h)$, where $f, h \in L^2(T^q)$ are symmetric and $q \geq 2$ is even. Fix $\nu_1, \nu_2, \nu > 0$ such that $\nu_1 + \nu_2 = \nu$. Assume that $X + Y$ follows the law $F(\nu)$ and that $X$ is independent of $Y$. Also suppose that $E(X^2) = E(F(\nu_1)^2) = 2\nu_1$ and $E(Y^2) = E(F(\nu_2)^2) = 2\nu_2$. Then $X \sim F(\nu_1)$ and $Y \sim F(\nu_2)$.
Proof: By a result in [5] (see Fact 2 in Section 2), the fact that $X + Y$ follows the law $F(\nu)$ is equivalent to
$$E\left(\left(2\nu + 2(X+Y) - G_{X+Y}\right)^2\right) = 0. \tag{7}$$
On the other hand,
$$G_{X+Y} = \frac{1}{q} \langle D(X+Y), D(X+Y) \rangle_{L^2(T)} = G_X + G_Y + \frac{2}{q} \langle DX, DY \rangle_{L^2(T)} = G_X + G_Y. \tag{8}$$
Above we used the fact that $\langle DI_q(f), DI_q(h) \rangle_{L^2(T)} = 0$, as a consequence of Lemma 1. It is also easy to remark, from Lemma 1, that the random variables $2\nu_1 + 2X - G_X$ and $2\nu_2 + 2Y - G_Y$ are independent and centered. Using this and by combining (7) and (8), we obtain that
$$0 = E\left(\left(\left(2\nu_1 + 2X - G_X\right) + \left(2\nu_2 + 2Y - G_Y\right)\right)^2\right) = E\left(\left(2\nu_1 + 2X - G_X\right)^2\right) + E\left(\left(2\nu_2 + 2Y - G_Y\right)^2\right).$$
The left hand side of this last equation is equal to zero and the right hand side is the sum of two non-negative quantities (expectations of squares). This implies that each of the summands is equal to zero. Thus,
$$E\left(\left(2\nu_1 + 2X - G_X\right)^2\right) = E\left(\left(2\nu_2 + 2Y - G_Y\right)^2\right) = 0,$$
and consequently, by Fact 2, $X \sim F(\nu_1)$ and $Y \sim F(\nu_2)$.

Remark 1. We mention that the above Proposition 1 is a particular case of Theorem 3. We prefer to state and prove it separately because its proof is simpler and does not require the techniques used in the proof of Theorem 3. Using Fact 3 in Section 2, an asymptotic variant of the above result can be stated; we do not state it here, since it is a particular case of Theorem 4 proved later in our paper.
Theorem 1.2 in [5] gives a characterization of (asymptotically) centered Gamma random variables which are given by a multiple Wiener-Itô integral. There is no such characterization for random variables living in a finite or infinite sum of Wiener chaoses; only an upper bound for the distance between the law of a random variable in $\mathbb{D}^{1,2}$ and the Gamma distribution has been proven in [6], Theorem 3.11. It turns out that, for the case of a sum of independent multiple integrals, it is possible to characterize the relation between its distribution and the Gamma distribution. We will prove this fact in the following theorem.

Theorem 1. Fix $\nu_1, \nu_2, \nu > 0$ such that $\nu_1 + \nu_2 = \nu$ and let $F(\nu)$ be a real-valued random variable with characteristic function given by (1). Fix two even integers $q_1 \geq 2$ and $q_2 \geq 2$. For any symmetric kernels $f \in L^2(T^{q_1})$ and $h \in L^2(T^{q_2})$ such that
$$E\left(I_{q_1}(f)^2\right) = q_1! \|f\|^2_{L^2(T^{q_1})} = 2\nu_1 \quad \text{and} \quad E\left(I_{q_2}(h)^2\right) = q_2! \|h\|^2_{L^2(T^{q_2})} = 2\nu_2, \tag{9}$$
and such that $X = I_{q_1}(f)$ and $Y = I_{q_2}(h)$ are independent, define the random variable $Z := X + Y$. Under those conditions, the following two conditions are equivalent:

(i) $E\left(\left(2\nu + 2Z - \langle DZ, -DL^{-1}Z \rangle_{L^2(T)}\right)^2\right) = 0$, where $D$ is the Malliavin derivative operator and $L$ is the infinitesimal generator of the Ornstein-Uhlenbeck semigroup;
(ii) $Z \overset{\text{Law}}{=} F(\nu)$.

Proof: Proof of (ii) $\to$ (i). Suppose that $Z \sim F(\nu)$. We easily obtain that
$$E(Z^3) = 8\nu \quad \text{and} \quad E(Z^4) = 12\nu^2 + 48\nu.$$
Consequently,
$$E(Z^4) - 12E(Z^3) = 12\nu^2 - 48\nu.$$
Then we will use the fact that, for every multiple integral $I_q(f)$ with $q$ even,
$$E\left(I_q(f)^3\right) = \frac{(q!)^3}{((q/2)!)^3} \langle f \otimes_{q/2} f, f \rangle_{L^2(T^q)} \tag{12}$$
and
$$E\left(I_q(f)^4\right) = 3(q!)^2 \|f\|^4_{L^2(T^q)} + \sum_{p=1}^{q-1} \frac{(q!)^4}{(p!)^2((q-p)!)^2} \left[\|f \otimes_p f\|^2_{L^2(T^{2q-2p})} + \binom{2q-2p}{q-p} \|f \tilde{\otimes}_p f\|^2_{L^2(T^{2q-2p})}\right]. \tag{13}$$
We will now compute $E(Z^3)$, $E(Z^4)$ and $E(Z^4) - 12E(Z^3)$ by using the above two relations (12) and (13). We have $Z^3 = X^3 + 3X^2Y + 3XY^2 + Y^3$ with $X = I_{q_1}(f)$ and $Y = I_{q_2}(h)$, and thus, by using the independence between $I_{q_1}(f)$ and $I_{q_2}(h)$ and the fact that these random variables are centered,
$$E(Z^3) = E(X^3) + E(Y^3).$$
Using relation (12), we can write
$$E(Z^3) = \frac{(q_1!)^3}{((q_1/2)!)^3} \langle f \otimes_{q_1/2} f, f \rangle + \frac{(q_2!)^3}{((q_2/2)!)^3} \langle h \otimes_{q_2/2} h, h \rangle.$$
For $E(Z^4)$, we combine relations (9) and (13) with the independence between $I_{q_1}(f)$ and $I_{q_2}(h)$ to obtain
$$E(Z^4) = E(X^4) + E(Y^4) + 6E(X^2)E(Y^2).$$
Using the fact that $q_1! \|f\|^2_{L^2(T^{q_1})} = 2\nu_1$ and $q_2! \|h\|^2_{L^2(T^{q_2})} = 2\nu_2$, we can write
$$E(Z^4) = E(X^4) + E(Y^4) + 24\nu_1\nu_2,$$
so that
$$E(Z^4) - 12E(Z^3) = \left(E(X^4) - 12E(X^3)\right) + \left(E(Y^4) - 12E(Y^3)\right) + 24\nu_1\nu_2.$$
Recall that $\nu = \nu_1 + \nu_2$ and note that
$$12\nu_1^2 + 12\nu_2^2 - 48\nu_1 - 48\nu_2 + 24\nu_1\nu_2 = 12\nu^2 - 48\nu.$$
Also note that, by the computations in [5], the quantity $E(X^4) - 12E(X^3) - (12\nu_1^2 - 48\nu_1)$ can be expressed, via (12) and (13), as a sum of non-negative terms involving the contractions of the kernel $f$, and a similar relation holds for the kernel $h$.
From (ii), it follows that
$$\left(E(X^4) - 12E(X^3) - 12\nu_1^2 + 48\nu_1\right) + \left(E(Y^4) - 12E(Y^3) - 12\nu_2^2 + 48\nu_2\right) = 0,$$
which leads to the conclusion, as all the summands are positive, that
$$E(X^4) - 12E(X^3) = 12\nu_1^2 - 48\nu_1 \quad \text{and} \quad E(Y^4) - 12E(Y^3) = 12\nu_2^2 - 48\nu_2.$$
This implies
$$f \otimes_p f = 0 \text{ for every } p = 1, \dots, q_1 - 1 \text{ with } p \neq q_1/2, \qquad f \tilde{\otimes}_{q_1/2} f = c_{q_1} f, \tag{17}$$
and similarly $h \otimes_r h = 0$ for every $r = 1, \dots, q_2 - 1$ with $r \neq q_2/2$ and $h \tilde{\otimes}_{q_2/2} h = c_{q_2} h$ (see [5], Theorem 1.2). We will compute $E\left(\left(2\nu + 2Z - G_Z\right)^2\right)$. Let us start with $G_Z$. We use Lemma 1 to write
$$G_Z = \frac{1}{q_1} \|DX\|^2_{L^2(T)} + \frac{1}{q_2} \|DY\|^2_{L^2(T)} + \left(\frac{1}{q_1} + \frac{1}{q_2}\right) \langle DX, DY \rangle_{L^2(T)} = G_X + G_Y.$$
Thus,
$$E\left(\left(2\nu + 2Z - G_Z\right)^2\right) = E\left(\left(2\nu_1 + 2X - G_X\right)^2\right) + E\left(\left(2\nu_2 + 2Y - G_Y\right)^2\right).$$
Relation (17) and the calculations contained in [5] imply that the above two summands vanish. It finally follows from this that
$$E\left(\left(2\nu + 2Z - G_Z\right)^2\right) = 0.$$
Proof of (i) $\to$ (ii). Suppose that (i) holds. We have proven that
$$E\left(\left(2\nu + 2Z - G_Z\right)^2\right) = E\left(\left(2\nu_1 + 2X - G_X\right)^2\right) + E\left(\left(2\nu_2 + 2Y - G_Y\right)^2\right),$$
so both summands vanish. From Theorem 1.2 in [5] it follows that $I_{q_1}(f) \sim F(\nu_1)$ and $I_{q_2}(h) \sim F(\nu_2)$. $I_{q_1}(f)$ and $I_{q_2}(h)$ being independent, we use the convolution property of Gamma random variables to conclude that $Z \sim F(\nu)$.
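The moment identities for the target law used in the proof above can be double-checked deterministically from the central moments of $G \sim \Gamma(k,1)$, namely $E((G-k)^3) = 2k$ and $E((G-k)^4) = 3k^2 + 6k$. The following sketch is our own addition (the test values of $\nu$ are arbitrary dyadic numbers so that the floating-point comparisons are exact):

```python
# For F = F(nu) = 2*G - nu with G ~ Gamma(k, 1), k = nu/2:
# E[F^3] = 2**3 * E[(G-k)^3] and E[F^4] = 2**4 * E[(G-k)^4].
def third_moment(nu):
    k = nu / 2
    return 8 * (2 * k)                # = 8*nu

def fourth_moment(nu):
    k = nu / 2
    return 16 * (3 * k ** 2 + 6 * k)  # = 12*nu**2 + 48*nu

for nu in (0.5, 1.0, 2.0, 3.5, 10.0):
    assert third_moment(nu) == 8 * nu
    assert fourth_moment(nu) == 12 * nu ** 2 + 48 * nu
    # The combination appearing in the proof of Theorem 1:
    assert fourth_moment(nu) - 12 * third_moment(nu) == 12 * nu ** 2 - 48 * nu

# The algebraic identity relating nu_1, nu_2 and nu = nu_1 + nu_2:
for v1, v2 in ((0.5, 1.5), (1.0, 1.0), (2.0, 1.5)):
    v = v1 + v2
    assert 12 * v1 ** 2 + 12 * v2 ** 2 - 48 * v1 - 48 * v2 + 24 * v1 * v2 == 12 * v ** 2 - 48 * v

print("moment identities verified")
```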
Following exactly the lines of the proof of Theorem 1, it is possible to characterize random variables given by a sum of independent multiple integrals that converge in law to a Gamma distribution.
Theorem 2. Fix $\nu_1, \nu_2, \nu > 0$ such that $\nu_1 + \nu_2 = \nu$ and let $F(\nu)$ be a real-valued random variable with characteristic function given by (1). Fix two even integers $q_1 \geq 2$ and $q_2 \geq 2$. For any sequences $(f_k)_{k \geq 1} \subset L^2(T^{q_1})$ and $(h_k)_{k \geq 1} \subset L^2(T^{q_2})$ ($f_k$ and $h_k$ symmetric for every $k \geq 1$) such that
$$q_1! \|f_k\|^2_{L^2(T^{q_1})} \to_{k \to \infty} 2\nu_1 \quad \text{and} \quad q_2! \|h_k\|^2_{L^2(T^{q_2})} \to_{k \to \infty} 2\nu_2,$$
and such that $X_k = I_{q_1}(f_k)$ and $Y_k = I_{q_2}(h_k)$ are independent for any $k \geq 1$, define the random variable $Z_k := X_k + Y_k$. Under those conditions, the following two conditions are equivalent:
(i) $E\left(\left(2\nu + 2Z_k - G_{Z_k}\right)^2\right) \to_{k \to \infty} 0$;
(ii) $Z_k$ converges in distribution to $F(\nu)$ as $k \to \infty$.

Cramér's theorem for Gamma random variables in the setting of multiple stochastic integrals is a corollary of Theorem 1. We have the following:

Theorem 3. Let $X = I_{q_1}(f)$ and $Y = I_{q_2}(h)$, with $f \in L^2(T^{q_1})$ and $h \in L^2(T^{q_2})$ symmetric and $q_1, q_2 \geq 2$ even, be such that $X$, $Y$ are independent, $E(X^2) = 2\nu_1$ and $E(Y^2) = 2\nu_2$. If $X + Y \sim F(\nu)$ with $\nu = \nu_1 + \nu_2$, then $X \sim F(\nu_1)$ and $Y \sim F(\nu_2)$.

Proof: Theorem 1 states that $Z \sim F(\nu) \Leftrightarrow E\left(\left(2\nu + 2Z - G_Z\right)^2\right) = 0$, and we proved that
$$E\left(\left(2\nu + 2Z - G_Z\right)^2\right) = E\left(\left(2\nu_1 + 2X - G_X\right)^2\right) + E\left(\left(2\nu_2 + 2Y - G_Y\right)^2\right).$$
Both summands being positive, it follows that
$$E\left(\left(2\nu_1 + 2X - G_X\right)^2\right) = E\left(\left(2\nu_2 + 2Y - G_Y\right)^2\right) = 0.$$
Applying Theorem 1 to $X$ and $Y$ separately gives us $E\left(\left(2\nu_1 + 2X - G_X\right)^2\right) = 0 \Leftrightarrow X \sim F(\nu_1)$ and $E\left(\left(2\nu_2 + 2Y - G_Y\right)^2\right) = 0 \Leftrightarrow Y \sim F(\nu_2)$.

It is immediate to give an asymptotic version of Theorem 3.
Theorem 4. Let $X_k = I_{q_1}(f_k)$ and $Y_k = I_{q_2}(h_k)$, with $f_k \in L^2(T^{q_1})$, $h_k \in L^2(T^{q_2})$ symmetric for $k \geq 1$ and $q_1, q_2 \geq 2$ even, be such that $X_k$, $Y_k$ are independent for every $k \geq 1$, $E(X_k^2) \to_{k \to \infty} 2\nu_1$ and $E(Y_k^2) \to_{k \to \infty} 2\nu_2$, with $\nu_1 + \nu_2 = \nu$. If $X_k + Y_k$ converges in distribution to $F(\nu)$, then $X_k$ converges in distribution to $F(\nu_1)$ and $Y_k$ converges in distribution to $F(\nu_2)$.

Remark 2. i) Theorem 4 follows from Theorem 2 in the same way as Theorem 3 follows from Theorem 1, using the asymptotic characterization of [5] (Fact 3 in Section 2). ii) Theorem 3 cannot be applied directly to random variables with law $\Gamma(a,\lambda)$ (as defined in the Introduction), because such random variables are not centered and therefore cannot live in a finite Wiener chaos. But it is not difficult to see that if $X = I_{q_1}(f) + c$ is a random variable which is independent of $Y = I_{q_2}(h) + d$ (and if the first two moments of $X$ and $Y$ are the same as the moments of the corresponding Gamma distributions), and if $X + Y \sim \Gamma(a+b,\lambda)$, then $X$ has the distribution $\Gamma(a,\lambda)$ and $Y$ has the distribution $\Gamma(b,\lambda)$. iii) Several results of this paper (Lemmas 1 and 2) hold for strongly independent random variables. Nevertheless, the key results (Theorems 1 and 2), which allow us to prove Cramér's theorem and its asymptotic variant, are not true for strongly independent random variables: the implication (ii) $\to$ (i) in these results, whose proof is based on the differential equation satisfied by the characteristic function of the Gamma distribution, does not work in that setting.

Counterexample in the general case
We will see in this section that Theorem 3 does not hold for random variables whose chaos decomposition is an infinite sum of multiple stochastic integrals. We construct a counterexample in this sense. What is more interesting is that the random variables defined in the example below are not only independent, but strongly independent (see Definition 1 above).

Example 1.
Let $\varepsilon(\lambda)$ denote the exponential distribution with parameter $\lambda$ and let $b(p)$ denote the Bernoulli distribution with parameter $p$. Let $X = A - 1$ and $Y = 2\xi B - 1$, where $A \sim \varepsilon(1)$, $B \sim \varepsilon(1)$, $\xi \sim b(\frac{1}{2})$ and $A$, $B$ and $\xi$ are mutually independent. This implies that $X$ and $Y$ are independent. We have $E(X) = E(Y) = 0$ as well as $E(X^2) = 1$ and $E(Y^2) = 3$. Consider also $Z = X + Y$. Observe that $X$, $Y$ and $Z$ match every condition of Theorem 3, but $X$ and $Y$ are not multiple stochastic integrals in a fixed Wiener chaos (see the next proposition for more details). We have the following: $Z \sim F(2)$, but $Y$ is not Gamma distributed.
Proof: We know that
$$E\left(e^{itX}\right) = E\left(e^{it(A-1)}\right) = e^{-it} E\left(e^{itA}\right) = \frac{e^{-it}}{1-it}.$$
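The claim of Example 1 can also be illustrated numerically. The following Monte Carlo sketch is our own addition (Python standard library only; seed, sample size and tolerances are ad hoc): the third and fourth moments of $Z$ match those of $F(2)$ ($E(Z^3) = 8 \cdot 2 = 16$, $E(Z^4) = 12 \cdot 4 + 48 \cdot 2 = 144$), while $E(Y^3) = 14$ differs from the value $8 \cdot \frac{3}{2} = 12$ that the centered Gamma law $F(3/2)$, the only one with variance $3$, would have:

```python
import random

random.seed(7)

n = 400_000
zs, ys = [], []
for _ in range(n):
    a = random.expovariate(1.0)   # A ~ Exp(1)
    b = random.expovariate(1.0)   # B ~ Exp(1)
    xi = random.randint(0, 1)     # Bernoulli(1/2)
    x = a - 1
    y = 2 * xi * b - 1
    ys.append(y)
    zs.append(x + y)

m3_z = sum(v ** 3 for v in zs) / n
m4_z = sum(v ** 4 for v in zs) / n
m3_y = sum(v ** 3 for v in ys) / n

# F(2) = 2*Exp(1) - 2 has E[Z^3] = 16 and E[Z^4] = 144.
print(m3_z)  # ~ 16
print(m4_z)  # ~ 144
# F(3/2) would have third moment 12; this Y has E[Y^3] = 14.
print(m3_y)  # ~ 14
```

The moment mismatch for $Y$ only rules out the centered Gamma law with matching variance; the full claim that $Y$ is not Gamma distributed is what the characteristic-function computation in the proof establishes.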