A Connection between Gaussian Processes and Markov Processes

Abstract : The Green function of a transient symmetric Markov process can be interpreted as the covariance of a centered Gaussian process. This relation leads to several fruitful identities in law. Both symmetric Markov processes and their associated Gaussian processes benefit from these connections. It is therefore of interest to characterize the associated Gaussian processes. We present here an answer to that question.


1 - Introduction and main result
Symmetric Markov processes are linked to the world of Gaussian processes through a very powerful remark : each transient symmetric Markov process X has a finite Green function which can be interpreted as the covariance of a centered Gaussian process η. This simple remark leads to several identities involving the law of η and the law of the local time process of X. The first of these identities is the isomorphism theorem of Dynkin [4] (1983). An unconditioned form of this theorem was then established in 1995 [6]. Another identity concerns exclusively Markov processes killed at first hitting times [8]. There are many examples of successful uses of these identities (see the results of Marcus and Rosen [11] [12] or Bass et al. [2]), which are exploited to study properties of the local time process using similar properties of the associated Gaussian process η. But in [5] we see that this connection can also be efficient as a tool to solve questions about the associated Gaussian processes. The question of characterizing these Gaussian processes has been open since the isomorphism theorem was proved. A recent result has provided a first answer to that question ; it relies on the following criterion. In 1984, Griffiths [9] answered the question, first raised by Lévy (1948), of the infinite divisibility of squared centered Gaussian processes. He established a characterization of the p-dimensional centered Gaussian vectors (φ_1, φ_2, ..., φ_p) such that the vector of the squares (φ_1², φ_2², ..., φ_p²) is infinitely divisible. His criterion is difficult to use since it requires the computation of the signs of the cofactors of the covariance matrix. Despite a simpler reformulation by Bapat in 1989 [1], there were no examples in the literature illustrating the use of this criterion.
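For a covariance given as a finite matrix, Bapat's reformulation can be checked mechanically : the squared vector is infinitely divisible if and only if some signature matrix D = diag(±1) makes D G⁻¹ D an M-matrix, i.e. turns every off-diagonal entry of G⁻¹ nonpositive. A minimal numerical sketch of this check (the function names are ours):

```python
from itertools import product

def invert(matrix):
    # Invert a small matrix by Gauss-Jordan elimination with partial pivoting.
    n = len(matrix)
    a = [list(row) + [float(i == j) for j in range(n)] for i, row in enumerate(matrix)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        p = a[col][col]
        a[col] = [v / p for v in a[col]]
        for r in range(n):
            if r != col:
                f = a[r][col]
                a[r] = [v - f * w for v, w in zip(a[r], a[col])]
    return [row[n:] for row in a]

def has_infinitely_divisible_square(G, tol=1e-9):
    # Bapat's reformulation of Griffiths' criterion: the square of a centered
    # Gaussian vector with nonsingular covariance G is infinitely divisible
    # iff some signature matrix D = diag(+-1) makes D G^{-1} D an M-matrix,
    # i.e. makes every off-diagonal entry nonpositive after signing.
    n = len(G)
    inv = invert(G)
    for signs in product((1.0, -1.0), repeat=n):
        if all(signs[i] * signs[j] * inv[i][j] <= tol
               for i in range(n) for j in range(n) if i != j):
            return True
    return False
```

For instance, the covariance (min(x_i, x_j)) of Brownian motion at times 1, 2, 3 passes the check, while the positive definite matrix [[1, 0.4, 0.4], [0.4, 1, −0.4], [0.4, −0.4, 1]] fails it, so the corresponding squared vector is not infinitely divisible.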
In [5] and [7], we have shown, making use of Griffiths' and Bapat's results, that a centered Gaussian process with a continuous covariance G has an infinitely divisible square if and only if

G(x, y) = d(x) d(y) g(x, y),     (1)

where g is the Green function of a transient Markov process and d is a positive function. This first answer shows that the collection of covariance functions that correspond to infinitely divisible squared Gaussian processes is richer than the set of Green functions of Markov processes. We could actually formulate in [7], Theorem 3.6, a characterization of the associated Gaussian processes.
But this characterization is hard to use. The question remains : is there a remarkable property of the Gaussian processes that would select precisely the covariance functions that coincide with Green functions ? The following theorem gives an answer to that question. For simplicity, the Gaussian processes considered here are indexed by R, but the results remain valid if R is replaced by any separable locally compact metric space E.
Theorem 1.1 : Let (η_x, x ∈ R) be a centered Gaussian process with a continuous positive definite covariance G. Assume that there exists a in R such that η_a = 0. Then the process ((η_x + c)², x ∈ R) is infinitely divisible for any constant c, if and only if

G(x, y) = g_{T_a}(x, y), (x, y) ∈ R²,     (2)

where (g_{T_a}(x, y), (x, y) ∈ R²) is the Green function of a recurrent Markov process killed at its first hitting time of a.
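To illustrate in the Brownian case : the covariance of Brownian motion is x ∧ y, which is (up to the normalization of local time) the Green function of Brownian motion killed at its first hitting time of 0. A hedged Monte-Carlo sketch of the discrete analogue, simple random walk on {0, ..., M} killed at the boundary, whose Green function is 2x(M − y)/M for x ≤ y and tends to 2(x ∧ y) as M → ∞ (the function name and the two-sided killing, used only to keep the simulated excursions short, are ours):

```python
import random

def green_absorbed(x, y, M, n_sim=20000, seed=7):
    # Monte-Carlo estimate of the Green function g(x, y) of simple random
    # walk on {0, ..., M} killed when it first hits 0 or M: the expected
    # number of visits to y for the walk started at x.
    rng = random.Random(seed)
    total = 0
    for _ in range(n_sim):
        pos = x
        while 0 < pos < M:
            if pos == y:
                total += 1
            pos += rng.choice((-1, 1))
    return total / n_sim

# Exact value: g(x, y) = 2 * x * (M - y) / M for x <= y; letting M grow
# recovers 2 * min(x, y), the discrete counterpart of the covariance x ∧ y.
```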
Haya Kaspi and Yves Le Jan then mentioned to us that the Green function of a transient symmetric Markov process X with a state space E can be seen as the restriction to E of the Green function of a symmetric Markov process X̂ killed at its first hitting time of some cemetery point δ outside E. We refer to Dellacherie and Meyer's book [3], Chapter XII, p. 62, and to Le Jan [10] for the explicit construction of X̂. In Section 3, we show how X can be chosen in order to fulfil the assumption required by Dellacherie and Meyer's construction. Consequently, we can state the following theorem.
Theorem 1.2 : Let (η_x, x ∈ E) be a centered Gaussian process with a continuous positive definite covariance. Then (η_x, x ∈ E) is an associated Gaussian process if and only if, for any constant c, the process ((η_x + c)², x ∈ E) is infinitely divisible.
In Section 2, we analyze this result and give several corollaries. In particular, note that Griffiths' criterion, as well as (1), has been established for Gaussian processes that are centered. It is then natural to ask if there is any characterization of the non-centered Gaussian processes with an infinitely divisible square. Corollary 2.1 provides a sufficient condition for this property. Besides, we extend to non-centered Gaussian couples the result of Vere-Jones [13] establishing the infinite divisibility of all the squared centered Gaussian couples. The proofs are given in Section 3. The proof of Theorem 1.1 brings out the following fact.

Theorem 1.3 : Let (η_x, x ∈ R) be a centered Gaussian process such that η_a = 0 for some a in R. The two following propositions are equivalent.
(i) The process ((η_x + c)², x ∈ R) is infinitely divisible for any constant c.
(ii) Let N be a standard Gaussian variable independent of η. The process ((η_x + N)², x ∈ R) is infinitely divisible.

2 - Analysis and corollaries
Let β be a real valued Brownian motion starting from 0. For any constant c, the process (β + c)² is infinitely divisible. This property is a consequence of the existence of the family of squared Bessel processes. More generally, let η be a centered Gaussian process such that there exists a fixed a with η_a = 0. In [5], we have shown that if (η + c)² is infinitely divisible for any constant c, then there exists a doubly-indexed family of processes (Y_{d,c}, d ≥ 0, c ≥ 0) such that

Y_{1,c} (law)= (η + √c)²,     (3)

and having the following additivity property

Y_{d,c} + Y_{d',c'} (law)= Y_{d+d',c+c'},     (4)

where Y_{d,c} and Y_{d',c'} are chosen independently. Conversely, the existence of such a family satisfying (3) and (4) implies the infinite divisibility of (η + √c)².
Hence we see that Theorem 1.1 provides a necessary and sufficient condition for the existence of such a family. In particular, the additivity property of the squared Bessel processes can also be seen as a consequence of the fact that β has a covariance equal to the Green function of a real Brownian motion killed at its first hitting time of 0. In the case when (η + c)² is infinitely divisible, the proof of Theorem 1.1 will moreover show the special part played by Y_{0,c}, the "0-dimensional" process associated to η.
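In the Brownian case, where the Y_{d,c} are squared Bessel processes, a fixed coordinate of Y_{d,c} follows a noncentral chi-square type law with a closed-form Laplace transform. A small sketch checking the additivity (4) on Laplace transforms, together with a Monte-Carlo sanity check of (3) at one coordinate (the names and the one-coordinate reduction, with variance normalized to 1, are ours):

```python
import math, random

def laplace_Y(d, c, lam):
    # Laplace transform E[exp(-lam * Y_{d,c})] of one coordinate of Y_{d,c}
    # in the squared Bessel case, coordinate variance normalized to 1:
    # a noncentral chi-square law with "dimension" d and noncentrality c.
    return (1 + 2 * lam) ** (-d / 2) * math.exp(-lam * c / (1 + 2 * lam))

# Additivity (4): an independent sum Y_{d,c} + Y_{d',c'} has the law of
# Y_{d+d',c+c'} -- visible as a product of Laplace transforms.
for lam in (0.1, 0.5, 2.0):
    assert abs(laplace_Y(1, 1, lam) * laplace_Y(2, 3, lam)
               - laplace_Y(3, 4, lam)) < 1e-12

# Sanity check of (3) at one coordinate: Y_{1,c} has the law of (N + sqrt(c))^2.
rng = random.Random(0)
c, lam, n = 2.0, 0.5, 100000
mc = sum(math.exp(-lam * (rng.gauss(0.0, 1.0) + math.sqrt(c)) ** 2)
         for _ in range(n)) / n
assert abs(mc - laplace_Y(1, c, lam)) < 0.01
```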

The following corollary gives a sufficient condition for the infinite divisibility of the squared non-centered Gaussian processes.
Corollary 2.1 : Let (η_x, x ∈ R) be a Gaussian process with a continuous covariance (G(x, y), (x, y) ∈ R²) and a continuous expectation (IE(η_x), x ∈ R). Assume that there exists a in R such that η_a = 0 and that IE(η_x) ≠ 0 for any x ∈ R \ {a}. If (G(x, y)/(IE(η_x) IE(η_y)), (x, y) ∈ (R \ {a})²) is the Green function of a recurrent Markov process killed at its first hitting time of a, then the process (η_x², x ∈ R) is infinitely divisible.
Is the above sufficient condition also necessary ? The proof of Corollary 2.1 will actually show that solving that question is equivalent to showing that propositions (i) and (ii) of Theorem 1.3 are also equivalent to the following proposition (iii).
This question is still unsolved.
In 1967, Vere-Jones [13] proved that for any centered Gaussian couple (η_x, η_y), the couple (η_x², η_y²) is infinitely divisible. Here is a version of his result for non-centered Gaussian couples. Note that the case of couples deserves a special treatment since it is excluded from general criteria such as Bapat's or Griffiths'.
The following remark is an immediate consequence of Corollary 2.2.
Remark 2.3 : Let (η_x, η_y) be a centered Gaussian couple with a covariance matrix G. Then we have the following properties.
• The couple ((η_x + c_x)², (η_y + c_y)²) is infinitely divisible for any couple of constants (c_x, c_y) if and only if η_x and η_y are independent.
Thanks to Theorem 1.1, we can reformulate (1) as follows, bringing out a special class of Green functions.

Corollary 2.4 : A centered Gaussian process with a continuous positive definite covariance G has an infinitely divisible square if and only if

G(x, y) = d(x) d(y) g(x, y),

where d is a positive function and (g(x, y), (x, y) ∈ R²) is the Green function of a recurrent Markov process killed at the first time its local time at a exceeds an independent exponential time with mean 1.

3 - Proofs
To lighten notation, we assume without loss of generality that a = 0.
Proof of Theorem 1.1 : We first prove the sufficiency of the condition (2). Let (g_{T_0}(x, y), (x, y) ∈ R²) be the Green function of a recurrent Markov process X killed at its first hitting time of 0. Assume that g_{T_0} is symmetric (this is a lighter assumption than the one of Theorem 1.1). This simple assumption is sufficient to claim that g_{T_0} is positive definite (see for example Marcus and Rosen [11] or Eisenbaum [4]). Let (η_x, x ∈ R) be a centered Gaussian process, independent of X, with a covariance equal to g_{T_0}.
Denote by (L^x_t, x ∈ R, t ≥ 0) the local time process of X. For any r > 0, we set τ_r = inf{t ≥ 0 : L^0_t > r}. We now use the following identity, established in [8] :

(L^x_{τ_r} + (1/2) η_x², x ∈ R) (law)= ((1/2)(η_x + √(2r))², x ∈ R).

On one hand, we know, thanks to Theorem 3.2 in [5], that η² is infinitely divisible. On the other hand, since L is an additive functional, we have for any r, t > 0 : L^·_{τ_{r+t}} = L^·_{τ_r} + L^·_{τ_t} ∘ θ_{τ_r}. By the Markov property of X, this implies the infinite divisibility of the process (L^x_{τ_r}, x ∈ R). Consequently, for any constant c, the process ((η_x + c)², x ∈ R) is infinitely divisible. This property remains valid for any centered Gaussian process with a covariance equal to g_{T_0}.
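The additivity step can be made explicit : for any k ≥ 1,

```latex
L^{x}_{\tau_{r}} \;=\; \sum_{i=1}^{k}\Bigl(L^{x}_{\tau_{ir/k}} - L^{x}_{\tau_{(i-1)r/k}}\Bigr), \qquad x \in \mathbb{R},
```

and since the process sits at 0 at each inverse local time τ_{ir/k}, the strong Markov property makes the k increments i.i.d., which is exactly the announced infinite divisibility of (L^x_{τ_r}, x ∈ R).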
To prove the necessity of the condition (2), we note that if (η + c)² is infinitely divisible for any constant c, then the process ((η_x + N)², x ∈ R), where N is a standard Gaussian variable independent of η, is infinitely divisible too. Indeed, we use the following lemma established in [5] (Lemma 4.1).
Lemma A : Let (η_x)_{x∈E} be a centered Gaussian process such that η_0 = 0. We have then : if ((η_x + r)², x ∈ E) is infinitely divisible for any real number r, then for any b > 0 there exists a process (Z_b(x))_{x∈E}, independent of (η_x)_{x∈E}, such that Z_b(0) = b and the corresponding identity in law holds for any finite subset F of E.
Here, for any real c and any finite subset F of R, we write this identity and integrate it with respect to the law of N. Let (α_x)_{x∈F} be a sequence of positive real numbers. The Laplace transform of ((η_x + r)², x ∈ F) has the following expression :
where c(α, x) and f(α, x) are two constants. The random variable N² is infinitely divisible ; hence for any k ∈ N*, there exist k i.i.d. positive variables X_1, X_2, ..., X_k such that N² (law)= X_1 + X_2 + ... + X_k. Since (η_x², x ∈ F) is also infinitely divisible, we obtain that ((η_x + N)², x ∈ F) is infinitely divisible. The centered Gaussian process (η_x + N, x ∈ R) has a covariance equal to (G(x, y) + 1, (x, y) ∈ R²). Note that (G(x, y) + 1, (x, y) ∈ R²) is continuous and positive definite. Hence one can use (1) to claim that

G(x, y) + 1 = d(x) d(y) g(x, y),     (8)

where d is a strictly positive function and g is the Green function of a transient Markov process X.
We note then that d(x) = √(g(0, 0))/g(x, 0). Hence we have

G(x, y) + 1 = g(0, 0) g(x, y)/(g(x, 0) g(y, 0)).

Let U be the Green operator admitting (g(x, y), (x, y) ∈ R²) as densities with respect to a reference measure µ. We set m(dy) = (g²(y, 0)/g(0, 0)) µ(dy). With respect to m, the operator U admits the densities g̃(x, y) = (g(0, 0)/g²(y, 0)) g(x, y). Rewriting (8) in terms of the Green function g̃, we obtain

G(x, y) + 1 = g̃(x, y) g(y, 0)/g(x, 0).

The right hand term of the above equation is the Green function ḡ (with respect to the reference measure m) of the following h-path transform of X, with h(x) = g(x, 0)/g(0, 0) :

ĪP_{x|F_t} = (h(X_t)/h(x)) IP_{x|F_t},

where F_t denotes the field generated by (X_s, 0 ≤ s ≤ t) and IP_0 is the probability under which X starts at 0. Under ĪP_0, the process X starts at 0 and is killed at its last visit to 0. Similarly, the probability ĪP_a is defined from IP_a ; under ĪP_a, X starts at a and is killed at its last visit to 0. Let (L^x_t, x ∈ R, t ≥ 0) be the local time process of X, T_0 be the first hitting time of 0 by X and λ_0 its last visit to 0. By construction : ĪE_x(L^y_{λ_0}) = ḡ(x, y). Note that ĪP_x(T_0 < ∞) = ḡ(x, 0)/ḡ(0, 0) = 1, for any x.
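The value of the transformed Green function can be checked directly : with h(x) = g(x, 0)/g(0, 0), the standard h-transform formula gives the Green density g(x, y)h(y)/h(x) with respect to µ, hence, with respect to m,

```latex
\bar{g}(x,y) \;=\; \frac{g(x,y)\,h(y)}{h(x)} \cdot \frac{g(0,0)}{g^{2}(y,0)}
\;=\; \frac{g(0,0)\,g(x,y)}{g(x,0)\,g(y,0)} \;=\; G(x,y)+1 .
```

This is only a sketch of the verification, but it identifies the right hand term above with the covariance G(x, y) + 1.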
Under ĪP_x (the law of the above h-path transform), we have L^y_{λ_0} = L^y_{T_0} + L^y_{λ_0} ∘ θ_{T_0}. Taking the expectation of both sides and denoting by ḡ the Green function of the transformed process, we get ḡ(x, y) = ĪE_x(L^y_{T_0}) + ḡ(0, y) ; since ḡ(0, y) = 1 for any y, we finally obtain ḡ(x, y) = ĪE_x(L^y_{T_0}) + 1.
Consequently : G(x, y) = ĪE_x(L^y_{T_0}). The covariance G is equal to the Green function of the above h-path transform of X killed at its first hitting time of 0. But this Markov process is obviously not recurrent. We now associate to the transient process X a recurrent Markov process X̂, in the same way as it has been done in [5] (proof of Theorem 5.1). The process X̂ is constructed from the finite excursions of X around 0. As a consequence, X̂ killed at its first hitting time of 0 coincides with X killed at T_0 and conditioned to die at 0, i.e. the h-path transform of X^{T_0} with h(x) = IP_x(T_0 < ∞). This implies that the Green function ĝ_{T_0} of X̂ killed at its first hitting time of 0 is equal to G, which gives (2). □
Proof of Theorem 1.2 : Let (g(x, y), (x, y) ∈ R²) be the Green function of a transient symmetric Markov process. The proof of Theorem 3.4 in [7] shows clearly that there exist a finite positive measure m on R and a transient symmetric Markov process X such that its 0-potential is given by

U_0 f(x) = ∫_R g(x, y) f(y) m(dy).

Denote its resolvent by (U_p)_{p≥0}. The construction of X̂ by Dellacherie and Meyer requires the existence of an excessive measure µ such that µ(I − pU_p) is bounded for every p > 0. We set λ(dx) = (e^{−|x|}/g(x, x)) dx, and define the measure µ by µ = λU_0. The measure µ is excessive (see Dellacherie and Meyer, Chapter XII, 37 d), p. 21). The resolvent equation gives : U_0(I − pU_p) = U_p. Hence µ(I − pU_p) = λU_p ≤ λU_0. As a Green function, g satisfies : g(x, y) ≤ g(x, x). Consequently, λU_0 has total mass at most ∫ e^{−|x|} dx m(R) = 2 m(R) < ∞, so µ(I − pU_p) is bounded. Moreover, we easily check from the definition of X̂ given by Dellacherie and Meyer that X̂ is a Feller process. □
Proof of Corollary 2.1 : We set ψ_x = η_x − IE(η_x). The centered Gaussian process ψ has a covariance equal to G. The process η² is infinitely divisible if and only if ((ψ_x + IE(η_x))², x ∈ R) is infinitely divisible. This is equivalent to the infinite divisibility of ((ψ_x/IE(η_x) + 1)², x ∈ R*). But by assumption, we know that : IE((ψ_x/IE(η_x))(ψ_y/IE(η_y))) = g_{T_a}(x, y).
Hence, thanks to Theorem 1.1, ((ψ_x/IE(η_x) + c)², x ∈ R*) is infinitely divisible for any c. Consequently, η² is infinitely divisible. □
Proof of Corollary 2.2 : We first assume that det G ≠ 0. The proof is based on the following preliminary result : the two propositions below are equivalent.
(i) The couple ((η_x + c)², (η_y + c)²) is infinitely divisible for any constant c.
(ii) Let N be a standard Gaussian variable independent of (η_x, η_y). The vector ((η_x + N)², (η_y + N)², N²) is infinitely divisible.
Indeed, to see that (ii) is a consequence of (i), we note that Lemma A, used in the proof of Theorem 1.1, is still true if the Gaussian process (η_x)_{x∈E} is replaced by a couple (η_x, η_y). Hence for any b there exists a couple (Z_b(x), Z_b(y)), independent of (η_x, η_y), satisfying the corresponding identity in law. Integrating that identity with respect to the law of N, one then finishes the argument similarly to the proof of Theorem 1.1 to conclude that ((η_x + N)², (η_y + N)², N²) is infinitely divisible. Conversely, if ((η_x + N)², (η_y + N)², N²) is infinitely divisible, we set η_z = 0 and note that the covariance matrix of the Gaussian vector (η_x + N, η_y + N, η_z + N) is positive definite. Thanks to Theorem 3.2 of [7], we hence know that there exists a real valued function d on {x, y, z} such that for any a, b in {x, y, z}

IE((η_a + N)(η_b + N)) = d(a) d(b) g(a, b),

where the function g is the Green function of a transient symmetric Markov process. We remark that d must have a constant sign on {x, y, z} ; we can hence assume that d is strictly positive. Then we just have to reproduce the proof of Theorem 1.1 from (8) till the end, to obtain the infinite divisibility of ((η_x + c)², (η_y + c)², (η_z + c)²) for any constant c. □
We then use the criterion of Bapat [1] to see, with elementary arguments, that the Gaussian vector ((η_x + N)², (η_y + N)², N²) is infinitely divisible if and only if IE(η_x η_y) ≥ 0 and IE(η_x η_y) ≤ IE(η_x²) ∧ IE(η_y²). Assume now that det G = 0. Excluding the case η_x η_y = 0, we know that η_y = (IE(η_x η_y)/IE(η_x²)) η_x. Suppose that ((η_x + c)², (η_y + c)²) is infinitely divisible for any constant c. This is equivalent to supposing that ((N + c)², (N + λc)²) is infinitely divisible for any constant c, where N is a standard Gaussian variable and λ = IE(η_x²)/IE(η_x η_y). This assumption implies that ((N + c)², (N + c)² − (N + λc)²) is infinitely divisible for any constant c.
Excluding the case λ = 1, this last assertion is equivalent to the infinite divisibility of ((N + c)², 2N + c(1 + λ)) for any constant c. Letting c tend to 0, this implies that the couple (N², N) is infinitely divisible. To show that this last assertion is false, we give an argument suggested by Emmanuel Roy. Suppose that (N², N) is infinitely divisible. Thanks to the Lévy-Khintchine formula in R², there exist a Gaussian couple (N_1, N_2) and a couple (X_1, X_2) of Poissonian variables (i.e. without Gaussian components) such that (N_1, N_2) and (X_1, X_2) are independent and (N², N) = (N_1, N_2) + (X_1, X_2). The variable N² is strictly positive, hence N_1 = 0. Besides, N is Gaussian, hence X_2 = 0. Consequently, N² and N would have to be independent, which is absurd. Finally, note that the case λ = 1 corresponds to the case η_x = η_y. Obviously, ((η_x + c)², (η_x + c)²) is infinitely divisible. Besides, together with the case η_x η_y = 0, it is the only case when the two conditions det G = 0 and 0 ≤ IE(η_x η_y) ≤ IE(η_x²) ∧ IE(η_y²) are both satisfied. □
Proof of Corollary 2.4 : For any x, we have : η_x = (G(x, 0)/G(0, 0)) η_0 + ψ_x. The Gaussian process ψ is centered, independent of η_0, and ψ_0 = 0. We set N = η_0/√(G(0, 0)). Since the process (((√(G(0, 0))/G(x, 0)) ψ_x + N)², x ∈ R) is infinitely divisible, Theorem 1.3 and Theorem 1.1 lead to

G(x, y) = (G(x, 0)/√(G(0, 0))) (g_{T_0}(x, y) + 1) (G(y, 0)/√(G(0, 0))),

where g_{T_0} is the Green function of a recurrent symmetric Markov process X killed at its first hitting time of 0. We then note that (g_{T_0} + 1) is the Green function of X killed at the first time its local time at 0 is greater than an independent exponential time with mean 1. □
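The elementary Bapat computation invoked in the proof of Corollary 2.2 can be sketched as follows. Writing a = IE(η_x²), b = IE(η_y²), c = IE(η_x η_y), the covariance matrix M of the vector (η_x + N, η_y + N, N) satisfies

```latex
M=\begin{pmatrix} a+1 & c+1 & 1\\ c+1 & b+1 & 1\\ 1 & 1 & 1 \end{pmatrix},\qquad
\det M = ab - c^{2},\qquad
(M^{-1})_{12}=\frac{-c}{ab-c^{2}},\quad
(M^{-1})_{13}=\frac{c-b}{ab-c^{2}},\quad
(M^{-1})_{23}=\frac{c-a}{ab-c^{2}} .
```

When det G = ab − c² > 0, the three off-diagonal entries of M⁻¹ are all nonpositive exactly when 0 ≤ c ≤ a ∧ b ; since the product of these three entries is unchanged by conjugation with a signature matrix, no choice of signs can make them all nonpositive outside that range. This recovers the stated condition IE(η_x η_y) ≥ 0 and IE(η_x η_y) ≤ IE(η_x²) ∧ IE(η_y²).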