Intermittency and infinite variance: the case of integrated supOU processes

SupOU processes are superpositions of Ornstein-Uhlenbeck type processes with a random intensity parameter. They are stationary processes whose marginal distribution and dependence structure can be specified independently of each other. Integrated supOU processes then have stationary increments and satisfy central and non-central limit theorems. Their moments, however, can display an unusual behavior known as "intermittency". We show here that intermittency can also appear when the process has a heavy-tailed marginal distribution and, in particular, infinite variance.


Introduction
Superpositions of Ornstein-Uhlenbeck type (supOU) processes form a rich class of stationary processes with a flexible dependence structure. Their distribution is determined by the characteristic quadruple (a, b, µ, π), where (a, b, µ) is a Lévy-Khintchine triplet (see e.g. Sato (1999)) and π is a probability measure on R_+. In the construction of the supOU process {X(t), t ∈ R}, the choice of (a, b, µ) uniquely determines the one-dimensional marginal distributions, which are independent of the choice of π. The probability measure π, on the other hand, governs the dependence structure. See Barndorff-Nielsen (2001), Barndorff-Nielsen & Stelzer (2011), Barndorff-Nielsen & Stelzer (2013), Barndorff-Nielsen & Veraart (2013), Barndorff-Nielsen et al. (2018), Grahovac, Leonenko, Sikorskii & Taqqu (2019) for details. SupOU processes provide models with an analytically and stochastically tractable dependence structure, displaying either weak or strong dependence, and with infinitely divisible marginal distributions. They have applications in environmental studies, ecology, meteorology, geophysics and biology; see e.g. Barndorff-Nielsen et al. (2015), Podolskij (2015), Barndorff-Nielsen et al. (2018) and the references therein. SupOU processes are particularly relevant in finance and in the statistical theory of turbulence, since they can model key stylized features of observational series in these fields. Recently, supOU processes have been used in Kelly et al. (2013) to estimate the masses of black holes.
By aggregating the supOU process {X(t), t ∈ R} one obtains the integrated supOU process

X*(t) = ∫_0^t X(s) ds. (2)

A suitably normalized integrated process exhibits complex limiting behavior. Indeed, if the underlying supOU process has finite variance, then four classes of processes may arise in a classical limiting scheme: the limit process may be Brownian motion, fractional Brownian motion, a stable Lévy process or a stable process with dependent increments. The type of limit depends on whether a Gaussian component is present in (1), on the behavior of π in (1) near the origin, and on the growth of the Lévy measure µ in (1) near the origin. In the infinite variance case, the limiting behavior is even more complex, as the limit process may additionally depend on the regular variation index of the marginal distribution (see Grahovac et al. (2018) for details). The limiting behavior of the integrated process has practical significance since supOU processes may be used as stochastic volatility models; see Barndorff-Nielsen (1997), Barndorff-Nielsen & Shephard (2001) and the references therein. In this setting the integrated process X* represents the integrated volatility (see e.g. Barndorff-Nielsen & Stelzer (2013)). Moreover, the limiting behavior is important for statistical estimation (see Stelzer et al. (2015), Nguyen & Veraart (2018)).
The integrated supOU process may exhibit another interesting limiting property related to the behavior of its absolute moments in time. Although a suitably normalized integrated process satisfies a limit theorem, it may happen that its moments do not converge beyond some critical order. One way to investigate this behavior is to measure the rate of growth of moments by the scaling function, defined for a process Y = {Y(t), t ≥ 0} as

τ(q) = lim_{t→∞} log E|Y(t)|^q / log t, (3)

assuming the limit in (3) exists and is finite. We will often focus on

τ(q)/q = lim_{t→∞} log (E|Y(t)|^q)^{1/q} / log t,

which has the advantage of involving (E|Y(t)|^q)^{1/q}, a quantity with the same units as Y(t). The values of q are assumed to be in the range of finite moments q ∈ (0, q(Y)), where q(Y) = sup{q > 0 : E|Y(t)|^q < ∞ ∀t}.
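As a numerical illustration of definition (3) (our own sketch, not part of the original paper; the function name and the Brownian example are ours), the scaling function can be estimated by regressing log-moments on log t. For Brownian motion E|B(t)|^q = E|B(1)|^q t^{q/2}, so the estimated slope should be close to q/2:

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_scaling_function(paths, times, q):
    """Estimate tau(q) as the slope of log E|Y(t)|^q against log t."""
    moments = np.mean(np.abs(paths) ** q, axis=0)   # Monte Carlo estimate of E|Y(t)|^q
    slope, _ = np.polyfit(np.log(times), np.log(moments), 1)
    return slope

# Brownian motion sanity check: B(t) ~ N(0, t), and only one-dimensional
# marginals enter the moments, so each time point may be sampled independently.
times = np.linspace(1.0, 100.0, 50)
paths = rng.standard_normal((20000, times.size)) * np.sqrt(times)

for q in (1.0, 2.0, 3.0):
    print(q, empirical_scaling_function(paths, times, q))  # slope ≈ q/2
```

The same regression applied to a simulated integrated supOU path would reveal a broken-line scaling function in the intermittent cases discussed below.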
Different scaling procedures play a key role in physics de Gennes (1979), risky asset modeling Heyde (2009), statistics Grahovac et al. (2015) and ambit stochastics Barndorff-Nielsen et al. (2018). To see how this is related to limit theorems, suppose that Y satisfies a limit theorem of the form

{Y(Tt)/A_T} → {Z(t)}, T → ∞, (4)

with A_T a sequence of normalizing constants and convergence in the sense of convergence of all finite-dimensional distributions as T → ∞. By Lamperti's theorem (see, for example, (Pipiras & Taqqu 2017, Theorem 2.8.5)), the limit Z is H-self-similar for some H > 0; that is, for any constant c > 0, the finite-dimensional distributions of Z(ct) are the same as those of c^H Z(t). Moreover, the normalizing sequence is of the form A_T = ℓ(T) T^H for some ℓ slowly varying at infinity. For a self-similar process, the moments evolve as a power function of time, since E|Z(t)|^q = E|Z(1)|^q t^{Hq}, and therefore the scaling function of Z is τ_Z(q) = Hq. If convergence of moments held in (4), then the scaling function of Y would also be τ_Y(q) = Hq (see (Grahovac, Leonenko, Sikorskii & Taqqu 2019, Theorem 1)). In particular, q → τ_Y(q)/q would be constant over the q values for which (4) holds.
It has been shown in Grahovac, Leonenko, Sikorskii & Taqqu (2019) that the integrated supOU process X* may have a scaling function which does not correspond to any self-similar process, namely τ_{X*}(q) = q − α for a certain range of q. This happens, in particular, for a non-Gaussian integrated supOU process whose marginal distribution has exponentially decaying tails and whose probability measure π in (1) is regularly varying at zero. This implies that the function q → τ_{X*}(q)/q is not constant; it is strictly increasing, a property referred to as intermittency. Hence, intermittency implies that the convergence of moments in (4) must fail beyond some critical value of q. See Grahovac et al. (2016) and Grahovac, Leonenko, Sikorskii & Taqqu (2019), which provide a complete picture of the behavior of moments in the case where X*(t) has finite variance. Intermittency refers in general to an unusual behavior of moments. It is of major importance in many fields of science, such as the study of rain and clouds, magnetohydrodynamics, liquid mixtures of chemicals and the physics of fusion plasmas; see e.g. Zel'dovich et al. (1987). Another area of possible application is turbulence. In turbulence, velocities or velocity derivatives (or differences) under a large Reynolds number can be modeled with infinitely divisible distributions; they allow long-range dependence, and there appears to be a kind of regime switching between periods of relatively small random fluctuations and periods of "higher" activity. This phenomenon is also referred to as intermittency; see e.g. (Frisch 1995, Chapter 8) or Zel'dovich et al. (1987).
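To see the monotonicity concretely, take the range of q where the scaling function equals q − α, as stated above; a one-line computation (ours, not from the paper) gives:

```latex
\frac{\tau_{X^*}(q)}{q} \;=\; \frac{q-\alpha}{q} \;=\; 1-\frac{\alpha}{q},
\qquad
\frac{d}{dq}\Bigl(1-\frac{\alpha}{q}\Bigr) \;=\; \frac{\alpha}{q^{2}} \;>\; 0,
```

so τ_{X*}(q)/q is strictly increasing there, whereas for any H-self-similar limit it would be constant and equal to H.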
In this paper we focus on the limiting behavior of moments and on intermittency in the case where X * (t) has infinite variance and show that we can have intermittency even in this case.
To establish the rate of growth of moments we make use of the limit theorems established in Grahovac et al. (2018). The type of the limiting process depends heavily on the structure of the underlying supOU process. Hence, the form of the scaling function of the integrated process will depend on several parameters related to the quadruple (1). Special care is needed since the range of finite moments is limited. We show that the scaling function may look like a broken line, indicating that there is a change-point in the rate of growth of moments. Hence, infinite variance integrated supOU processes may also exhibit the phenomenon of intermittency. Our results also indicate that in some cases, if we decompose the process into several components, the intermittency of the finite variance component may be hidden by the infinite moments of the infinite variance component. We conclude that moments may have limited capability in identifying unusual limiting behavior.
The paper is organized as follows. In Section 2 we introduce notation and assumptions. Section 3 introduces the decomposition of the process which serves as a basis for the further analysis. The scaling functions of the components in this decomposition are obtained in Section 4. These results are then combined in Section 5 giving the proofs of the main results. Sections 6 and 7 contain the proofs of two lemmas used to derive the main results.

2 Preliminaries
We shall use the notation κ_Y(ζ) = C{ζ ‡ Y} = log Ee^{iζY} to denote the cumulant (generating) function of a random variable Y. For a stochastic process Y = {Y(t)} we write κ_Y(ζ, t) for the cumulant function of Y(t), and by suppressing t we mean κ_Y(ζ) = κ_Y(ζ, 1), that is, the cumulant function of the random variable Y(1). The class of supOU processes was introduced by Barndorff-Nielsen in Barndorff-Nielsen (2001) as follows. Let Λ denote a homogeneous infinitely divisible random measure (Lévy basis) on R_+ × R and suppose that the cumulant function of the random variable Λ(A), where A ∈ B(R_+ × R), equals

κ_{Λ(A)}(ζ) = m(A) κ_L(ζ). (5)

The measure m is called the control measure and it is the product m = π × Leb of a probability measure π on R_+ and the Lebesgue measure on R. Finally, κ_L in (5) is the cumulant function κ_L(ζ) = log Ee^{iζL(1)} of some infinitely divisible random variable L(1) with Lévy-Khintchine triplet (a, b, µ), i.e.

κ_L(ζ) = iζa − (ζ²/2) b + ∫_R (e^{iζx} − 1 − iζx 1_{[−1,1]}(x)) µ(dx). (6)
The Lévy process L = {L(t), t ≥ 0} associated with the triplet (a, b, µ) is called the background driving Lévy process. It has independent stationary increments and thus, its finite-dimensional distributions depend only on the distribution of L(1).
The supOU process is a strictly stationary process X = {X(t), t ∈ R} given by the stochastic integral (Barndorff-Nielsen (2001))

X(t) = ∫_{R_+} ∫_R e^{−ξt+s} 1_{[0,∞)}(ξt − s) Λ(dξ, ds). (7)

By appropriately choosing the background driving Lévy process L, one can obtain any selfdecomposable distribution as the marginal distribution of X. Recall that an infinitely divisible random variable X is selfdecomposable if its characteristic function φ(ζ) = Ee^{iζX}, ζ ∈ R, has the property that for every c ∈ (0, 1) there exists a characteristic function φ_c such that φ(ζ) = φ(cζ)φ_c(ζ) for all ζ ∈ R (see e.g. Sato (1999)). Equivalently, for every c ∈ (0, 1) there is a random variable Y_c such that X has the same distribution as cX + Y_c. Note that the one-dimensional marginals of the supOU process are independent of the choice of π. The probability measure π "randomizes" the rate parameter ξ in (7), and the Lebesgue measure ds is associated with the moving average variable s. The quadruple (a, b, µ, π) given in (1) is referred to as the characteristic quadruple.

Basic assumptions
We now state a set of assumptions for the class of supOU processes we consider. The marginal distribution is assumed to be in the domain of attraction of some infinite variance stable law. The next assumption concerns the dependence structure, which is controlled by the probability distribution π. Finally, the Lévy measure µ is assumed to have a power law behavior near the origin, which gives rise to another parameter affecting the limiting behavior.

Dependence structure
The second set of assumptions deals with the dependence structure, dictated by the behavior near the origin of the probability measure π in the characteristic quadruple (1). If the variance is finite, EX(t)² < ∞, then the correlation function of the supOU process X is the Laplace transform of π:

r(t) = ∫_{R_+} e^{−tξ} π(dξ).

Hence, by a Tauberian argument, the decay of the correlation function at infinity is related to the decay of the distribution function of π at zero (see (Fasen & Kluppelberg 2007, Proposition 2.6)). We will assume that the probability measure π is regularly varying at zero, that is, for some α > 0 and some slowly varying function ℓ,

π([0, x]) ∼ x^α ℓ(x^{−1}), x → 0. (14)

Note that α can take any positive value.
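A small simulation (our own sketch; the uniform mixing law is a purely illustrative stand-in for π, and all names are ours) confirms that the correlation of a finite superposition of Gaussian OU processes at lag t matches the Laplace transform of the empirical mixing distribution of the rates:

```python
import numpy as np

rng = np.random.default_rng(1)

m, n_paths, lag = 50, 40000, 1.5
xi = rng.uniform(0.2, 2.0, size=m)   # OU rates: a stand-in for draws from pi

# Stationary unit-variance Gaussian OU with rate xi: the exact transition over
# time `lag` is X(lag) = e^{-xi*lag} X(0) + sqrt(1 - e^{-2*xi*lag}) * N(0, 1).
x0 = rng.standard_normal((n_paths, m))                 # stationary start, N(0, 1)
phi = np.exp(-xi * lag)
x1 = phi * x0 + np.sqrt(1.0 - phi**2) * rng.standard_normal((n_paths, m))

# The normalized superposition has unit variance; its lag-t correlation should
# equal the Laplace transform of the (empirical) mixing distribution.
X0 = x0.sum(axis=1) / np.sqrt(m)
X1 = x1.sum(axis=1) / np.sqrt(m)
emp_corr = np.mean(X0 * X1)
laplace = np.mean(np.exp(-xi * lag))
print(emp_corr, laplace)   # the two values should nearly agree
```

With rates drawn instead from a distribution regularly varying at zero, the same construction produces the slow correlation decay discussed below.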
To simplify the proofs of some of the results below, we will assume that π has a density p which is monotone on (0, x′) for some x′ > 0, so that (14) implies

p(x) ∼ α x^{α−1} ℓ(x^{−1}), x → 0.

Note that if the variance of the supOU process is finite and α ∈ (0, 1), then the correlation function is not integrable, and the finite variance supOU process may be said to exhibit long-range dependence. On the other hand, the tail of π at infinity does not affect the tail behavior of r(t), and in particular the decay of correlations. Hence, in order to simplify the presentation of the results, it is not restrictive to impose an additional condition on the tail of π at infinity.
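The Tauberian link between the behavior of π at zero and the decay of correlations can be sketched as follows (a standard application of Karamata's Tauberian theorem, assuming the form of (14) above):

```latex
r(t) \;=\; \int_0^{\infty} e^{-t\xi}\,\pi(d\xi)
\;\sim\; \Gamma(1+\alpha)\, t^{-\alpha}\, \ell(t), \qquad t \to \infty,
```

so for α ∈ (0, 1) the correlations decay like t^{−α} and are not integrable, while for α > 1 they are integrable.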

Behavior of the Lévy measure at the origin
Consider the Lévy measure µ in (6). Somewhat surprisingly, the limiting behavior of the integrated supOU process X*(t) is affected by the growth of the Lévy measure µ near the origin. We will quantify this growth by assuming a power law behavior of the Lévy measure near the origin. Let

µ^+(x) = µ((x, ∞)), µ^−(x) = µ((−∞, −x)), x > 0,

denote the tails of µ. We will assume that there exist β ≥ 0 and c^+, c^− ≥ 0 with c^+ + c^− > 0 such that

µ^+(x) ∼ c^+ x^{−β} and µ^−(x) ∼ c^− x^{−β}, x → 0. (17)

Since µ is a Lévy measure, we must have β < 2. If (17) holds, then β is the Blumenthal-Getoor index of the Lévy measure µ defined by (see Grahovac, Leonenko & Taqqu (2019) and also (Kyprianou 2014, Lemma 7.15))

β_{BG} = inf{γ ≥ 0 : ∫_{|x| ≤ 1} |x|^γ µ(dx) < ∞}.

The condition (17) may be equivalently stated in terms of the Lévy measure of X(1). Indeed, if ν denotes the Lévy measure of X(1), then (17) is equivalent to an analogous power law behavior of the tails of ν near the origin.
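As a concrete example (ours, not from the original text), a symmetric β-stable Lévy measure satisfies (17) exactly:

```latex
\mu(dx) = \frac{c}{|x|^{1+\beta}}\,dx, \qquad
\mu^{+}(x) = \int_x^{\infty} \frac{c}{y^{1+\beta}}\,dy
           = \frac{c}{\beta}\,x^{-\beta}, \quad x>0,
```

so (17) holds with c^+ = c^− = c/β, and the Blumenthal-Getoor index equals β.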

The basic decomposition
As stated in the introduction, we are interested in establishing the rate of growth of moments of the integrated process (2), measured by the scaling function τ X * defined by (3). The situation is more delicate than in the finite variance case since the range of finite moments is limited and the scaling function of the integrated process X * is well-defined only over the interval (0, q(X * )) = (0, γ).
To investigate the behavior of moments, we decompose the integrated process X* into components that have different limiting behavior. The decomposition is based on the Lévy-Itô decomposition of the background driving Lévy process L. Let

µ_1(dx) = 1_{|x|>1} µ(dx), µ_2(dx) = 1_{|x|≤1} µ(dx),

where µ is the Lévy measure of the Lévy process L. Then we can decompose the Lévy basis into independent components
• Λ_1 with characteristic quadruple (a, 0, µ_1, π),
• Λ_2 with characteristic quadruple (0, 0, µ_2, π),
• Λ_3 with characteristic quadruple (0, b, 0, π).
Let L_1(t), L_2(t) and L_3(t), t ∈ R, denote the corresponding background driving Lévy processes. Note that L_1 is a compound Poisson process (with drift) and L_3 is a Brownian motion. Consequently, we can represent X(t) as

X(t) = X_1(t) + X_2(t) + X_3(t),

with X_1, X_2 and X_3 independent. Let X*_1, X*_2 and X*_3 denote the corresponding integrated processes, which are independent. We next investigate the scaling functions of each of the processes X*_1, X*_2 and X*_3 separately. These results will then be combined to give the scaling function of the integrated process.
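The cumulant functions of the three background driving Lévy processes can be written out under the convention that the drift a is attached to the large-jump part (this placement is our assumption; only the sum is determined by (6)):

```latex
\kappa_{L_1}(\zeta) = i\zeta a + \int_{|x|>1} \bigl(e^{i\zeta x}-1\bigr)\,\mu(dx),
\qquad
\kappa_{L_2}(\zeta) = \int_{|x|\le 1} \bigl(e^{i\zeta x}-1-i\zeta x\bigr)\,\mu(dx),
\qquad
\kappa_{L_3}(\zeta) = -\tfrac{1}{2}\,\zeta^{2} b,
```

and κ_{L_1}(ζ) + κ_{L_2}(ζ) + κ_{L_3}(ζ) = κ_L(ζ) recovers (6).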

The scaling function of X*_1
The process X*_1 has infinite moments of order greater than γ, and its scaling function τ_{X*_1} is well-defined for q ∈ (0, γ). Following (Grahovac et al. 2018, Lemmas 5.1 and 5.2), two processes may arise as the limit of X*_1 after normalization.
If γ < 1 + α, then a suitably normalized X*_1 converges to a γ-stable Lévy process, where k is the slowly varying function in (9), k^# is the de Bruijn conjugate of 1/k(x^{1/γ}), and σ and ρ are given by (10). Recall that the de Bruijn conjugate (Bingham et al. 1989, Subsection 1.5.7) of a slowly varying function h is a slowly varying function h^# such that h(x) h^#(x h(x)) → 1 as x → ∞. By (Bingham et al. 1989, Theorem 1.5.13) such a function always exists and is unique up to asymptotic equivalence. If, on the other hand, γ > 1 + α, then as T → ∞ a suitably normalized X*_1 converges to an (1 + α)-stable Lévy process {L_{1+α}}, where ℓ^# is the de Bruijn conjugate of 1/ℓ(x^{1/(1+α)}) and the skewness of the limit is determined by constants c^−_1, c^+_1. We now consider convergence of moments in these limit theorems. First, if γ < 1 + α, then we get the following scaling function for the process X*_1.
Lemma 4.1. If Assumption 1 holds and γ < 1 + α, then

τ_{X*_1}(q) = q/γ for q ∈ (0, γ).

For moments of order q in the range (1 + α, γ) we are not able to obtain the exact form of the scaling function τ_{X*_1}(q). However, we provide a bound which will be enough for the proof of the main results later on. We conjecture that equality holds in (26).
Lemma 4.2. If Assumption 1 holds and γ > 1 + α, then

τ_{X*_1}(q) = q/(1 + α) for q ∈ (0, 1 + α], and τ_{X*_1}(q) ≥ q − α for q ∈ (1 + α, γ). (26)

The proofs of Lemma 4.1 and Lemma 4.2 are particularly delicate because of the presence of infinite second moments. They are given in Sections 6 and 7, respectively. Figure 1 shows the two forms of the scaling function of X*_1.
The scaling function of X*_2

Lemma 4.3. Suppose that Assumption 1 holds. Then the scaling function τ_{X*_2}(q) of the process X*_2 is as follows: (a) If α > 1, then

τ_{X*_2}(q) = q/2 for q ∈ (0, q_*], τ_{X*_2}(q) = q − α for q ≥ q^*, and τ_{X*_2} is linear on [q_*, q^*],

where q_* is the largest even integer less than or equal to 2α and q^* is the smallest even integer greater than 2α.
(c) If α ∈ (0, 1) and 1 + α < β < 2, then

τ_{X*_2}(q) = (1 − α/β) q for q ∈ (0, β], and τ_{X*_2}(q) = q − α for q > β.

Lemma 4.3(a) and convexity of the scaling function yield bounds on τ_{X*_2}(q) between the change-points. Note also that Lemma 4.3(a) implies that τ_{X*_2}(q) = q/2 for q ≤ 2, which will be enough for the proofs of Theorems 5.1 and 5.2 below.
In contrast with the component X*_1, the scaling function of X*_2 displays intermittency in every case covered by Assumption 1. Even in the short-range dependent scenario α > 1, intermittency appears in higher order moments. The scaling functions of X*_2 are shown in Figure 2.

The scaling function of X*_3
The process X*_3 defined in (21) is a Gaussian process. Its scaling function is given in (Grahovac, Leonenko & Taqqu 2019, Theorems 4.1 and 4.4). Gaussian supOU processes do not display intermittency and their scaling function is linear over the positive reals (Figure 3). This result is restated here as Lemma 4.4.
Lemma 4.4. Suppose that Assumption 1 holds. Then the scaling function τ_{X*_3}(q) of the process X*_3 is as follows: (a) if α > 1, then τ_{X*_3}(q) = q/2 for all q > 0; (b) if α ∈ (0, 1), then τ_{X*_3}(q) = (1 − α/2) q for all q > 0.

5 The scaling function of the integrated process X*

To derive the scaling function of the integrated process X* = X*_1 + X*_2 + X*_3 we will use the expressions for the scaling functions of the components in the decomposition (21) and the following proposition, which shows how to compute the scaling function of a sum of independent processes.
Proposition 5.1. Let Y_1 = {Y_1(t), t ≥ 0} and Y_2 = {Y_2(t), t ≥ 0} be two independent processes with scaling functions τ_{Y_1} and τ_{Y_2}, respectively, and suppose that EY_1(t) = EY_2(t) = 0 for every t ≥ 0 if the mean is finite. If q ∈ (0, q(Y_1)) ∪ (0, q(Y_2)) and τ_{Y_1}(q) and τ_{Y_2}(q) are well-defined and positive, then the scaling function of the sum is

τ_{Y_1+Y_2}(q) = max{τ_{Y_1}(q), τ_{Y_2}(q)}.

Proof. Suppose that max{τ_{Y_1}(q), τ_{Y_2}(q)} = τ_{Y_1}(q). For ε > 0 we can take t large enough so that

E|Y_i(t)|^q ≤ t^{τ_{Y_1}(q)+ε}, i = 1, 2.

From the inequality

E|Y_1(t) + Y_2(t)|^q ≤ 2^q (E|Y_1(t)|^q + E|Y_2(t)|^q)

we have that τ_{Y_1+Y_2}(q) ≤ τ_{Y_1}(q) + ε. Since ε was arbitrary, we conclude that τ_{Y_1+Y_2}(q) ≤ max{τ_{Y_1}(q), τ_{Y_2}(q)}.

We prove the reverse inequality for the case q ≥ 1 first. Note that in this case EY_1(t) = EY_2(t) = 0 for every t ≥ 0. For x ∈ R we have, by using Jensen's inequality, that

E|x + Y_2(t)|^q ≥ |x + EY_2(t)|^q = |x|^q.

Letting F_{Y_1(t)} and F_{Y_2(t)} denote the distribution functions of Y_1(t) and Y_2(t), respectively, we get by independence that

E|Y_1(t) + Y_2(t)|^q = ∫ E|x + Y_2(t)|^q dF_{Y_1(t)}(x) ≥ ∫ |x|^q dF_{Y_1(t)}(x) = E|Y_1(t)|^q, (28)

hence τ_{Y_1+Y_2}(q) ≥ τ_{Y_1}(q), proving the claim for q ≥ 1. Suppose now that q < 1 and let Y′_2 = {Y′_2(t), t ≥ 0} be an independent copy of the process Y_2 = {Y_2(t), t ≥ 0}, independent of Y_1. From (28) we obtain a lower bound for E|Y_1(t) + Y_2(t)|^q involving E|Y_1(t)|^q and E|Y_2(t)|^q, which yields the inequality (30). Without loss of generality we may assume τ_{Y_1}(q) ≥ τ_{Y_2}(q). Assume first that this inequality is strict, namely that τ_{Y_1}(q) > τ_{Y_2}(q). For ε > 0 small enough we can take t large enough so that

E|Y_2(t)|^q ≤ t^{τ_{Y_2}(q)+ε} and E|Y_1(t)|^q ≥ t^{τ_{Y_1}(q)−ε}, with τ_{Y_2}(q) + ε < τ_{Y_1}(q) − ε.

We conclude that E|Y_2(t)|^q / E|Y_1(t)|^q → 0 as t → ∞.
By taking logarithms in (30), dividing by log t and letting t → ∞, we get the claim. If τ_{Y_1}(q) = τ_{Y_2}(q), there are three possible cases for the limit of E|Y_2(t)|^q / E|Y_1(t)|^q as t → ∞: the limit is 0, ∞, or some constant C > 0.
• If the limit is 0, then we can apply the same argument as in the case τ Y 1 (q) > τ Y 2 (q).
• If the limit is ∞, then by interchanging the roles of Y_1 and Y_2 in the previous part of the proof we obtain the analog of (30) with Y_1 and Y_2 interchanged.
• If the limit is a constant C < 2^{1−q}, then 2^{1−q} − E|Y_2(t)|^q / E|Y_1(t)|^q is eventually positive and the logarithm can be applied in (30) to obtain the claim.
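Proposition 5.1 can be sanity-checked numerically in a Gaussian toy case (our own sketch; exact second moments are used, so no simulation error enters). For independent zero-mean Gaussian processes with E Y_i(t)² = t^{2H_i}, the q = 2 scaling of the sum is governed by max(2H_1, 2H_2):

```python
import numpy as np

# Toy check of tau_{Y1+Y2}(q) = max(tau_{Y1}(q), tau_{Y2}(q)) at q = 2:
# Y1, Y2 independent, zero mean, with E Y_i(t)^2 = t^{2 H_i}, so that
# E (Y1(t) + Y2(t))^2 = t^{2 H1} + t^{2 H2} exactly, by independence.
H1, H2 = 0.5, 0.8
t = np.logspace(3, 6, 40)
second_moment = t ** (2 * H1) + t ** (2 * H2)

# The log-log slope over large t estimates tau(2) for the sum.
slope = np.polyfit(np.log(t), np.log(second_moment), 1)[0]
print(slope)  # ≈ 2 * max(H1, H2) = 1.6
```

The dominant exponent wins, exactly as the maximum formula asserts; the subdominant term t^{2H_1} only perturbs the slope at small t.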
We are now ready to state the main results. We will show that infinite variance supOU processes may exhibit the phenomenon of intermittency. We first consider the case where the underlying supOU process has no Gaussian component (b = 0). The resulting scaling functions are shown in Figures 4a-4d.

Theorem 5.1. Suppose that Assumption 1 holds. Then the scaling function τ_{X*}(q) of the process X* is as follows: (a) If α > 1, or if α ∈ (0, 1) and γ < 1 + α, then

τ_{X*}(q) = q/γ for q ∈ (0, γ).

Proof. We shall combine the results of Lemmas 4.1, 4.2 and 4.3 by using Proposition 5.1.
(a) Suppose that γ < 1 + α and split into cases depending on the scaling function of X*_2.
Note that the scaling function has a change-point in only two of the cases of Theorem 5.1. Hence intermittency appears only in cases (b) and (c) of Theorem 5.1 shown in Figures 5d and 5e, respectively.
One may notice that infinite moments hide the intermittency property, since they limit the domain of the scaling function. This can be seen in Figure 5. The finite variance component X*_2 exhibits intermittency in all cases; however, this is not always apparent from the scaling function of the process X*, because infinite moments may hide the behavior of the intermittent component. In these cases, the change-point in the scaling function of X*_2 lies to the right of the moment index γ, hence the scaling function of X* remains linear on (0, γ) (see Figures 5a, 5b, 5c and 5f).
We next state the result for the supOU process with a Gaussian component (b ≠ 0). The scaling functions for this case are shown in Figures 4e-4f.
• If γ > 1 + α, 1 + α < β and β ≤ γ, then the claim follows since 1 − α/β < 1 − α/2, by the same argument as in the previous case.
• The same argument applies to the case γ > 1 + α, 1 + α < β and β > γ.

Figures 6 and 7 illustrate the proof of Theorem 5.2. The scaling functions τ_{X*_1}, τ_{X*_2} and τ_{X*_3} of the components are shown on each plot and their maximum is indicated by the thick line. Figure 6 corresponds to case (a) of Theorem 5.2 and Figure 7 to case (b). The figures are split according to the different forms of the scaling functions of the three components X*_1, X*_2 and X*_3. Note that if the Gaussian component is present, then the scaling function displays no intermittency. For example, even if the scaling functions of the two components X*_1 and X*_2 have a change-point, this cannot be seen from the scaling function of X* due to infinite moments (see Figures 7c, 7d, 7e).
We now consider the symmetrized random variable X*_1(Tt). From (7) we obtain a decomposition of X*_1(Tt) into two terms, ∆X*_{1,1}(Tt) and ∆X*_{1,2}(Tt); the equality of the integrals on the right-hand side follows from the form of the kernel in (7). Since ∆X*_{1,1}(Tt) and ∆X*_{1,2}(Tt) are independent, the characteristic function of X*_1(Tt) factors into the product of their characteristic functions. We now consider bounds for each term separately.
• For I_1 we make the change of variables y = x(1 − g_T(ζ, x, s)) and obtain the required bound, where we use the fact that the integral in the last line is finite due to γ > 1 + α and the choice of η and δ.