Frequency domain bootstrap methods for random fields

Abstract: This paper develops a frequency domain bootstrap method for random fields on Z^2. Three frequency domain bootstrap schemes are proposed to bootstrap the Fourier coefficients of the observations. Inverse transformations are then applied to obtain resamples in the spatial domain. As a main result, we establish an invariance principle for the bootstrap samples, from which it follows that the bootstrap samples preserve the correct second-order moment structure for a large class of random fields. The frequency domain bootstrap method is simple to apply and is demonstrated to be effective in various applications, including constructing confidence intervals for correlograms of linear random fields, testing for signal presence using scan statistics, and testing for spatial isotropy in Gaussian random fields. Simulation studies illustrate the finite sample performance of the proposed method and compare it with existing spatial block bootstrap and subsampling methods.


Introduction
Following Efron's influential paper ([8]), bootstrap resampling procedures have developed rapidly. Bootstrap resampling constitutes a powerful tool for approximating characteristics of a statistic that cannot easily be calculated by analytical means. In addition, bootstrap methods require no explicit knowledge of the underlying dependence mechanism or the marginal distribution of the observations. These user-friendly features make bootstrap resampling popular for statistical inference.
In recent decades, various resampling methods for dependent data have been proposed. For time series data, the block bootstrap and the frequency domain bootstrap are two important classes of bootstrap procedures. Block bootstrap methods include the moving block bootstrap ([25] and [29]), the non-overlapping block bootstrap ([3]), the circular block bootstrap ([41]), and the stationary block bootstrap ([43]). Despite its simplicity, the accuracy of a block bootstrap estimator depends critically on the block size employed. Frequency domain bootstrap methods, on the other hand, use the periodogram of the data to derive bootstrap approximations for a class of estimators called ratio statistics; see [5], [10], and [22] for details. [21] proposed the time frequency toggle (TFT) bootstrap for time series, which directly resamples the discrete Fourier transform instead of resampling the periodogram. Unlike periodograms, the bootstrapped discrete Fourier transforms can be transformed back to generate bootstrap resamples of the time series. Thus, the TFT bootstrap not only comprises the classical frequency domain bootstrap methods, but is also applicable to statistics based on the time domain representation of the observations, including the CUSUM statistic for change-point detection and the least-squares statistic for unit-root testing. By combining a time domain parametric bootstrap with a frequency domain nonparametric bootstrap, [18] extended the autoregressive aided periodogram bootstrap of [22] and proposed a multiple hybrid bootstrap for linear processes that can generate bootstrap resamples in the time domain. For reviews of resampling methods for time series, see [2], [36], [26], and [38].
Apart from time series, subsampling and resampling methods for spatial data have also become increasingly popular in the past decades; see [6] for a brief overview. [16] used a block resampling procedure to bootstrap spatial data. [44] developed a subsampling method for random fields. [42] considered a block bootstrap method for homogeneous strong mixing random fields. [46] used a resampling method to estimate variances of statistics computed from spatial data. [40] proposed subsampling methods for statistical inference from irregularly spaced dependent observations. [28] used spatial subsampling for least squares variogram estimation. [34] and [35] derived optimal block sizes for spatial subsampling and bootstrap methods; however, these results are only applicable to variance estimation. [30] proposed a bootstrap method for Gaussian random fields under fixed domain asymptotics. See [26] for a comprehensive review. Recently, [32] proposed an AR sieve bootstrap for linear random fields. To the best of our knowledge, the development of spatial bootstrap methods has focused mainly on block bootstrap type methods, and a frequency domain bootstrap method for possibly nonlinear random fields remains absent from the literature.
In this paper, we develop a frequency domain bootstrap method for random fields on Z^2. The basic principle of the proposed method is to bootstrap the Fourier coefficients of the observations, and then inverse-transform the resampled Fourier coefficients to obtain bootstrap samples in the spatial domain. By resampling the discrete Fourier transforms instead of the periodograms, we can handle situations where the statistics of interest are not expressible in terms of periodograms, such as scan statistics for testing for the presence of a spatial signal; see Section 6. The proposed frequency domain bootstrap method is similar in spirit to the TFT bootstrap of [21] for time series. However, resampling the Fourier coefficients of spatial data is not as straightforward as in time series, due to an additional rotational symmetry of the coefficients. In addition, applications of the spatial frequency domain bootstrap method, such as testing for signal presence and testing for spatial isotropy, are very different from applications of the time series counterpart, such as change-point detection and testing for unit roots. Moreover, to develop the bootstrap theory in the spatial context, we establish an invariance principle for the bootstrap partial sum process indexed by a classical example of Vapnik-Chervonenkis classes (VC-classes) of subsets of [0, 1]^2. The results can be generalized to other VC-classes. The proofs of the asymptotic results require different ideas and techniques from those for the time series counterpart in [21].
We propose three resampling schemes for bootstrapping the Fourier coefficients of spatial processes on Z^2. We show that the resulting bootstrap samples correctly capture the second-order moment structure for a large class of random fields. The results are illustrated by applications to constructing confidence intervals for correlograms of linear random fields, testing for the presence of a signal, and testing for spatial isotropy in Gaussian random fields. Simulation studies explore the finite sample performance of the proposed method and compare it with existing spatial block bootstrap and subsampling methods.
This paper is organized as follows. Section 2 provides the problem setting and reviews the spectral theory for spatial processes on Z^2. In Section 3, three resampling schemes for the Fourier coefficients are proposed to develop bootstrap procedures for spatial processes on Z^2. The main results are presented in Section 4, in which we establish the validity of the bootstrap procedures by proving invariance principles for the bootstrap samples under some meta-assumptions on the bootstrapped Fourier coefficients. Section 5 verifies these meta-assumptions for the three resampling schemes. In Section 6, we introduce some practical applications of the proposed bootstrap method, and present simulation studies comparing the proposed method with existing spatial block bootstrap and subsampling methods. Technical proofs of the theorems and lemmas are provided in Appendices A and B.

Problem setting and spectral theory for spatial processes in Z^2
In this section, we describe the problem setting and preliminaries on the spectral theory for spatial processes. First, we introduce some notation. For any vector a = (a_1, a_2, ..., a_q) ∈ R^q, denote its Euclidean norm by |a|. For any set G, denote the cardinality of G by |G|. For a random variable X ∈ L^p, denote the L^p norm by ||X||_p = (E(|X|^p))^{1/p}. For any two sequences of real numbers {a_n} and {b_n}, write a_n ≍ b_n when a_n = O(b_n) and b_n = O(a_n). For any x ∈ R, ⌊x⌋ is the greatest integer less than or equal to x. All vectors are column vectors unless specified otherwise; hence, for any a = (a_1, a_2, ..., a_q) ∈ R^q and b = (b_1, b_2, ..., b_q) ∈ R^q, the dot product a · b is defined as the vector multiplication a^T b.

Settings and assumptions
Let {V(t) : t ∈ Z^2} be a stationary random field on a two-dimensional grid with mean μ = E(V(0)). Assume that we have observed {V(t) : t ∈ T} on a rectangular spatial region T = {1, ..., d_1} × {1, ..., d_2}, so that |T| = d_1 d_2. We impose the following assumptions on the increasing domain asymptotic framework and the underlying random field for establishing the asymptotic results.
Assumption A.2. The random field {V(t) : t ∈ Z^2} is stationary with absolutely summable auto-covariance function γ(·), i.e., Σ_{j∈Z^2} |γ(j)| < ∞, where γ(j) = Cov(V(0), V(j)). In this case the spectral density of the random field exists and can be expressed as f(λ) = (2π)^{-2} Σ_{j∈Z^2} γ(j) e^{-i j·λ}, λ ∈ [-π, π]^2. Assume further that the random field admits the representation V(j) = G(ε_{j-s} : s ∈ Z^2), where G(·) is a measurable function and {ε_i}_{i∈Z^2} is an i.i.d. random field. Let {ε'_i}_{i∈Z^2} be an i.i.d. copy of {ε_i}_{i∈Z^2}. Define the coupled version V'(j) of V(j) by replacing ε_0 with ε'_0 in the above representation, and set δ_{j,p} := ||V(j) − V'(j)||_p. Assumption A.4(p). There exists some p > 0 such that V(j) belongs to L^p and Δ_p := Σ_{j∈Z^2} δ_{j,p} < ∞. This is the p-stable condition for random fields defined in [9], in which central limit theorems and invariance principles are established for a wide class of stationary random fields. We will discuss the invariance principles in detail in Section 4. The next assumption is a geometric-moment contraction (GMC) condition. Assumption A.5. There exist α > 0, C > 0 and 0 < ρ = ρ(α) < 1 such that δ_{j,α} ≤ C ρ^{|j|} for all j ∈ Z^2. Assumption A.5 is the spatial extension of the geometric-moment contraction condition for time series; see [45]. This condition is fulfilled for short-range dependent linear random fields with finite variance, and for a large class of nonlinear random fields such as nonlinearly transformed linear random fields, Volterra fields and nonlinear spatial autoregressive models; see [9] and [7].

Fourier coefficients of spatial processes in Z^2
Denote the sample mean by V̄_T = |T|^{-1} Σ_{t∈T} V(t), and the centered observations by Z(t) = V(t) − V̄_T for t ∈ T. For j ∈ T, let x(j) and y(j) denote the cosine and sine Fourier coefficients of {Z(t)} at the Fourier frequency λ_j = 2π(j_1/d_1, j_2/d_2). Note that the dependence of the Fourier coefficients x(j) and y(j) on T is suppressed for notational simplicity. The basic principle of the proposed bootstrap method is to resample the Fourier coefficients of the observations, and then back-transform them to obtain bootstrap samples in the spatial domain. First, we discuss some structural properties of the Fourier coefficients of spatial processes in Z^2. By using the symmetry property in (2.3), we partition T as T = N ∪ Ñ ∪ M such that the Fourier coefficients defined on Ñ are determined by the Fourier coefficients defined on N. Also, the information about the covariance structure and the mean of the random field is contained in N and M, respectively. Hence, a spatial process can be reconstructed from the Fourier coefficients defined on N and M. The set N is defined separately for the cases where d_1, d_2 are both odd, where exactly one of d_1, d_2 is even, and where both are even. Then, using the symmetry property in (2.3), the subset Ñ of T is defined so that the Fourier coefficients at j ∈ Ñ are completely determined by the Fourier coefficients at j ∈ N. From (2.2), for all c ∈ R, the Fourier coefficients of {V(t) − c : t ∈ T} at j ∈ N are the same. In other words, the Fourier coefficients in N and Ñ are invariant under additive constants and thus contain no information about the mean. In contrast, all of the information about the mean is contained in the Fourier coefficients in M; see (2.4). Table 1 shows some examples to illustrate the partitions of the set T under different scenarios. Table 2 summarizes the values of the Fourier coefficients in M. Since in spatial statistics the main concern is the covariance structure of the random field, we focus on bootstrapping the Fourier coefficients in N. The issue of bootstrapping the spatial mean is deferred to Section 4.2.3.
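As a small illustration, the coefficients and the symmetry that determines Ñ can be computed with a 2-D FFT. This sketch assumes that x(j) and y(j) correspond (up to sign) to the real and imaginary parts of the |T|^{-1/2}-normalized discrete Fourier transform; the paper's exact sign convention may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2 = 5, 6                        # small grid mixing odd and even sizes
V = rng.normal(size=(d1, d2))        # stand-in for the observed field
Z = V - V.mean()                     # centered observations Z(t) = V(t) - V_bar

# Fourier coefficients via the 2-D FFT with |T|^{-1/2} normalization
F = np.fft.fft2(Z) / np.sqrt(d1 * d2)
x, y = F.real, F.imag                # assumed correspondence to x(j), y(j)

# Symmetry property (2.3) for a real field: F(-j mod d) = conj(F(j)),
# so the coefficients on N-tilde are determined by those on N
for j1 in range(d1):
    for j2 in range(d2):
        assert np.isclose(F[-j1 % d1, -j2 % d2], np.conj(F[j1, j2]))

# The coefficient at j = 0 of a centered field vanishes: no mean information
assert abs(F[0, 0]) < 1e-12
```

Running the double loop confirms that roughly half of the coefficients are redundant, which is exactly why only the coefficients on N need to be resampled.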

Table 2
Fourier coefficients in M contain information about the mean.

Kernel spectral density estimation
We consider a kernel spectral density estimator f̂_T that smooths the periodogram over the Fourier frequencies, where I(λ_j) is the periodogram at frequency λ_j. The periodogram can be set to 0 on D since it only contains information about the mean. We impose the following mild regularity assumptions on the kernel function K(·).
Assumption K.4. The quantity K_h(λ) in (2.6) satisfies a uniform Lipschitz condition with some constant L_K > 0, where k(·) is defined in (2.7). From the above representations it is clear that the condition is mild for large T and bounded K(·). By (2.8), if the kernel K(·) is uniformly Lipschitz continuous with compact support, then Assumption K.4 holds for a small enough h_T = (h_{T1}, h_{T2}). For infinite support kernels, if K(·) is bounded and continuously differentiable, then Assumption K.4 also holds. Assumptions K.1 to K.4 hold for many commonly used kernels, such as uniform kernels.
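A minimal version of such an estimator, a Gaussian-kernel smoothed periodogram over the Fourier grid, might look as follows. The function name, the wrapping of frequency distances, and the normalization are illustrative assumptions, not the paper's exact display (2.6).

```python
import numpy as np

def kernel_spectral_density(V, h=(0.1, 0.1)):
    """Smooth the periodogram with a separable Gaussian kernel on the
    Fourier frequency grid (a sketch of a kernel spectral estimator)."""
    d1, d2 = V.shape
    Z = V - V.mean()
    # periodogram I(lambda_j) with the (2*pi)^{-2} |T|^{-1} normalization
    I = np.abs(np.fft.fft2(Z)) ** 2 / (d1 * d2 * (2 * np.pi) ** 2)
    I[0, 0] = 0.0                       # zero out D: mean information only
    lam1 = 2 * np.pi * np.fft.fftfreq(d1)
    lam2 = 2 * np.pi * np.fft.fftfreq(d2)
    # circular frequency distances, wrapped into (-pi, pi]
    D1 = np.angle(np.exp(1j * (lam1[:, None] - lam1[None, :])))
    D2 = np.angle(np.exp(1j * (lam2[:, None] - lam2[None, :])))
    W1 = np.exp(-0.5 * (D1 / h[0]) ** 2)   # Gaussian kernel weights
    W2 = np.exp(-0.5 * (D2 / h[1]) ** 2)
    num = W1 @ I @ W2.T                    # separable smoothing
    den = np.outer(W1.sum(axis=1), W2.sum(axis=1))
    return num / den

rng = np.random.default_rng(1)
f_hat = kernel_spectral_density(rng.normal(size=(32, 32)))
assert f_hat.shape == (32, 32) and float(f_hat.min()) >= 0.0
```

For white noise the estimate is roughly flat near σ^2/(2π)^2, consistent with the spectral density formula in Assumption A.2.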

Frequency domain bootstrap
In this section, we propose three bootstrap schemes in the frequency domain, namely the Residual-Based Bootstrap (RB), the Wild Bootstrap (WB) and the Local Bootstrap (LB), for resampling the Fourier coefficients. Similar bootstrap schemes in the time series context were first proposed by [10], [37], and [17], respectively. A bootstrap procedure that produces resamples of spatial processes is then developed in Section 3.2.

Residual-based bootstrap (RB)
In RB, we first standardize the Fourier coefficients to obtain a set of residuals, which consists of approximately i.i.d. normal random variables. Hence, i.i.d. resampling methods can be applied to yield a resample of Fourier coefficients.
Step 1: Estimate the spectral density f by f̂_T, which satisfies (3.1).
Step 2: For the Fourier coefficients x(j) and y(j), j ∈ N, define the rescaled coefficients s_{j,1} = x(j)/(f̂_T(λ_j))^{1/2} and s_{j,2} = y(j)/(f̂_T(λ_j))^{1/2}. For j ∈ N and k = 1, 2, define the residuals s̃_{j,k} by standardizing s_{j,k} by the empirical mean and standard deviation. Note that the residuals s̃_{j,k} are approximately independent standard normal variables; see Theorem 4.1 of [33].
Step 3: Draw an i.i.d. resample {s̃*_{j,k} : j ∈ N, k = 1, 2} with replacement from the empirical distribution of the residuals {s̃_{j,k} : j ∈ N, k = 1, 2}.
Step 4: Define the bootstrapped Fourier coefficients x*(j) and y*(j), j ∈ N, by rescaling the resampled residuals by (f̂_T(λ_j))^{1/2}.
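The steps above can be sketched as follows; the rescaling by sqrt(f̂_T(λ_j)) stands in for the paper's exact normalization, and the helper name is hypothetical.

```python
import numpy as np

def rb_bootstrap(x, y, f_hat, mask, rng):
    """Residual-based bootstrap (RB) of Fourier coefficients on N, a sketch.
    `mask` is a boolean array flagging the index set N."""
    root_f = np.sqrt(f_hat[mask])
    s = np.concatenate([x[mask] / root_f, y[mask] / root_f])   # Step 2: rescale
    res = (s - s.mean()) / s.std()                 # standardized residuals
    res_star = rng.choice(res, size=s.size, replace=True)      # Step 3: i.i.d. resample
    n = mask.sum()
    x_star = np.zeros_like(x)
    y_star = np.zeros_like(y)
    x_star[mask] = root_f * res_star[:n]                       # Step 4: rescale back
    y_star[mask] = root_f * res_star[n:]
    return x_star, y_star

rng = np.random.default_rng(1)
x = rng.normal(size=(8, 8)); y = rng.normal(size=(8, 8))
f_hat = np.ones((8, 8))
mask = np.ones((8, 8), dtype=bool); mask[0, 0] = False   # exclude mean frequency
x_star, y_star = rb_bootstrap(x, y, f_hat, mask, rng)
assert x_star.shape == (8, 8) and x_star[0, 0] == 0.0
```

In a full implementation `mask` would encode the set N from Section 2.2 rather than simply excluding the zero frequency.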

Wild bootstrap (WB)
Compared to RB, the WB further exploits the asymptotic normality of the Fourier coefficients by generating independent standard normal random variables instead of resampling the residuals.
Step 1: Estimate the spectral density f by f T , which satisfies (3.1).
Step 2: Define the bootstrapped Fourier coefficients by x*(j) = (f̂_T(λ_j))^{1/2} G_{j,1} and y*(j) = (f̂_T(λ_j))^{1/2} G_{j,2} for j ∈ N, where {G_{j,k} : j ∈ N, k = 1, 2} are independent standard normal random variables. For RB and WB, conditions under which kernel spectral density estimators satisfy (3.1) can be found in [33] for a large class of random fields.
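In code, WB reduces to drawing normals and rescaling; the scale sqrt(f̂_T(λ_j)) is again a stand-in for the exact normalization in the paper's display.

```python
import numpy as np

def wb_bootstrap(f_hat, mask, rng):
    """Wild bootstrap (WB): bootstrapped coefficients are independent
    N(0,1) draws G_{j,k} rescaled by the spectral estimate (a sketch)."""
    m = mask.sum()
    x_star = np.zeros_like(f_hat)
    y_star = np.zeros_like(f_hat)
    x_star[mask] = np.sqrt(f_hat[mask]) * rng.standard_normal(m)
    y_star[mask] = np.sqrt(f_hat[mask]) * rng.standard_normal(m)
    return x_star, y_star

rng = np.random.default_rng(2)
f_hat = np.full((16, 16), 0.5)
mask = np.ones((16, 16), dtype=bool); mask[0, 0] = False
x_star, y_star = wb_bootstrap(f_hat, mask, rng)
# conditional variance tracks the spectral estimate: Var* x*(j) = f_hat(j)
assert abs(x_star[mask].var() - 0.5) < 0.2
```

Because no residuals are resampled, WB needs only the spectral density estimate and a stream of standard normals, which is why it simulates 2|N| random variables per resample (Remark 3.1).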

Local bootstrap (LB)
In contrast to RB and WB, LB does not require any spectral density estimation. Instead, LB exploits the smoothness of the spectral density, which ensures that within a neighborhood of each frequency the distributions of the Fourier coefficients are nearly identical. Therefore, replicates of the Fourier coefficients can be produced by directly resampling the Fourier coefficients within such neighborhoods.
Step 1: Select a symmetric, nonnegative kernel K(·) that satisfies Assumptions K.1 to K.4.
Step 2: For each j ∈ N, define resampling probabilities over the neighboring Fourier frequencies of λ_j, proportional to the kernel weights.
Step 3: For j ∈ N, define the uncentered bootstrapped Fourier coefficients by resampling the Fourier coefficients within a neighborhood of j according to these probabilities.
Step 4: Define the centered bootstrapped Fourier coefficients by subtracting from the uncentered bootstrapped coefficients their conditional expectations given the data.
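A sketch in which each coefficient on N is replaced by one drawn from a Gaussian-kernel-weighted neighborhood; the weight construction and the crude global centering are illustrative simplifications of Steps 2 to 4.

```python
import numpy as np

def lb_bootstrap(x, y, lam, idx, h, rng):
    """Local bootstrap (LB) over flattened frequency arrays: `lam` is an
    (m, 2) array of Fourier frequencies, `idx` the positions in N."""
    x_star = np.zeros_like(x)
    y_star = np.zeros_like(y)
    for j in idx:
        diff = np.angle(np.exp(1j * (lam[idx] - lam[j])))  # wrap to (-pi, pi]
        w = np.exp(-0.5 * ((diff / np.asarray(h)) ** 2).sum(axis=1))
        k = rng.choice(idx, p=w / w.sum())                 # resample near lambda_j
        x_star[j], y_star[j] = x[k], y[k]
    # crude stand-in for the Step 4 centering by conditional expectations
    x_star[idx] -= x_star[idx].mean()
    y_star[idx] -= y_star[idx].mean()
    return x_star, y_star

rng = np.random.default_rng(3)
d = 8
lam = np.array([(2 * np.pi * a / d, 2 * np.pi * b / d)
                for a in range(d) for b in range(d)])
x = rng.normal(size=d * d); y = rng.normal(size=d * d)
idx = np.arange(1, d * d)                                  # exclude j = 0
x_star, y_star = lb_bootstrap(x, y, lam, idx, (0.5, 0.5), rng)
assert abs(x_star[idx].mean()) < 1e-12 and x_star[0] == 0.0
```

Note that LB draws one neighbor index per coefficient, so it consumes more random variates than RB or WB, matching the 4|N| count in Remark 3.1.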

Bootstrap procedure for spatial processes
With the three bootstrap schemes for the Fourier coefficients, we develop the bootstrap procedure for resampling spatial processes as follows:
Step 1: Compute the Fourier coefficients x(j), y(j) for j ∈ T using the Fast Fourier Transform (FFT).
Step 2: Partition T into N, Ñ and M as in Section 2.2.
Step 3: Generate bootstrapped Fourier coefficients on N using one of the schemes RB, WB or LB.
Step 4: Set the bootstrapped Fourier coefficients on Ñ according to the symmetry property in (2.3), and set the coefficients on M to zero.
Step 5: Use the inverse FFT algorithm to transform the bootstrapped Fourier coefficients back to the spatial domain.
The resulting bootstrap spatial process {Z*(t) : t ∈ T} is real-valued and centered, and can be used for inference on a large class of statistics that are based on partial sums of the centered process {Z(t)}; see Section 6 for examples. Note that since the Fourier coefficients in N can also be uniquely determined by the Fourier coefficients in Ñ, it would be technically the same if we interchanged the roles of Ñ and N in the above bootstrap procedure. That is, we can first obtain a bootstrap sample on Ñ instead of N in Step 3, and then set the bootstrapped Fourier coefficients on N instead of Ñ in Step 4 using the rotational symmetry in (2.3).
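Putting the steps together, a self-contained sketch of one resample (WB flavour, with a flat spectral estimate for brevity; the lexicographic rule choosing which member of each conjugate pair plays the role of N is an illustrative choice, not the paper's partition):

```python
import numpy as np

def fdb_resample(V, rng):
    """One frequency domain bootstrap resample: FFT, resample the
    coefficients, impose the conjugate symmetry, zero the mean
    frequency, inverse FFT (a sketch of Steps 1-5)."""
    d1, d2 = V.shape
    Z = V - V.mean()
    F = np.fft.fft2(Z) / np.sqrt(d1 * d2)            # Step 1: coefficients
    amp = np.sqrt(np.mean(np.abs(F) ** 2) / 2)       # crude flat scale
    F_star = amp * (rng.standard_normal((d1, d2))
                    + 1j * rng.standard_normal((d1, d2)))  # Steps 2-3 (WB-style)
    # Step 4: enforce F*(-j) = conj(F*(j)) so the resample is real
    for j1 in range(d1):
        for j2 in range(d2):
            k1, k2 = -j1 % d1, -j2 % d2
            if (k1, k2) > (j1, j2):                  # conjugate partner
                F_star[k1, k2] = np.conj(F_star[j1, j2])
            elif (k1, k2) == (j1, j2):               # self-conjugate frequency
                F_star[j1, j2] = F_star[j1, j2].real
    F_star[0, 0] = 0.0                               # mean frequency zeroed
    Z_star = np.fft.ifft2(F_star) * np.sqrt(d1 * d2)  # Step 5: back-transform
    assert np.allclose(Z_star.imag, 0, atol=1e-8)
    return Z_star.real

rng = np.random.default_rng(2)
Z_star = fdb_resample(rng.normal(size=(16, 16)), rng)
assert abs(Z_star.mean()) < 1e-10                    # resample is centered
```

Zeroing the coefficient at the origin is what makes {Z*(t)} exactly centered, matching the statement that the resulting process is real-valued and centered.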

Remark 3.1.
To compare the computational cost of the proposed frequency domain bootstrap methods with that of the existing spatial block bootstrap method, we first examine the number of random variables that must be simulated to generate one bootstrap spatial resample. The classical block bootstrap with block size m_1 × m_2 typically requires simulating (⌊d_1/m_1⌋ + 1)(⌊d_2/m_2⌋ + 1) random variables to generate one bootstrap spatial resample, which is of order O(|T|/(m_1 m_2)). On the other hand, the proposed frequency domain bootstrap methods need to simulate 2|N| random variables for RB and WB, and 4|N| random variables for LB, to generate one bootstrap spatial resample, which is of order O(|T|). Hence, the cost of generating resamples is smaller for the classical block bootstrap than for the proposed methods as the block size m_1 × m_2 diverges. However, in practice the computational complexity of evaluating the test statistics under consideration is O(|T|), and hence the computational complexity of conducting bootstrap inference, for example constructing bootstrap confidence intervals, is O(B|T|), where B is the number of bootstrap replications. Thus, the computational costs of both methods are essentially of the same order O(B|T|).

Main results
In this section, we first review the invariance principles of the partial sum process of a random field. Then, we present the main results of the paper: the invariance principles of the partial sum process of the bootstrap sample (Theorem 4.3), and the validity of the bootstrap methods (Corollaries 4.4 and 4.5).

Invariance principles for random fields
To facilitate applications to different situations, we consider the following collection of Borel subsets of [0, 1]^2 as the index set of the partial sum process: Q_2 = {[a_1, b_1] × [a_2, b_2] : 0 ≤ a_k ≤ b_k ≤ 1, k = 1, 2}. The class Q_2 is a classical example of Vapnik-Chervonenkis classes (VC-classes), with VC-index equal to 5; see Section 2.6 of [48].

We equip the class Q_2 with a pseudo-metric ρ(·, ·), and let g_T denote the linear mapping that translates and rescales [0, 1]^2 to E.

The following lemmas from [9] give the invariance principles of the partial sum processes of the random field.

Invariance principles for bootstrap samples
This section contains the main result: the invariance principles for the partial sum process of the bootstrap sample. This result implies the validity of the bootstrap methods whenever invariance principles drive the asymptotics of the underlying test statistics. To facilitate further extensions to bootstrap schemes other than RB, WB and LB, the results are formulated in a general way under some meta-assumptions on the resampling scheme. In Section 5, we verify the meta-assumptions for the RB, WB and LB schemes.

Assumptions on the bootstrapped Fourier coefficients
Denote by L*, E*, Var*, Cov*, and P* the bootstrap distribution, expectation, variance, covariance and probability, conditional on the data, respectively. Moreover, let {· | V(·)} denote conditioning on the data.
Assumption B.2. Uniform convergence of the variances of the bootstrapped Fourier coefficients.
Assumption B.3. There exists some p > 8 such that the p-th moments of the bootstrapped Fourier coefficients are uniformly bounded.
The Mallows distance on the space of all real Borel probability measures with finite variance is given by d(L_1, L_2) = inf (E|X_1 − X_2|^2)^{1/2}, where the infimum is taken over all pairs of random variables X_1 and X_2 with marginal distributions L_1 and L_2, respectively. Convergence in Mallows distance implies convergence in distribution and convergence of the second moment; see [31].
Assumption B.4. The probability distributions of the bootstrapped Fourier coefficients converge uniformly in the Mallows distance to the same limit as those of the Fourier coefficients.
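For two empirical distributions on R with equal sample sizes, the Mallows distance has a closed form: on the real line the optimal coupling pairs order statistics. A quick numerical illustration (not part of the paper's formal development):

```python
import numpy as np

def mallows_d2(a, b):
    """Mallows / Wasserstein-2 distance between two empirical
    distributions of equal size: pair the sorted samples."""
    a, b = np.sort(a), np.sort(b)
    return float(np.sqrt(np.mean((a - b) ** 2)))

rng = np.random.default_rng(4)
a = rng.normal(size=4096)
assert mallows_d2(a, a) == 0.0
# a location shift of c moves a distribution by exactly c in this metric
assert np.isclose(mallows_d2(a, a + 1.0), 1.0)
```

The second assertion reflects the fact that the distance controls both location and spread, which is why convergence in Mallows distance implies convergence of second moments.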

Asymptotic results on bootstrap samples
The following lemma (Lemma 4.2) asserts that the bootstrap sample {Z*(·)} and the corresponding partial sum process have the correct auto-covariance structure; in particular, if Assumptions A.2 and B.2 hold, then for any fixed l_1, l_2 ∈ T the bootstrap auto-covariances converge to those of the underlying field. For any rectangular region T, denote by {S*_T(A) : A ∈ Q_2} the Q_2-indexed partial sum process of the bootstrap sample. The following theorem establishes the invariance principles for the Q_2-indexed partial sum processes of the bootstrap samples.

Theorem 4.3. Suppose that Assumptions A.2, A.4(p) and A.5 on the random field, and Assumptions B.1 to B.4 on the resampling scheme, hold. Then the Q_2-indexed partial sum process of the bootstrap sample satisfies the same invariance principle as that of the original field.
The following corollary states that the bootstrap samples preserve the second-order dependence structure of the random field asymptotically. In particular, if the underlying random field is Gaussian, then the proposed bootstrap procedure produces an asymptotically valid approximation of the centered random field.

Bootstrapping the mean
The bootstrap sample {Z*(t) : t ∈ T} obtained from Section 3.2 is real-valued and centered. In order to obtain a non-centered bootstrap process, we may employ a separate bootstrap procedure, independent of {Z*(·)}, to acquire a bootstrapped mean μ*_T. For details on bootstrapping the mean μ*_T, see [16], [42], [46], [26], and [35]. Then, the non-centered bootstrap process V*(·) = Z*(·) + μ*_T gives a bootstrap approximation of V(·). Note that {Z*(·)} contains information about the covariance structure of the spatial process and μ*_T contains information about the mean level. The following corollary shows that the non-centered bootstrap sample V*(·) has the same asymptotic behavior as the original spatial process in terms of the partial sum process.
where Φ(·) denotes the standard normal distribution function. Then, the stated convergence holds in probability. Possible generalizations of the asymptotic results in Section 4 to other VC-classes with VC-index V as index sets can be established under moment conditions with p > 2(V − 1); see Theorem 2(i) of [9] for invariance principles indexed by VC-classes. For example, the class of sets {[0, t_1] × [0, t_2] : 0 ≤ t_1, t_2 ≤ 1} is a classical example of VC-classes, with VC-index equal to 3, and the above asymptotic results hold when the moment condition with p > 4 in Assumptions A.4(p) and B.3 holds.
Remark 4.1. From Theorem 4.3 and Corollary 4.5, the proposed frequency domain bootstrap method can mimic the second-order dependence structure of the random field asymptotically. Hence, the proposed method is applicable to statistics of interest whose asymptotics depend on the second-order dependence structure. In Section 6, we discuss applications of the proposed bootstrap method to the construction of confidence intervals for correlograms of linear random fields, testing for signal presence in random fields, and testing for spatial isotropy of Gaussian random fields. The validity of the proposed bootstrap method in each of these applications is also theoretically investigated.

Validity of meta-assumptions for the resampling schemes RB, WB, and LB
In this section, we prove the validity of the bootstrap schemes RB, WB, and LB under some conditions on the spatial processes. We also give conditions under which the bootstrap schemes remain valid when the bootstrap methods are applied to an estimated field {V̂(t) : t ∈ T} rather than the observed field {V(t) : t ∈ T}.

Example 5.1 (Linear random fields). A linear random field driven by an i.i.d. random field {ε_t : t ∈ Z^2}, i.e., V(j) = Σ_{s∈Z^2} a_s ε_{j−s} with |a_j| ≤ C ρ^{|j|} for some ρ ∈ (0, 1) and C > 0, satisfies Assumptions A.4(p) with p > 8 and A.5 with E|V(0)|^{16} < ∞.

Example 5.2 (Volterra fields). Volterra fields are a class of nonlinear random fields which plays an important role in nonlinear system theory. Let {ε_t}_{t∈Z^2} be an i.i.d. random field with E(|ε_0|^p) < ∞ for some p ≥ 32. Consider the second order Volterra process V(t) = Σ_{s_1,s_2∈Z^2} a_{s_1,s_2} ε_{t−s_1} ε_{t−s_2}, t ∈ Z^2, where {a_{s_1,s_2}} are real coefficients with a_{s_1,s_2} = 0 if s_1 = s_2. Then, by the Rosenthal inequality, there exists a constant C_p > 0 such that δ_{j,p} ≤ C_p (A_j^{1/2} + B_j^{1/p}), where A_j = Σ_{s_1,s_2∈Z^2} (a^2_{s_1,j} + a^2_{j,s_2}) and B_j = Σ_{s_1,s_2∈Z^2} (|a_{s_1,j}|^p + |a_{j,s_2}|^p). Thus, if a_{s_1,s_2} = O(ρ^{max{|s_1|, |s_2|}}) for some ρ ∈ (0, 1), then δ_{j,p} = O(ρ^{|j|}), and Assumptions A.4(p) with p > 8 and A.5 with E|V(0)|^{16} < ∞ hold.
In many applications, the bootstrap methods are not applied directly to stationary spatial data {V(t)}, but to an estimate {V̂(t)} obtained from spatial data {Y(t)}; see, for example, the test for signal presence using scan statistics in Section 6.2. The following corollary gives conditions for the validity of the bootstrap schemes in this situation.
Remark 5.1. For random fields exhibiting complex non-linear trends, the proposed bootstrap procedure requires some modifications. Specifically, assume that the underlying random field {Y(t)} can be modeled by Y(t) = μ(t) + V(t) for t ∈ T, where μ(t) is a non-linear trend, and V(t) is a zero-mean random field which satisfies the conditions stated in Section 2. From Corollary 5.2, the proposed bootstrap method remains valid for the field {V̂(t)} estimated from the spatial data {Y(t)} under some conditions on the decay rate α_T of the average squared error. Hence, to apply the proposed bootstrap method, we can proceed as follows. First, we apply local smoothing or kernel methods to estimate the trend μ̂(t), and then an estimated field {V̂(t)} can be obtained by V̂(t) = Y(t) − μ̂(t). The proposed bootstrap method can be applied to {V̂(t)} to obtain a centered bootstrap sample {Z*(t)}; a separate bootstrap procedure independent of {Z*(t)} can then be employed to acquire a bootstrapped mean μ*_T, giving a non-centered bootstrap field V*(t) = Z*(t) + μ*_T as illustrated in Section 4.2.3. Finally, a bootstrapped sample Y*(t) = μ̂(t) + V*(t) can be obtained.
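As a sketch of this detrending step, a simple w × w moving-average trend estimate with edge truncation can stand in for the local smoothing or kernel methods mentioned above; the function name and window rule are illustrative.

```python
import numpy as np

def local_mean_detrend(Y, w):
    """Estimate a smooth trend mu_hat(t) by a w x w moving average
    (truncated at the edges) and return V_hat(t) = Y(t) - mu_hat(t)."""
    d1, d2 = Y.shape
    S = np.zeros((d1 + 1, d2 + 1))
    S[1:, 1:] = Y.cumsum(axis=0).cumsum(axis=1)   # 2-D prefix sums
    r = w // 2
    mu = np.empty_like(Y, dtype=float)
    for i in range(d1):
        for j in range(d2):
            i0, i1 = max(0, i - r), min(d1, i + r + 1)
            j0, j1 = max(0, j - r), min(d2, j + r + 1)
            tot = S[i1, j1] - S[i0, j1] - S[i1, j0] + S[i0, j0]
            mu[i, j] = tot / ((i1 - i0) * (j1 - j0))  # local window mean
    return Y - mu

# a constant field is pure trend, so the detrended field is zero
assert np.allclose(local_mean_detrend(np.ones((12, 12)), 5), 0.0)
```

The bootstrap pipeline of the remark then applies the frequency domain bootstrap to the returned field and adds back the trend estimate and a bootstrapped mean.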

Remark 5.2.
To implement the proposed frequency domain bootstrap methods, we need to specify one tuning parameter, the bandwidth h_T = (h_{T1}, h_{T2}) ∈ R^2. From Theorem 5.1, the bandwidths have to satisfy some decay rate conditions. To be precise, we require |h_T| = O(|T|^{-η}) for some 0 < η < 1/2 for RB and WB, and |h_T| → 0 and (|h_T|^4 |T|)^{-1} → 0 for LB. For example, |h_T| = O(|T|^{-1/5}) works for all three methods. To provide a more precise guideline for the choice of h_T, in Section 6.1 we first conduct a sensitivity analysis over a wide range of bandwidths, and then select the one with the best coverage of the confidence intervals. As the bandwidths in RB and WB are used for kernel spectral density estimation, we can also employ the adaptive bandwidth selection proposed in [39] or [19]. Although no theoretically supported optimal bandwidth selection method is available for LB, the bandwidth obtained from the adaptive bandwidth selections for RB and WB can be shown to satisfy the required conditions for LB asymptotically. In Section 6, we apply the same range of bandwidths in RB, WB and LB, and similar results occurred for all three methods. This indicates that bandwidths suitable for RB and WB may also be appropriate for LB. For more discussion of bandwidth selection for the local bootstrap in the time series context, see [37].

Applications and simulation studies
In this section, we demonstrate applications of the proposed bootstrap procedures to constructing confidence intervals for correlograms of linear random fields, testing for signal presence using scan statistics, and testing for spatial isotropy of Gaussian random fields. We also perform numerical studies to compare the proposed bootstrap methods with existing methods, including the spatial block bootstrap and spatial subsampling methods. Unless specified otherwise, in all of the simulation experiments we consider random fields {V(·)} on a 50 × 50 region T, i.e., d_1 = 50, d_2 = 50 and |T| = 2500. In addition, the number of bootstrap samples is set to 1000. Moreover, the Gaussian kernel K(λ) = φ(λ) is employed for the kernel spectral density estimation in RB and WB, and as the smoothing function in LB, where φ(·) is the bivariate standard normal density function. For the spatial block bootstrap, we employ the overlapping block bootstrap ([26]). For the spatial subsampling method, we use the overlapping subblocks subsampling ([15]) with the suggested subblock size.

Construction of confidence intervals for correlograms of linear random fields
One major application of the frequency domain bootstrap in linear time series is to ratio statistics such as sample autocorrelation functions; see [5] and [21]. Analogously, for spatial statistics, the proposed frequency domain bootstrap method can be applied to construct confidence intervals for the spatial correlograms of linear random fields. Consider a random field {V(t) : t ∈ Z^2} with covariance function C(h). The correlogram at lag t is defined as the ratio statistic ρ(t) = C(t)/C(0). A natural estimator of the correlogram is given by ρ̂(t) = Ĉ(t)/Ĉ(0), where Ĉ(t) = |T|^{-1} Σ_{s∈T(t)} (V(s) − V̄_T)(V(s + t) − V̄_T), with V̄_T = |T|^{-1} Σ_{s∈T} V(s) and T(t) = {s : s, s + t ∈ T}, is the method-of-moments estimator of the covariogram at lag t. To apply the proposed bootstrap methods, generate B bootstrap samples from either RB, WB or LB. For the i-th bootstrap sample, we compute the correlogram estimator ρ̂*(i)(t). Then, confidence intervals can be constructed from the sample quantiles of the correlogram estimates of the resamples.
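The method-of-moments estimator described above is straightforward to compute directly; the sketch below assumes non-negative lags for brevity.

```python
import numpy as np

def correlogram(V, t):
    """Method-of-moments correlogram rho_hat(t) = C_hat(t)/C_hat(0) on a
    rectangular grid, with the |T|^{-1} normalization (t1, t2 >= 0)."""
    d1, d2 = V.shape
    Z = V - V.mean()
    t1, t2 = t
    # sum over T(t) = {s : s, s + t in T}, normalized by |T|
    num = np.sum(Z[:d1 - t1, :d2 - t2] * Z[t1:, t2:]) / (d1 * d2)
    return num / np.mean(Z ** 2)

rng = np.random.default_rng(3)
V = rng.normal(size=(50, 50))
assert np.isclose(correlogram(V, (0, 0)), 1.0)
assert -1.0 <= correlogram(V, (1, 0)) <= 1.0
```

Bootstrap confidence intervals then follow by applying `correlogram` to each of the B resamples and taking empirical quantiles of the resulting estimates. The |T|^{-1} (rather than |T(t)|^{-1}) normalization keeps the estimated correlogram in [-1, 1].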
To illustrate the construction of bootstrap confidence intervals for correlograms, consider real-valued mean-zero Gaussian random fields on Z^2 with a Gaussian covariance function C(h) = η 1{h = 0} + σ^2 exp(−|h|^2/φ^2), where σ^2 is the partial sill parameter, φ is the range parameter and η is the nugget effect. First, we consider the model (η, σ^2, φ) = (0, 1, 1) to investigate the choice of the bandwidth parameter h_T for the RB, WB and LB schemes, and the choice of block size for the Block Bootstrap (BB). For each resample of the data, we compute the correlogram estimates ρ̂(t) at a range of lags: t = (1, 0), (0, 1), (1, 1), (2, 0) and (0, 2). Then, for each lag, a 95% confidence interval is constructed from the sample quantiles of the correlogram estimates of the resamples. The above procedure is repeated 1000 times to investigate the coverage accuracy of the confidence intervals. The results are summarized in Figure 1. It can be seen that the bandwidths h_T = (0.05, 0.05), (0.11, 0.11), and (0.15, 0.15) give good performance for all of the proposed RB, WB, and LB schemes. For BB, block sizes 4 × 4, 7 × 7, and 13 × 13 are recommended.
Next, we consider the models (η, σ^2, φ) = (0, 1, 0.5), (0, 1, 1), (1, 1, 0.5), and (1, 1, 1) to explore the effect of various decay rates of spatial dependence and the presence of the nugget effect. For each model, a 95% confidence interval is constructed for each lag based on the sample quantiles of the bootstrapped correlogram estimates. Again, 1000 replications are performed to investigate the coverage accuracy of the confidence intervals. The results are summarized in Tables 3 and 4. The coverage accuracy of the proposed bootstrap methods is much closer to the nominal level of 95% than that of the block bootstrap method. Moreover, the coverage accuracy of the block bootstrap method is unstable, ranging from 50% to nearly 100% across the various models.

Testing for signal presence in random fields
In this subsection, we consider the problem of detecting a deterministic signal against a noisy background. This problem has received considerable attention and has profound applications in epidemiology, astronomy, and biosurveillance. The standard statistical tool for this problem is the spatial scan statistic; see [24], [13], [14], [12], and [4]. Consider the observations {Y(t) : t ∈ T} given by Y(t) = s 1{t ∈ I_T} + V(t), where T ⊂ Z^2 is a rectangular region, {V(t) : t ∈ T} is a zero-mean process, and I_T ⊂ T is the location of a deterministic signal with magnitude s. We assume that I_T is sufficiently large in the sense that it contains g_T(C_{I_T}), where C_{I_T} ⊂ [0, 1]^2 is a circle with radius r_{I_T} > 0, and g_T is the linear mapping defined in Section 4.1. Let Z_T = {g_T(A) : A ∈ Q_2} be the collection of all possible rectangular subsets of T, and μ_A = |A|^{-1} Σ_{t∈A} E(Y(t)). To determine whether a signal exists, we consider the hypotheses H_0: μ_A = μ_0 for all A ∈ Z_T, against H_1: μ_A ≠ μ_0 for some A ∈ Z_T, where μ_0 ∈ R. To test between H_0 and H_1, the scan statistic SS_T, defined as the maximum over A ∈ Z_T of a suitably standardized local average of the observations on A, is a natural candidate; see Theorem 1 of [4] in the time series context, and [23] and [49] in the spatial context. In particular, H_0 should be rejected for large SS_T.
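The maximum over all fixed-size windows can be computed in O(|T|) time with 2-D prefix sums. The standardization |A|^{1/2} |Ȳ_A − Ȳ| used below is an illustrative choice, not necessarily the paper's exact definition of SS_T.

```python
import numpy as np

def scan_statistic(Y, a1, a2):
    """Max over all a1 x a2 windows A of |A|^{1/2} |mean_A(Y) - mean(Y)|,
    computed with 2-D cumulative sums and inclusion-exclusion."""
    Z = Y - Y.mean()
    S = np.zeros((Z.shape[0] + 1, Z.shape[1] + 1))
    S[1:, 1:] = Z.cumsum(axis=0).cumsum(axis=1)     # prefix sums
    win = S[a1:, a2:] - S[:-a1, a2:] - S[a1:, :-a2] + S[:-a1, :-a2]
    return np.abs(win).max() / np.sqrt(a1 * a2)

rng = np.random.default_rng(5)
noise = rng.normal(size=(50, 50))
Y = noise.copy()
Y[21:29, 21:29] += 3.0               # 8 x 8 signal of magnitude s = 3
assert scan_statistic(Y, 10, 10) > scan_statistic(noise, 10, 10)
```

This mirrors the simulation setup below, where Z_T is restricted to 10 × 10 rectangles and the signal occupies an 8 × 8 block at the center of T.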
To determine the critical value of the test statistic SS_T, the bootstrap methods can be applied to the locally demeaned spatial data. The following theorem asserts that, by using the locally demeaned data, the frequency domain bootstrap methods asymptotically yield the null distribution even in the presence of a signal.
where P 0 is the probability measure under H 0 with s = 0, and P * is the conditional probability measure given {Y (t) : t ∈ T } using any bootstrap method satisfying Assumptions B.1 to B.4.
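The local demeaning step above can be sketched as follows. The exact definition used in the paper is not reproduced in the text, so the form below, subtracting the average over a w-by-w window centered at each site and clipped at the grid boundary, is an assumption.

```python
import numpy as np

def local_demean(Y, w):
    """Subtract from each site the average over the w-by-w window centered
    at it (windows are clipped at the grid boundary). One plausible form of
    the paper's local demeaning with window width w_T = w."""
    n, m = Y.shape
    S = np.zeros((n + 1, m + 1))
    S[1:, 1:] = Y.cumsum(axis=0).cumsum(axis=1)   # integral image
    h = w // 2
    out = np.empty_like(Y)
    for i in range(n):
        for j in range(m):
            i0, i1 = max(0, i - h), min(n, i + h + 1)
            j0, j1 = max(0, j - h), min(m, j + h + 1)
            box_sum = S[i1, j1] - S[i0, j1] - S[i1, j0] + S[i0, j0]
            out[i, j] = Y[i, j] - box_sum / ((i1 - i0) * (j1 - j0))
    return out

rng = np.random.default_rng(2)
Y = rng.standard_normal((50, 50)) + 5.0   # noise around a constant level of 5
Yd = local_demean(Y, w=10)                # constant level is removed locally
```

Because each window's mean includes the constant level exactly, any constant (or slowly varying) mean component is removed while the local fluctuations are retained.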
The following theorem proves the consistency of the bootstrap test.
where P_{1,s} is the probability measure under H_1 with signal magnitude s, and c* is the critical value determined by any bootstrap method satisfying Assumptions B.1 to B.4, i.e., P*(SS*_T ≥ c*) = α for significance level α. The following simulation experiments evaluate the finite sample performance of the bootstrap methods for testing for the presence of a spatial signal. We generate real-valued zero-mean non-Gaussian random fields V(·) using a point-wise transformation of homogeneous Gaussian random fields. First, we generate real-valued mean-zero Gaussian random fields X(·) on a 50 × 50 region T using the model with (η, σ^2, φ) = (1, 1, 1). Then, for each t ∈ T, we transform X(t) to a non-Gaussian V(t) = F_R^{-1}(F_N(X(t))), where F_N is the cumulative distribution function of the standard normal distribution and F_R^{-1} is the inverse distribution function of a centered distribution R. In our simulation, the Student's t(20) distribution is used. The signal location I_T is taken as an 8 × 8 square grid at the center of T. Let Z_T = {g_T(A) : A ∈ Q^2} be the collection of candidate rectangular regions of T. For simplicity, let Z_T contain all rectangular regions A ⊂ T with a fixed size of 10 × 10.
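The point-wise transformation V(t) = F_R^{-1}(F_N(X(t))) changes the marginal distribution while preserving the spatial index and the ranks of the field values. The paper uses R = Student's t(20); for a self-contained sketch we take R to be the (centered) standard Laplace distribution instead, whose quantile function has a closed form.

```python
import math
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((50, 50))   # stand-in for a correlated Gaussian field

# Standard normal CDF F_N via the error function.
F_N = np.vectorize(lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))

def laplace_ppf(u):
    """Quantile function F_R^{-1} of the standard Laplace distribution
    (used here in place of the paper's t(20) for a closed-form sketch)."""
    u = np.asarray(u, dtype=float)
    return -np.sign(u - 0.5) * np.log(1.0 - 2.0 * np.abs(u - 0.5))

V = laplace_ppf(F_N(X))             # non-Gaussian field, same spatial layout
```

Since F_R^{-1} ∘ F_N is strictly increasing, the transformation is monotone: the ordering of the field values is unchanged while the marginal becomes heavier-tailed.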
We investigate the size and power of the bootstrap test for signal detection under different signal magnitudes: s = −3, −2, −1, 0, 1, 2, and 3. The window width w_T = 10 is used to compute the locally demeaned data. We compute the scan statistic for each bootstrap sample and obtain the critical values from the quantiles of the bootstrapped scan statistics. Table 5 reports the rejection rates of the test using the block bootstrap, RB, WB, and LB under various values of s. Different block sizes, 4 × 4, 7 × 7, and 13 × 13, are used for the block bootstrap, and a wide range of bandwidths, h_T = (0.05, 0.05), (0.1, 0.1), (0.15, 0.15), (0.2, 0.2), and (0.25, 0.25), is used for RB, WB, and LB to evaluate the effect of bandwidth selection on the performance. It can be seen that the performance of RB, WB, and LB is superior to that of the block bootstrap, and is robust to the choice of the bandwidth h_T. One possible reason for the good performance of the frequency domain bootstrap methods is that the frequency domain bootstrap samples have a constant mean of zero; see Lemma 4.2. On the other hand, even though the locally demeaned data are used, the block bootstrap samples may still occasionally contain regions that deviate substantially from zero, which affects the performance of the test. From Table 5, the effect of this phenomenon is magnified as the block size increases.
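The critical-value step above can be sketched as follows: the (1 − α) sample quantile of the bootstrapped scan statistics serves as c*. Here `ss_boot` is simulated as a stand-in rather than produced by the actual resampling schemes.

```python
import numpy as np

def bootstrap_critical_value(ss_boot, alpha=0.05):
    """Critical value c* with P*(SS* >= c*) approximately alpha, taken as
    the (1 - alpha) sample quantile of the bootstrapped scan statistics."""
    return float(np.quantile(np.asarray(ss_boot), 1.0 - alpha))

rng = np.random.default_rng(6)
ss_boot = rng.standard_normal(1000) ** 2   # stand-in bootstrapped scan statistics
c_star = bootstrap_critical_value(ss_boot, alpha=0.05)
```

By construction, about a fraction alpha of the bootstrap replicates exceed c*, so rejecting when the observed statistic exceeds c* gives a level-alpha bootstrap test.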

Spatial isotropy test for Gaussian random fields
In this subsection, we study the application of the proposed bootstrap methods to testing for spatial isotropy of Gaussian random fields, i.e., whether the covariance between two sites depends only on their distance and not on direction. Since the asymptotic distributions of spatial covariances and variograms depend on the fourth-order structure, the Gaussian assumption is needed in this application. Since it is difficult to exhaust all possible distances and directions, [15] considered the null hypothesis of isotropy H_0: 2γ(t_i) = 2γ(t_j) for all t_i, t_j ∈ Λ with ‖t_i‖ = ‖t_j‖ and t_i ≠ t_j, where ‖t‖ = √(t′t), Λ = {t_1, . . . , t_m} is a prespecified set of lags, and 2γ(t) = E(V(0) − V(t))^2 is the variogram at lag t. Let G = (2γ(t_1), . . . , 2γ(t_m))′ be the vector of variograms over Λ. Observe that, under H_0, there exists a full row rank matrix A such that AG = 0. For example, if Λ = {(1, 0), (0, 1)}, then G = (2γ(1, 0), 2γ(0, 1))′, and we may set A = [1, −1]. Based on this observation, [15] derived the test statistic (6.3), where Ĝ_T = (2γ̂(t_1), . . . , 2γ̂(t_m))′ is the sample variogram vector that estimates G, 2γ̂(t) = |T(t)|^{-1} Σ_{s∈T(t)} (V(s + t) − V(s))^2 (6.4) is the estimator of the variogram at lag t, T(t) = {s : s, s + t ∈ T}, and Σ̂_R is a consistent estimator of Σ_R, the covariance matrix of the sample variogram vector Ĝ_T. Under H_0 and some regularity conditions, Theorem 1 of [15] states that the test statistic converges in distribution to a chi-squared distribution with d degrees of freedom, where d is the row rank of A. However, the convergence of the test statistic appears to be slow. Therefore, [15] considered a subsampling method to determine the p-value of the test. In the following, we consider using the proposed frequency domain bootstrap methods to determine the p-value of the test.
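The moment estimator of the variogram entering Ĝ_T can be sketched as follows, assuming the standard estimator 2γ̂(t) = |T(t)|^{-1} Σ_{s∈T(t)} (V(s + t) − V(s))^2; variable names are illustrative.

```python
import numpy as np

def empirical_variogram(V, lag):
    """Moment estimator 2*gamma_hat(h): the average of (V(s+h) - V(s))^2
    over all pairs of grid sites separated by the lag h = (h1, h2),
    with h1, h2 >= 0."""
    h1, h2 = lag
    n, m = V.shape
    diff = V[h1:, h2:] - V[: n - h1, : m - h2]
    return float(np.mean(diff ** 2))

rng = np.random.default_rng(4)
V = rng.standard_normal((100, 100))     # white noise: 2*gamma(h) = 2 for h != 0
g10 = empirical_variogram(V, (1, 0))
g01 = empirical_variogram(V, (0, 1))
```

For white noise both lag-(1,0) and lag-(0,1) variograms estimate the same value 2, consistent with isotropy; under geometric anisotropy the two estimates would differ systematically.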
Following the simulation study in [15], we employ a mean-zero Gaussian random field on Z^2 with a spherical covariance function C(h) = η + σ^2 if r = 0; C(h) = σ^2 (1 − 3r/(2φ) + r^3/(2φ^3)) if 0 < r ≤ φ; and C(h) = 0 otherwise, (6.5) where σ^2 is the partial sill parameter, φ is the range parameter, η is the nugget effect, and r = √(h′Bh) is related to a geometric anisotropy transformation. Specifically, given an anisotropy angle ψ_A and an anisotropy ratio ψ_R, define the rotation matrix R and the shrinking matrix T accordingly; then B = R′T′TR is a 2 × 2 positive definite matrix representing a geometric anisotropy transformation. A random field with the spherical covariance function (6.5) is in general anisotropic; it is isotropic when ψ_R = 1. In addition, if ψ_A = 0, then the main anisotropic axes are aligned with the (x, y) axes. See, for example, Section 5.1 of [47] for details. For Gaussian processes, it can be shown that the covariance function (6.5) satisfies the absolute integrability condition, which is sufficient for Theorem 1 of [15] to hold. We consider the model parameters (η, σ^2, φ, ψ_A, ψ_R) = (2, 3, 4, 0, ψ_R) for different anisotropy ratios ψ_R. Also, set Λ = {(1, 0), (0, 1)}, G = (2γ(1, 0), 2γ(0, 1))′, and Ĝ_T = (2γ̂(1, 0), 2γ̂(0, 1))′. Thus, the test statistic (6.3) becomes (6.6), where Σ̂_R may be estimated by subsampling or by the proposed bootstrap methods. However, since (AΣ̂_R A′)^{-1} is only a normalizing factor in (6.6), we may focus on subsampling and bootstrapping the unnormalized test statistic. Next, we briefly outline the subsampling and bootstrap methods. For the spatial subsampling, the region T is divided into k_T small overlapping subblocks, known as subsampling windows, which are congruent to T in both configuration and orientation. Denote T^i_sub as the i-th subblock. For each of the k_T subblocks, compute the statistic TS^i_{T,sub}, where γ̂^i_sub(t) is defined similarly to (6.4), but with T(t) replaced by T^i_sub(t) = {s : s, s + t ∈ T^i_sub}. Using the TS^i_{T,sub}'s, the p-value for the test can be calculated by

and the null hypothesis is rejected if the p-value is smaller than the significance level α.
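The geometric anisotropy construction and the spherical covariance (6.5) used in this simulation can be sketched as follows. The exact rotation and shrinking matrices are not reproduced in the text above, so the conventions below (R a standard rotation by ψ_A and T = diag(1, 1/ψ_R)) are assumptions; in the isotropic case ψ_R = 1 with ψ_A = 0, B reduces to the identity.

```python
import numpy as np

def anisotropy_matrix(psi_A, psi_R):
    """B = R'T'TR for anisotropy angle psi_A and anisotropy ratio psi_R.
    The rotation/shrinking conventions used here are assumptions."""
    c, s = np.cos(psi_A), np.sin(psi_A)
    R = np.array([[c, s], [-s, c]])    # rotation by psi_A (assumed convention)
    Tm = np.diag([1.0, 1.0 / psi_R])   # shrinking matrix (assumed convention)
    return R.T @ Tm.T @ Tm @ R

def spherical_cov(h, eta, sigma2, phi, B):
    """Spherical covariance (6.5): nugget eta at the origin, partial sill
    sigma2, range phi, with anisotropic distance r = sqrt(h' B h)."""
    h = np.asarray(h, dtype=float)
    r = float(np.sqrt(h @ B @ h))
    if r == 0.0:
        return eta + sigma2
    if r <= phi:
        return sigma2 * (1.0 - 1.5 * r / phi + 0.5 * (r / phi) ** 3)
    return 0.0

B_iso = anisotropy_matrix(psi_A=0.0, psi_R=1.0)   # isotropic case: B = I
```

Note that the covariance vanishes exactly at r = φ (since 1 − 3/2 + 1/2 = 0) and beyond, which is the compact-support property of the spherical model.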
For the proposed bootstrap methods, B bootstrap samples are generated from either RB, WB, or LB. For the i-th bootstrap sample, we compute the variograms 2γ̂^i_boot(1, 0) and 2γ̂^i_boot(0, 1) using (6.4). Next, define the variogram difference VD_i = 2γ̂^i_boot(1, 0) − 2γ̂^i_boot(0, 1) and the bootstrapped test statistic TS^i_{T,boot}. Note that centering of the VD_i's is required, since VD_i has a non-zero mean under the alternative. Similar to the test for signal presence, this centering procedure allows the bootstrapped test statistic to converge to the null distribution even under the alternative hypothesis; see Theorem 6.3 below. Finally, the p-value of the test can be calculated by p = B^{-1} Σ_{i=1}^B 1{TS^i_{T,boot} ≥ TS_T}, and the null hypothesis is rejected if the p-value is smaller than the significance level α.
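The centering step above can be sketched as follows. The unnormalized squared form of the bootstrapped statistic is an assumption (the normalizing factor is dropped, as discussed after (6.6)), and `vd` stands in for bootstrapped variogram differences rather than output of the actual resampling schemes.

```python
import numpy as np

def bootstrap_pvalue(ts_obs, vd_boot):
    """p-value from bootstrapped variogram differences VD_i, centered at
    their bootstrap mean so that the resampled statistic targets the null
    distribution even under the alternative."""
    vd_boot = np.asarray(vd_boot, dtype=float)
    ts_boot = (vd_boot - vd_boot.mean()) ** 2   # centered, squared difference
    return float(np.mean(ts_boot >= ts_obs))

rng = np.random.default_rng(5)
vd = rng.normal(loc=0.8, scale=0.1, size=500)  # hypothetical VD_i replicates
p_small = bootstrap_pvalue(0.01 ** 2, vd)      # small observed statistic
p_large = bootstrap_pvalue(0.5 ** 2, vd)       # large observed statistic
```

Because the replicates are centered before squaring, a nonzero mean of VD_i under the alternative does not inflate the bootstrap reference distribution, so larger observed statistics still yield smaller p-values.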
The following theorem states that the bootstrapped test statistic converges to the same limit as that of T S T , and hence the proposed bootstrap method is valid.

Theorem 6.3. For a stationary Gaussian random field {V (t) : t ∈ T }, under the assumptions of Theorem 1 in [15], we have
where P 0 denotes the probability measure under H 0 , and P * denotes the conditional probability given {V (t) : t ∈ T } using any bootstrap method satisfying Assumptions B.1 to B.4. Table 6 summarizes the rejection rates of the bootstrap test by spatial subsampling, RB, WB and LB under different values of anisotropy ratio ψ R . It can be seen that the performance of the proposed methods is superior to that of spatial subsampling in both size and power.

Conclusion
This paper develops a frequency domain bootstrap method for random fields on Z^2. Three bootstrap schemes for resampling the Fourier coefficients are proposed. Inverse transformations are then applied to obtain resamples in the spatial domain. The resulting bootstrap resamples capture the correct second-order moment structure for a large class of random fields. Moreover, invariance principles of the partial sum process indexed by a classical example of Vapnik-Chervonenkis classes of Borel subsets of [0, 1]^2 are established; the results can be easily generalized to other Vapnik-Chervonenkis classes. The frequency domain bootstrap method is simple to apply and is demonstrated to be effective in various applications, including constructing confidence intervals for correlograms of linear random fields, testing for signal presence using scan statistics, and testing for spatial isotropy of Gaussian random fields. Simulation studies are conducted to illustrate the finite sample performance of the proposed method and to compare it with the existing spatial block bootstrap and subsampling methods. For small or moderate sample sizes, the effective number of blocks for the block bootstrap and subsampling methods is small when the block size is chosen to be large, which induces severe bias, so the finite sample performance is sensitive to the selection of the block size. This problem cannot be resolved by choosing a smaller block size, however, because the dependence structure of the underlying spatial field is not preserved when the block size is too small. On the other hand, although the bandwidth selection may also affect the performance of the proposed frequency domain bootstrap for small or moderate sample sizes, its effect is relatively small compared with that of the block size, as shown in the simulation studies in Section 6.
However, as shown in Section 4, the proposed frequency domain bootstrap can only mimic the second-order moment structure of the underlying spatial field, and hence may not be appropriate for general statistics that involve higher-order moment structures, or may require some form of transformation of the data. In contrast, the block bootstrap methods can generally be applied directly, without any prior transformation.

Using Lemma A.1, we can handle the above sum by decomposing the innermost sum into five terms. For the first term, by the absolute summability of γ(·), we have Also, by the absolute summability of γ(·), we have For the sum of the last three terms of Lemma A.1, we have Similar arguments can be applied to the remaining two terms. Finally, the sum of the second term is Putting everything together, we obtain (b). The proof of (c) is analogous. A simple calculation shows that Cov(Z(l_1), Z(l_2)) = Cov(V(l_1), V(l_2)) + o(1) by the absolute summability of the auto-covariance function.
Proof of Theorem 4.3. To prove the invariance principle of the Q^2-indexed partial sum process of the bootstrap sample, we have to show the finite-dimensional convergence as well as the tightness of the partial sum process. The following lemma shows the convergence of the finite-dimensional distributions of the Q^2-indexed partial sum process of the bootstrap sample. Its proof is deferred to Appendix B.

Lemma A.2. (b) If Assumptions A.2, B.1 and B.4 are fulfilled, for B_1, B_2,

The following lemma gives the critical step towards the tightness of the Q^2-indexed partial sum process of the bootstrap sample. Its proof is deferred to Appendix B.

Lemma A.3. Under Assumptions A.2, B.1 to B.3, for any ε > 0 and A_T ∈ Q^2, we have
Theorem 13.5 of [1] gives a characterization of weak convergence via convergence of the finite-dimensional distributions together with tightness. Lemmas A.2 and A.3 show that these conditions are fulfilled, which completes the proof of Theorem 4.3.
Proof of Corollary 4.4. The proof is analogous to the proof of Lemma A.2(a) and is thus omitted.
Proof of Corollary 4.5. The corollary is an immediate consequence of Theorem 4.3, thus the proof is omitted.

A.2. Proofs of Section 5
Proof of Theorem 5.1. In the following, we only prove the assertions for x*(·); the assertions for y*(·) follow because x*(j) and y*(j) are equal in distribution, conditionally given V(·). Since Assumption B.1 follows directly from the definition of the bootstrap schemes, it remains to show that Assumptions B.2 to B.4 are valid under the assumptions stated in the theorem.
Also, by Theorem 4.3 in [33], we have the following four conditions on the sums of the periodograms and Fourier coefficients, for some constants C_1, C_2 ≥ 0 and q = 4 + ε with some ε ∈ (0, 1). Next, since we have, for k = 1, 2, Also, we have, for k = 1, 2, Hence, Assumption B.3 holds with p = 2q > 8 for all j ∈ N and k = 1, 2. Since the s*_{j,k} are drawn from the standardized residuals, we have E*(s*_{1,k})^2 = 1. From this, it follows that The last line follows from the uniform convergence of the empirical distribution function of the Fourier coefficients. Note that convergence in the Mallows distance is equivalent to convergence in distribution together with convergence of the first two moments. In this case, the convergences in all three cases hold uniformly in j.
Proof of Corollary 5.2. It follows directly from Theorem 6.1 in [33] and the above proof of Theorem 5.1.

A.3. Proofs of Section 6
Proof of Theorem 6.1. By the conditions stated in the theorem, it is easy to see that {V(t)} satisfies the conditions of Corollary 5.2. Since Z_T ⊂ Q^2, the result follows from Theorem 4.3(a).
Proof of Theorem 6.2. It is easy to see that the scan statistic diverges to infinity under the conditions stated in the theorem.
For the first term J_1, we have For the second term J_2, since the underlying field is Gaussian, the Fourier coefficients x(i) and y(i) are Gaussian. By the Gaussianity of x(i) and y(i), the asymptotic independence, and the strong law of large numbers, we have K*