On optimal block resampling for Gaussian-subordinated long-range dependent processes
Block resampling methods are useful for nonparametrically approximating the sampling distributions of statistics from dependent data. Much research has focused on weakly dependent time series and on understanding the large-sample properties of block subsampling (and bootstrap) methods, which has helped to inform implementation through the choice of best block sizes, particularly for inference about sample means (as a prototypical statistic). However, relatively little is known about resampling performance and best block sizes under strong or long-range time dependence. We consider a broad class of strongly dependent and possibly non-linear time series, formed by a transformation of a stationary long-memory Gaussian series. We determine the estimation error and best block sizes for subsampling (or block bootstrap) variance estimation of the sample mean from such processes. Explicit expressions are given for the bias and variance of block subsampling/bootstrap estimators with overlapping or non-overlapping blocks, which depend intricately on the amount of non-linearity in the time series as well as on a strong dependence coefficient. In contrast, for weakly dependent time series, the bias/variance properties of subsampling/bootstrap estimators are completely invariant to the degree of non-linearity in the time series (i.e., a non-issue), and overlapping blocks always induce better performance than non-overlapping blocks, regardless of the exact block length choice. However, neither of these aspects remains true for transformation-based long-memory time series and, perhaps surprisingly, any amount of non-linearity in the time series destroys the advantages of overlapping blocks.
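To fix ideas, the block subsampling variance estimator for a sample mean can be sketched as follows. This is a minimal illustration, not the paper's exact estimator: the function name and interface are assumptions, and the normalization shown (block length times the variance of block means) corresponds to the usual short-memory, root-n scaling; under long-range dependence the appropriate scaling rate differs, which is part of what the paper analyzes.

```python
import numpy as np

def block_subsampling_variance(x, block_len, overlapping=True):
    """Illustrative subsampling estimator of the variance of the
    (short-memory-normalized) sample mean, from block means.

    NOTE: names and the b * Var(block mean) normalization are an
    illustrative assumption; under long memory a different scaling
    exponent applies.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if not 0 < block_len <= n:
        raise ValueError("block_len must be in 1..len(x)")
    # Overlapping blocks start at every index; non-overlapping blocks
    # start every block_len observations.
    step = 1 if overlapping else block_len
    starts = np.arange(0, n - block_len + 1, step)
    block_means = np.array([x[s:s + block_len].mean() for s in starts])
    # Scaled spread of block means about the full-sample mean.
    return block_len * np.mean((block_means - x.mean()) ** 2)
```

For i.i.d. standard normal data this quantity should be close to 1 for moderate block lengths; the contrast studied in the paper is how its bias and variance behave, for overlapping versus non-overlapping blocks, when the data are instead a non-linear transformation of a long-memory Gaussian series.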