A new class of large-sample covariance and spectral density matrix estimators is proposed based on the notion of flat-top kernels. The new estimators are shown to be higher-order accurate when higher-order accuracy is possible. A discussion on kernel choice is presented as well as a supporting finite-sample simulation. The problem of spectral estimation …
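As a rough illustration of the flat-top idea, the sketch below computes a lag-window spectral density estimate whose lag weights are a trapezoidal flat-top kernel (identically 1 near the origin, then linear down to zero). The trapezoid, the bandwidth, and all function names are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def flat_top_weight(t):
    """Trapezoidal flat-top lag window: 1 for |t| <= 1/2,
    2(1 - |t|) for 1/2 < |t| <= 1, and 0 beyond."""
    t = np.abs(t)
    return np.where(t <= 0.5, 1.0, np.where(t <= 1.0, 2.0 * (1.0 - t), 0.0))

def sample_acf(x):
    """Biased sample autocovariances gamma_hat(0), ..., gamma_hat(n-1)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    return np.array([np.dot(xc[:n - k], xc[k:]) / n for k in range(n)])

def flat_top_spectral_density(x, l, freqs):
    """Lag-window estimate f_hat(w) = (1/2pi) sum_s w(s/l) gamma_hat(s) cos(s w),
    using the trapezoidal flat-top weights with bandwidth l."""
    gamma = sample_acf(x)
    n = len(gamma)
    lags = np.arange(n)
    w = flat_top_weight(lags / l)
    freqs = np.atleast_1d(freqs)
    f = np.empty(len(freqs))
    for i, om in enumerate(freqs):
        f[i] = (gamma[0] + 2.0 * np.sum(w[1:] * gamma[1:] * np.cos(lags[1:] * om))) / (2.0 * np.pi)
    return f

rng = np.random.default_rng(0)
x = rng.standard_normal(512)  # white noise: the true spectrum is flat at 1/(2*pi)
f_hat = flat_top_spectral_density(x, l=10, freqs=[0.0, np.pi / 2])
```

The flat part of the window leaves low-order autocovariances unweighted, which is what drives the higher-order accuracy claim.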
Consider the problem of inference for a parameter of a stationary time series, where the parameter takes values in a metric space (such as a function space). In this paper, we develop asymptotic theory based on subsampling to approximate the distribution of estimators for such parameters. The reason for this level of abstraction is to be able to consider …
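The subsampling recipe can be sketched in the simplest scalar case: recompute the statistic on all overlapping blocks of length b, and use the centered, rescaled block values to approximate the sampling distribution at rate sqrt(n). The block size, toy data, and the choice of the mean as statistic are illustrative assumptions.

```python
import numpy as np

def subsample_distribution(x, b, statistic):
    """Subsampling approximation: values sqrt(b) * (theta_hat_b - theta_hat_n)
    over all n - b + 1 overlapping blocks of length b."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    theta_n = statistic(x)
    vals = np.array([statistic(x[i:i + b]) for i in range(n - b + 1)])
    return np.sqrt(b) * (vals - theta_n)

rng = np.random.default_rng(1)
x = rng.standard_normal(400)
dist = subsample_distribution(x, b=40, statistic=np.mean)

# Equal-tailed 95% confidence interval for the mean from subsampling quantiles
lo, hi = np.quantile(dist, [0.025, 0.975])
n = len(x)
ci = (x.mean() - hi / np.sqrt(n), x.mean() - lo / np.sqrt(n))
```

The same template applies with any statistic in place of `np.mean`, which is the point of the abstract's metric-space formulation.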
A nonparametric bootstrap procedure is proposed for stochastic processes which follow a general autoregressive structure. The procedure generates bootstrap replicates by locally resampling the original observations, automatically reproducing their dependence properties. It avoids an initial nonparametric estimation of process characteristics in order to …
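One way to resample "locally" in an autoregressive (Markov-type) setting is to draw the next bootstrap value from the observed successors of states close to the current state, with kernel weights, so that no transition density is ever estimated explicitly. The Gaussian kernel, the bandwidth, and the AR(1) toy series below are illustrative choices, not the paper's tuning.

```python
import numpy as np

def local_bootstrap_path(x, n_boot, h, rng):
    """Generate a bootstrap path: given current state s, pick an index t with
    probability proportional to K((x[t] - s) / h), then move to x[t + 1].
    Every bootstrap value is an actual observed successor."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    states, successors = x[:-1], x[1:]
    path = np.empty(n_boot)
    s = x[rng.integers(n - 1)]  # random starting state from the sample
    for i in range(n_boot):
        w = np.exp(-0.5 * ((states - s) / h) ** 2)  # Gaussian kernel weights
        w /= w.sum()
        j = rng.choice(n - 1, p=w)
        s = successors[j]
        path[i] = s
    return path

rng = np.random.default_rng(2)
# AR(1) sample standing in for the original observations
x = np.empty(300)
x[0] = 0.0
for t in range(1, 300):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
xb = local_bootstrap_path(x, n_boot=300, h=0.5, rng=rng)
```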
A new methodology for optimal linear prediction of a stationary time series is introduced. In practice, the coefficient vector of the optimal linear predictor is routinely truncated to its first p components in order to be consistently estimated. By contrast, we employ a consistent estimator of the n × n autocovariance matrix Γn in order to construct a consistent estimator of the optimal, full-length coefficient vector …
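A minimal sketch of the full-length approach: taper the sample autocovariances, assemble the n × n Toeplitz matrix, and solve for all n prediction coefficients at once instead of truncating to order p. The trapezoidal taper, the eigenvalue floor used to force positive definiteness, and all tuning constants are simple stand-ins, not the paper's exact correction.

```python
import numpy as np

def taper(t):
    """Trapezoidal flat-top taper: 1 for |t| <= 1, linear down to 0 at 2."""
    t = np.abs(t)
    return np.where(t <= 1.0, 1.0, np.where(t <= 2.0, 2.0 - t, 0.0))

def full_length_predictor(x, l):
    """Solve Gamma_hat . phi = gamma_hat(1..n) for the full-length coefficient
    vector phi and predict X_{n+1}."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    gamma = np.array([np.dot(xc[:n - k], xc[k:]) / n for k in range(n)])
    g = gamma * taper(np.arange(n) / l)              # tapered autocovariances
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    G = g[idx]                                       # tapered n x n Toeplitz matrix
    vals, vecs = np.linalg.eigh(G)
    vals = np.maximum(vals, 1e-3 * vals.max())       # floor small/negative eigenvalues
    G = (vecs * vals) @ vecs.T
    rhs = np.append(g[1:], 0.0)                      # lags 1..n; lag n is tapered to 0
    phi = np.linalg.solve(G, rhs)
    xhat = x.mean() + phi @ (x[::-1] - x.mean())     # phi[0] multiplies the newest obs
    return phi, xhat

rng = np.random.default_rng(6)
x = np.empty(200)
x[0] = 0.0
for t in range(1, 200):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
phi, xhat = full_length_predictor(x, l=5)
```

For an AR(1) series the leading coefficient should sit near the autoregressive parameter, with the remaining full-length coefficients close to zero.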
In order to construct prediction intervals without the cumbersome—and typically unjustifiable—assumption of Gaussianity, some form of resampling is necessary. The regression setup has been well-studied in the literature but time series prediction faces additional difficulties. The paper at hand focuses on time series that can be modeled as linear, nonlinear …
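A minimal resampling scheme in the linear case: fit an AR(1) by least squares and build the predictive distribution of the next value by resampling centered residuals instead of assuming Gaussian errors. This simple sketch ignores the estimation error in the fitted coefficient, one of the additional difficulties a full time-series treatment must handle; all names and tuning values are illustrative.

```python
import numpy as np

def ar1_bootstrap_prediction_interval(x, B=2000, alpha=0.05, rng=None):
    """Bootstrap prediction interval for X_{n+1} under a fitted AR(1):
    the innovation distribution is the empirical distribution of the
    centered least-squares residuals, not a Gaussian."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x, dtype=float)
    y, z = x[1:], x[:-1]
    phi = np.dot(z, y) / np.dot(z, z)   # least-squares AR(1) coefficient
    resid = y - phi * z
    resid -= resid.mean()
    # predictive draws: point forecast plus resampled residuals
    future = phi * x[-1] + rng.choice(resid, size=B, replace=True)
    return np.quantile(future, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(3)
x = np.empty(400)
x[0] = 0.0
for t in range(1, 400):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
lo, hi = ar1_bootstrap_prediction_interval(x, rng=rng)
```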
We address the problem of estimating the autocovariance matrix of a stationary process. Under short range dependence assumptions, convergence rates are established for a gradually tapered version of the sample autocovariance matrix and for its inverse. The proposed estimator is formed by leaving the main diagonals of the sample autocovariance matrix intact …
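The "leave the main diagonals intact, taper the rest" idea can be sketched directly: diagonals at lags up to l are untouched, diagonals between l and 2l are gradually downweighted, and everything beyond 2l is zeroed, giving a banded, symmetric estimate. The specific trapezoidal taper and bandwidth are illustrative assumptions.

```python
import numpy as np

def tapered_autocov_matrix(x, l):
    """Sample autocovariance matrix with the diagonal at lag k scaled by a
    trapezoidal taper: weight 1 for k <= l, linear down to 0 at k = 2l."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    gamma = np.array([np.dot(xc[:n - k], xc[k:]) / n for k in range(n)])
    t = np.arange(n) / l
    w = np.where(t <= 1.0, 1.0, np.where(t <= 2.0, 2.0 - t, 0.0))
    g = gamma * w
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return g[idx]   # symmetric Toeplitz matrix, banded at bandwidth 2l

rng = np.random.default_rng(4)
x = rng.standard_normal(200)
G = tapered_autocov_matrix(x, l=5)
```

Banding is what buys the convergence rates: only O(l) noisy diagonals enter the estimate, while the raw n × n sample autocovariance matrix keeps all of them.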
The problem of nonparametric regression is addressed, and a kernel smoothing estimator is proposed which has favorable asymptotic performance (bias, variance, and mean squared error). The proposed class of kernels is characterized by a Fourier transform which is flat near the origin and infinitely differentiable. This property allows the bias of the …
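To make the Fourier-side description concrete, the sketch below uses the kernel whose Fourier transform is the trapezoid (flat at 1 near the origin, linear to zero), which has the closed form K(u) = 2(cos(u/2) − cos u)/(π u²) with K(0) = 3/(4π), inside a Nadaraya–Watson smoother. Note this trapezoidal transform is flat near the origin but not infinitely differentiable, so it is a simplified member of the family the abstract describes; the bandwidth and test function are arbitrary.

```python
import numpy as np

def flat_top_kernel(u):
    """Kernel whose Fourier transform is the trapezoid: 1 for |s| <= 1/2,
    2(1 - |s|) for 1/2 < |s| <= 1, 0 beyond. Closed form:
    K(u) = 2(cos(u/2) - cos u) / (pi u^2), with K(0) = 3 / (4 pi)."""
    u = np.asarray(u, dtype=float)
    out = np.full(u.shape, 3.0 / (4.0 * np.pi))   # limit value near u = 0
    nz = np.abs(u) > 1e-4                          # avoid cancellation at tiny u
    un = u[nz]
    out[nz] = 2.0 * (np.cos(un / 2.0) - np.cos(un)) / (np.pi * un ** 2)
    return out

def nw_flat_top(x, y, grid, h):
    """Nadaraya-Watson smoother with the flat-top kernel. The kernel takes
    negative values, so the weights are not a probability distribution."""
    K = flat_top_kernel((grid[:, None] - x[None, :]) / h)
    return (K @ y) / K.sum(axis=1)

rng = np.random.default_rng(5)
x = np.sort(rng.uniform(0.0, 1.0, 400))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(400)
grid = np.linspace(0.1, 0.9, 9)
m_hat = nw_flat_top(x, y, grid, h=0.08)
```

The flat Fourier transform near the origin is what kills the low-order bias terms of the smoother; the price is a kernel that dips below zero.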