Corpus ID: 31909811

SELECTION OF A MULTISTEP LINEAR PREDICTOR FOR SHORT TIME SERIES

@inproceedings{Hurvich1997SELECTIONOA,
  title={SELECTION OF A MULTISTEP LINEAR PREDICTOR FOR SHORT TIME SERIES},
  author={Clifford M. Hurvich and Chih-Ling Tsai},
  year={1997}
}
We develop a version of the Corrected Akaike Information Criterion (AICC) suitable for selection of an h-step-ahead linear predictor for a weakly stationary time series in discrete time. A motivation for this criterion is provided in terms of a generalized Kullback-Leibler information which is minimized at the optimal h-step predictor, and which is equivalent to the ordinary Kullback-Leibler information when h = 1. In a simulation study, we find that if the sample size is small and the… 
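As a rough, hedged illustration of this setting, the sketch below fits a direct h-step-ahead linear predictor of each candidate order p by least squares and scores it with a standard AICC-style penalty. The function name direct_predictor_aicc and the exact penalty term are assumptions made for illustration; the paper motivates its criterion through the generalized Kullback-Leibler information described above, and its precise multistep form may differ.

```python
import numpy as np

def direct_predictor_aicc(x, h, p_max):
    """Hypothetical sketch: choose the order of a direct h-step-ahead
    linear predictor by least squares plus an AICC-style penalty."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # work with the mean-corrected series
    results = {}
    for p in range(1, p_max + 1):
        # Row t of the design matrix holds (x[t], x[t-1], ..., x[t-p+1]).
        X = np.array([x[t - p + 1:t + 1][::-1] for t in range(p - 1, len(x) - h)])
        y = x[p - 1 + h:]                 # h-step-ahead targets
        n = len(y)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ beta) ** 2)
        # Standard small-sample corrected criterion for a regression with p
        # coefficients (assumption: the paper's h-step version may differ).
        aicc = n * np.log(sigma2) + n * (n + p) / (n - p - 2)
        results[p] = (aicc, beta)
    p_hat = min(results, key=lambda q: results[q][0])
    return p_hat, results[p_hat][1]
```

When h = 1 this reduces to ordinary AR order selection by an AICC-type criterion, consistent with the equivalence to the ordinary Kullback-Leibler information noted in the abstract.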

Citations

Selecting optimal multistep predictors for autoregressive processes of unknown order
TLDR
It is shown that when both plug-in and direct predictors are considered, the optimal multistep prediction results cannot be guaranteed by correctly identifying the underlying model's order, which challenges the traditional model (order) selection criteria.
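To make the plug-in versus direct distinction concrete, a minimal sketch of the two forecast constructions follows. The helper names plugin_forecast and direct_forecast are hypothetical, the series is assumed to be approximately zero-mean, and neither function reproduces the cited paper's selection procedure.

```python
import numpy as np

def plugin_forecast(x, p, h):
    """Plug-in multistep forecast: fit a one-step AR(p) by least squares,
    then iterate the fitted recursion h times."""
    x = np.asarray(x, dtype=float)
    X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
    phi, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    history = list(x[-p:][::-1])          # most recent observation first
    forecast = None
    for _ in range(h):
        forecast = float(np.dot(phi, history[:p]))
        history.insert(0, forecast)       # feed each forecast back in
    return forecast

def direct_forecast(x, p, h):
    """Direct multistep forecast: regress x[t+h] on (x[t], ..., x[t-p+1])
    and apply the fitted coefficients to the last p observations."""
    x = np.asarray(x, dtype=float)
    X = np.array([x[t - p + 1:t + 1][::-1] for t in range(p - 1, len(x) - h)])
    beta, *_ = np.linalg.lstsq(X, x[p - 1 + h:], rcond=None)
    return float(np.dot(beta, x[-p:][::-1]))
```

The point made in the summary above is that the order giving the best one-step (plug-in) model need not give the best direct h-step predictor, so aiming order selection at the true order does not settle the multistep question.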
Discussion of “High-dimensional autocovariance matrices and optimal linear prediction”
TLDR
There are three potential uses of McMurry and Politis' methods for data analysis that I would like to explore, and the proposed FSO or PSO predictors could be used for forecasting real time series.
Multifold Predictive Validation in ARMAX Time Series Models
This article presents a new procedure for multifold predictive validation in time series. The procedure is based on the so-called “filtered residuals,” in-sample prediction errors evaluated in such a
The Multistep Beveridge–Nelson Decomposition
The Beveridge–Nelson decomposition defines the trend component in terms of the eventual forecast function, as the value the series would take if it were on its long-run path. The article introduces
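For context, the Beveridge–Nelson trend described here is conventionally defined as the value the eventual forecast function converges to once the deterministic drift is removed; with drift \mu and information set I_t, a standard statement is

```latex
\tau_t \;=\; \lim_{h\to\infty}\bigl[\mathbb{E}(y_{t+h}\mid I_t) - h\mu\bigr]
      \;=\; y_t + \lim_{h\to\infty}\sum_{j=1}^{h}\bigl[\mathbb{E}(\Delta y_{t+j}\mid I_t) - \mu\bigr],
```

which is the sense in which the trend is "the value the series would take if it were on its long-run path".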
Better ACF and PACF plots, but no optimal linear prediction
TLDR
There are three potential uses of McMurry and Politis' methods for data analysis that I would like to explore, and the proposed FSO or PSO predictors could be used for forecasting real time series.
Blending as a Multi-Horizon Time Series Forecasting Tool
Tian Gao, Marquette University, 2014. Every day, millions of cubic feet of natural gas are transported through interstate pipelines and
...

References

SHOWING 1-10 OF 22 REFERENCES
Regression and time series model selection in small samples
SUMMARY A bias correction to the Akaike information criterion, AIC, is derived for regression and autoregressive time series models. The correction is of particular use when the sample size is small,
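For reference, the bias-corrected criterion derived in this paper is usually quoted in the general form

```latex
\mathrm{AIC}_C \;=\; \mathrm{AIC} + \frac{2k(k+1)}{n-k-1}
             \;=\; -2\log\hat{L} + \frac{2kn}{n-k-1},
```

where n is the sample size and k counts the estimated parameters (conventions differ on whether the error variance is included in k). The correction term vanishes as n grows, so AICC behaves like AIC in large samples while penalizing overfitting much more heavily when n is small relative to k.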
Bias of the corrected AIC criterion for underfitted regression and time series models
TLDR
A simulation study in which the true model is an infinite-order autoregression shows that, even in moderate sample sizes, AICC provides substantially better model selections than AIC.
Automatic selection of a linear predictor through frequency domain cross-validation
Given data from a weakly stationary stochastic process in discrete time, and any L-step ahead linear predictor estimated from that data, we will construct an approximately unbiased estimator of the
Robustness of maximum likelihood estimates for multi-step predictions: The exponential smoothing case
SUMMARY We extend the argument initiated by Cox (1961) that the exponential smoothing formula can be made more robust for multi-step forecasts if the smoothing parameter is adjusted as a function of
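As one concrete reading of the idea of adjusting the smoothing parameter with the lead time, the sketch below selects a separate simple-exponential-smoothing parameter for each horizon h by minimizing the in-sample sum of squared h-step-ahead forecast errors. The grid search and the helper names are illustrative assumptions, not the estimator analysed in the cited paper.

```python
import numpy as np

def ses_levels(x, alpha):
    """Simple exponential smoothing; the h-step forecast made at time t
    is the level at time t."""
    level = x[0]
    levels = [level]
    for value in x[1:]:
        level = alpha * value + (1 - alpha) * level
        levels.append(level)
    return np.array(levels)

def horizon_tuned_alpha(x, h, grid=np.linspace(0.05, 0.95, 19)):
    """Pick a smoothing parameter separately for each horizon h by
    minimizing in-sample h-step-ahead squared forecast errors."""
    x = np.asarray(x, dtype=float)
    best_alpha, best_sse = None, np.inf
    for alpha in grid:
        levels = ses_levels(x, alpha)
        errors = x[h:] - levels[:-h]      # forecast of x[t+h] is the level at t
        sse = float(np.sum(errors ** 2))
        if sse < best_sse:
            best_alpha, best_sse = alpha, sse
    return best_alpha
```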
Order Determination for Multivariate Autoregressive Processes Using Resampling Methods
Let X1, …, Xn be observations from a multivariate AR(p) model with unknown order p. A resampling procedure is proposed for estimating the order p. The classical criteria, such as AIC and BIC, estimate
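For comparison with the resampling idea, here is a minimal sketch of the classical criteria mentioned in the excerpt: each candidate AR(p) is fit by least squares and the orders minimizing AIC and BIC are returned. The helper name select_ar_order is hypothetical, and the code does not implement the paper's resampling procedure.

```python
import numpy as np

def select_ar_order(x, p_max):
    """Classical AR order selection: fit AR(p) by least squares for each
    candidate p and return the orders minimizing AIC and BIC."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    aic, bic = {}, {}
    for p in range(1, p_max + 1):
        X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
        y = x[p:]
        n = len(y)
        phi, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ phi) ** 2)
        aic[p] = n * np.log(sigma2) + 2 * p
        bic[p] = n * np.log(sigma2) + p * np.log(n)
    return min(aic, key=aic.get), min(bic, key=bic.get)
```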
A Study of Autoregressive and Window Spectral Estimation
SUMMARY a process with a "mixed" spectrum. The paper also includes some discussion of two different methods of estimating the coefficients of AR models (the Burg method and the Yule-Walker approach),
The Bias of Autoregressive Coefficient Estimators
Abstract This article presents simple expressions for the bias of estimators of the coefficients of an autoregressive model of arbitrary, but known, finite order. The results include models both with
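The downward bias discussed here is easy to see by simulation. The sketch below repeatedly generates a short AR(1) series and averages the least-squares coefficient estimate; the cited article gives analytic bias expressions rather than a Monte Carlo check, so this is purely illustrative.

```python
import numpy as np

def ar1_bias_demo(phi=0.8, n=30, n_reps=5000, seed=0):
    """Average shortfall of the least-squares AR(1) estimate below the true
    coefficient in short series (a positive return value means downward bias)."""
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(n_reps):
        e = rng.standard_normal(n)
        x = np.zeros(n)
        x[0] = e[0] / np.sqrt(1.0 - phi ** 2)   # start from the stationary distribution
        for t in range(1, n):
            x[t] = phi * x[t - 1] + e[t]
        # Conditional least-squares estimate of the AR(1) coefficient.
        estimates.append(np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1]))
    return phi - float(np.mean(estimates))
```

For positive coefficients and samples this short, the returned value is noticeably positive, which is the small-sample bias that the analytic expressions quantify.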
Bias of some commonly-used time series estimates
SUMMARY We study the bias of Yule-Walker and least squares estimates for univariate and multivariate autoregressive processes. We obtain explicit formulae for the large-sample bias of Yule-Walker
Some advances in non‐linear and adaptive modelling in time‐series
TLDR
This paper shows that linear models can provide accurate forecasts provided that the parameters involved are estimated adaptively, and it focuses on forecasting long-memory time series.
...