Time Series Forecasting with Gaussian Processes Needs Priors

@inproceedings{Corani2020TimeSF,
  title={Time Series Forecasting with Gaussian Processes Needs Priors},
  author={Giorgio Corani and Alessio Benavoli and Marco Zaffalon},
  booktitle={ECML/PKDD},
  year={2020}
}
Automatic forecasting is the task of receiving a time series and returning a forecast for the next time steps without any human intervention. Gaussian Processes (GPs) are a powerful tool for modeling time series, but so far there are no competitive approaches for automatic forecasting based on GPs. We propose practical solutions to two problems: automatic selection of the optimal kernel and reliable estimation of the hyperparameters. We propose a fixed composition of kernels, which contains…
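The two ingredients named in the abstract, a fixed kernel composition and priors for reliable hyperparameter estimation, can be sketched in plain numpy: compose a kernel, then minimize the negative log marginal likelihood plus a negative log-prior (MAP estimation). The RBF-plus-periodic composition and the log-normal prior centers below are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize

def composite_kernel(t1, t2, ell_rbf, ell_per, period, s_rbf, s_per):
    """RBF + periodic kernel on 1-D time inputs (illustrative composition)."""
    d = t1[:, None] - t2[None, :]
    rbf = s_rbf**2 * np.exp(-0.5 * d**2 / ell_rbf**2)
    per = s_per**2 * np.exp(-2.0 * np.sin(np.pi * d / period)**2 / ell_per**2)
    return rbf + per

def neg_log_posterior(log_params, t, y):
    ell_rbf, ell_per, period, s_rbf, s_per, s_noise = np.exp(log_params)
    K = composite_kernel(t, t, ell_rbf, ell_per, period, s_rbf, s_per)
    K[np.diag_indices_from(K)] += s_noise**2 + 1e-8
    L, low = cho_factor(K, lower=True)
    alpha = cho_solve((L, low), y)
    # Negative log marginal likelihood of the GP
    nll = 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)
    # Log-normal priors act as a quadratic penalty in log-space
    prior_mean = np.log([1.0, 1.0, 1.0, 1.0, 1.0, 0.1])  # hypothetical centers
    return nll + 0.5 * np.sum((log_params - prior_mean)**2)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 120)
y = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)
res = minimize(neg_log_posterior, x0=np.zeros(6), args=(t, y), method="L-BFGS-B")
print("MAP hyperparameters:", np.exp(res.x))
```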

Correlated Product of Experts for Sparse Gaussian Process Regression

This paper focuses on GP regression and proposes a new approach that aggregates predictions from several local and correlated experts; it handles general kernel functions and multiple variables, and its time and space complexity is linear in the number of experts and data samples, making the approach highly scalable.
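As a point of reference for the aggregation step, here is the standard (uncorrelated) product-of-experts rule in numpy; the paper's contribution is to additionally model correlations between experts, which this baseline deliberately ignores.

```python
import numpy as np

def poe_aggregate(means, variances):
    """Standard product-of-experts aggregation of GP experts.

    Each expert i predicts N(means[i], variances[i]) at the same test points;
    the product of these Gaussians has precision equal to the summed
    precisions, and a precision-weighted mean.
    """
    precisions = 1.0 / np.asarray(variances)
    var = 1.0 / precisions.sum(axis=0)
    mean = var * (precisions * np.asarray(means)).sum(axis=0)
    return mean, var

# Three hypothetical local experts predicting at four test points
means = [[1.0, 0.9, 1.2, 0.8], [1.1, 1.0, 1.1, 0.9], [0.9, 1.1, 1.0, 1.0]]
variances = [[0.2, 0.3, 0.1, 0.4], [0.3, 0.2, 0.2, 0.3], [0.1, 0.4, 0.3, 0.2]]
mu, var = poe_aggregate(means, variances)
print(mu, var)
```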

State Space Approximation of Gaussian Processes for Time Series Forecasting

The state-space (SS) representation is applied to time series forecasting, showing that SS models achieve performance comparable to that of a full GP and better than state-of-the-art models (ARIMA, ETS).
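A minimal sketch of the idea, assuming a Matérn-3/2 kernel whose exact state-space form is known: the GP prior becomes a two-dimensional linear SDE, so filtering and forecasting run in O(n) with a Kalman filter instead of the O(n³) cost of the full GP.

```python
import numpy as np
from scipy.linalg import expm

def matern32_ss(ell, sigma):
    """Exact state-space form of the Matern-3/2 kernel."""
    lam = np.sqrt(3.0) / ell
    F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])   # SDE drift matrix
    Pinf = np.diag([sigma**2, lam**2 * sigma**2])       # stationary covariance
    H = np.array([[1.0, 0.0]])                          # observe first state
    return F, Pinf, H

def kalman_forecast(t, y, ell=1.0, sigma=1.0, noise=0.1, horizon=10, dt=0.1):
    F, Pinf, H = matern32_ss(ell, sigma)
    m, P = np.zeros(2), Pinf.copy()
    prev = t[0]
    for tk, yk in zip(t, y):
        A = expm(F * (tk - prev)); prev = tk
        m, P = A @ m, A @ P @ A.T + (Pinf - A @ Pinf @ A.T)  # predict
        S = H @ P @ H.T + noise**2
        K = P @ H.T / S
        m = m + (K * (yk - H @ m)).ravel()                    # update
        P = P - K @ H @ P
    preds = []
    for _ in range(horizon):       # extrapolate past the last observation
        A = expm(F * dt)
        m, P = A @ m, A @ P @ A.T + (Pinf - A @ Pinf @ A.T)
        preds.append((m[0], P[0, 0] + noise**2))
    return preds

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 10, 80))
y = np.sin(t) + 0.1 * rng.standard_normal(80)
print(kalman_forecast(t, y)[:3])
```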

DeepTIMe: Deep Time-Index Meta-Learning for Non-Stationary Time-Series Forecasting

This paper proposes DeepTIMe, a deep time-index based model trained via a meta-learning formulation that overcomes the limitations of existing time-index models, yielding an efficient and accurate forecasting model that achieves results competitive with state-of-the-art methods.
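A much-simplified sketch of the time-index mechanism: fit a ridge head on features of the normalized time index over the lookback window, then extrapolate the index into the forecast horizon. DeepTIMe meta-learns the feature extractor across windows; the fixed random Fourier features here are only a stand-in.

```python
import numpy as np

def random_fourier_features(t, n_feats=64, scale=1.0, seed=0):
    """Map a scalar time index to random Fourier features (an INR-style basis)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, scale, n_feats)
    b = rng.uniform(0.0, 2 * np.pi, n_feats)
    return np.cos(np.outer(t, w) + b)

def fit_forecast(y_lookback, horizon, lam=1e-2):
    L = len(y_lookback)
    t = np.arange(L + horizon) / L          # normalized time index
    Phi = random_fourier_features(t)
    Phi_in, Phi_out = Phi[:L], Phi[L:]
    # Closed-form ridge regression: the per-window adaptation step
    w = np.linalg.solve(Phi_in.T @ Phi_in + lam * np.eye(Phi.shape[1]),
                        Phi_in.T @ y_lookback)
    return Phi_out @ w

y = np.sin(np.linspace(0, 6 * np.pi, 96))
print(fit_forecast(y, horizon=8))
```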

Predictive Whittle networks for time series

A novel Whittle forecasting loss is proposed that uses predictive likelihoods to guide the training of the neural forecasting component and allows a transformation back into the time domain, providing the necessary feedback on when the model's predictions may become erratic.
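The frequency-domain likelihood at the core of such a loss can be written in a few lines: the classical Whittle negative log-likelihood of a series under a model spectral density. The paper embeds a predictive variant in a neural pipeline; the flat log-spectrum below is just a white-noise example.

```python
import numpy as np

def whittle_nll(y, log_spectrum):
    """Whittle negative log-likelihood of y under a model spectral density.

    I(w_k) is the periodogram; sum_k [log S(w_k) + I(w_k) / S(w_k)]
    approximates the Gaussian likelihood in the frequency domain.
    """
    n = len(y)
    I = np.abs(np.fft.rfft(y))**2 / n      # periodogram
    S = np.exp(log_spectrum)               # model spectral density
    k = slice(1, n // 2)                   # drop the DC bin
    return np.sum(np.log(S[k]) + I[k] / S[k])

rng = np.random.default_rng(2)
y = rng.standard_normal(256)
flat = np.zeros(len(np.fft.rfftfreq(256)))  # log-spectrum of white noise
print(whittle_nll(y, flat))
```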

DynaConF: Dynamic Forecasting of Non-Stationary Time-Series

This work proposes a new method to model non-stationary conditional distributions over time by cleanly decoupling stationary conditional-distribution modeling from non-stationary dynamics modeling.
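A toy illustration of the decoupling, under the assumption that a scalar example suffices: the conditional model y_t | y_{t-1} keeps a fixed form while its coefficient drifts over time and is tracked online. DynaConF's actual models are neural; this Kalman-filter version only shows the separation of the two parts.

```python
import numpy as np

def tv_ar1_filter(y, q=1e-3, r=0.1):
    """Track a time-varying AR(1) coefficient with a random-walk state.

    The conditional model y_t | y_{t-1} ~ N(a_t * y_{t-1}, r) is fixed in form
    (the 'stationary' part), while a_t drifts as a random walk (the
    'non-stationary dynamics') and is inferred by a scalar Kalman filter.
    """
    a, P = 0.0, 1.0
    coefs = []
    for t in range(1, len(y)):
        P += q                              # predict: a_t = a_{t-1} + noise
        H = y[t - 1]                        # observation 'matrix' is y_{t-1}
        K = P * H / (H * P * H + r)
        a += K * (y[t] - H * a)             # update with the innovation
        P *= (1 - K * H)
        coefs.append(a)
    return np.array(coefs)

rng = np.random.default_rng(3)
true_a = np.linspace(0.9, -0.5, 300)        # slowly changing dynamics
y = np.zeros(300)
for t in range(1, 300):
    y[t] = true_a[t] * y[t - 1] + 0.3 * rng.standard_normal()
print(tv_ar1_filter(y)[[50, 150, 290]])
```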

Time Series Forecasting Using Manifold Learning

A three-tier numerical framework based on manifold learning is proposed for forecasting high-dimensional time series, with comparisons against the Principal Component Analysis algorithm as well as the naive random-walk model and MVAR and GPR models trained and applied directly in the high-dimensional space.
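A linear stand-in for the three-tier scheme, assuming PCA in place of nonlinear manifold learning and AR(1) in place of the paper's MVAR/GPR predictors: embed the high-dimensional series, forecast each latent coordinate, then lift the forecasts back.

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_forecast(X, n_components=3, horizon=5):
    pca = PCA(n_components=n_components)
    Z = pca.fit_transform(X)                    # (T, d) -> (T, k) embedding
    Z_fc = np.empty((horizon, n_components))
    for j in range(n_components):
        z = Z[:, j]
        a = (z[:-1] @ z[1:]) / (z[:-1] @ z[:-1])  # least-squares AR(1) coef
        cur = z[-1]
        for h in range(horizon):
            cur *= a
            Z_fc[h, j] = cur
    return pca.inverse_transform(Z_fc)          # lift back to ambient space

rng = np.random.default_rng(4)
t = np.arange(200)
latent = np.column_stack([np.sin(0.1 * t), np.cos(0.07 * t)])
X = latent @ rng.normal(size=(2, 30)) + 0.05 * rng.standard_normal((200, 30))
print(pca_forecast(X).shape)                    # (5, 30)
```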

Gaussian Processes for Hierarchical Time Series Forecasting

This work proposes a covariance mixture that allows the covariance matrices of the Gaussian Processes to share parameters and learn dependencies between series, reducing the overall number of parameters required; it shows significant, consistent improvements across hierarchical levels for all datasets.
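One way to realize shared parameters plus learned cross-series dependencies is a coregionalization-style construction: a single temporal kernel shared by all series, mixed across series by a small PSD matrix. This is in the spirit of, but not identical to, the paper's covariance mixture.

```python
import numpy as np

def multi_series_cov(K_time, B):
    """Joint covariance for m series over n time points.

    K_time (n x n) is one temporal kernel shared by all series; B (m x m)
    mixes it across series, so dependencies between series live in B while
    kernel hyperparameters are shared -- far fewer parameters than m
    independent GPs.
    """
    return np.kron(B, K_time)

t = np.linspace(0, 1, 50)
K_time = np.exp(-0.5 * (t[:, None] - t[None, :])**2 / 0.1**2)
W = np.array([[1.0, 0.0], [0.8, 0.5]])      # hypothetical mixing weights
B = W @ W.T + 0.05 * np.eye(2)              # PSD cross-series covariance
print(multi_series_cov(K_time, B).shape)    # (100, 100)
```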

References

Forecasting Time Series With Complex Seasonal Patterns Using Exponential Smoothing

An innovations state space modeling framework is introduced for forecasting complex seasonal time series such as those with multiple seasonal periods, high-frequency seasonality, non-integer seasonality, and dual-calendar effects.
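The trigonometric treatment of complex seasonality can be previewed with plain least squares on Fourier regressors for two periods at once; the paper embeds such terms in an innovations state space model with exponential smoothing, which this static version omits.

```python
import numpy as np

def fourier_terms(t, period, K):
    """Trigonometric seasonal regressors; non-integer periods work too."""
    cols = []
    for k in range(1, K + 1):
        cols += [np.sin(2 * np.pi * k * t / period),
                 np.cos(2 * np.pi * k * t / period)]
    return np.column_stack(cols)

# Hourly-style data with daily (24) and weekly (168) seasonality
rng = np.random.default_rng(5)
t = np.arange(24 * 7 * 8, dtype=float)
y = (2 * np.sin(2 * np.pi * t / 24) + np.sin(2 * np.pi * t / 168)
     + 0.3 * rng.standard_normal(t.size))
X = np.column_stack([np.ones_like(t),
                     fourier_terms(t, 24.0, K=3),
                     fourier_terms(t, 168.0, K=2)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
t_new = t[-1] + 1 + np.arange(48.0)          # two-day forecast horizon
X_new = np.column_stack([np.ones_like(t_new),
                         fourier_terms(t_new, 24.0, K=3),
                         fourier_terms(t_new, 168.0, K=2)])
print((X_new @ beta)[:5])
```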

Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series

A novel method is presented for Gaussian process modeling in one dimension where the computational requirements scale linearly with the size of the data set; it is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond.
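A minimal usage sketch, assuming the celerite2 package (the successor implementation of this method): kernels built from semiseparable terms give linear-time factorization and likelihood evaluation on irregularly sampled data.

```python
import numpy as np
import celerite2
from celerite2 import terms

# A stochastically driven, damped harmonic oscillator term; the semiseparable
# structure of such kernels is what gives celerite its O(n) solves.
kernel = terms.SHOTerm(sigma=1.0, rho=5.0, Q=0.3)

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 20, 500))         # irregular sampling is fine
yerr = 0.1 * np.ones_like(t)
y = np.sin(t) + yerr * rng.standard_normal(t.size)

gp = celerite2.GaussianProcess(kernel, mean=0.0)
gp.compute(t, yerr=yerr)                     # linear-time factorization
print("log likelihood:", gp.log_likelihood(y))
t_pred = np.linspace(20, 25, 50)
mu, var = gp.predict(y, t=t_pred, return_var=True)
print(mu[:3], var[:3])
```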

Automatic Time Series Forecasting: The forecast Package for R

Two automatic forecasting algorithms that have been implemented in the forecast package for R, based on the innovations state space models that underlie exponential smoothing methods, are described.

Gaussian Processes for Machine Learning

The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.

Optimal Forecast Reconciliation for Hierarchical and Grouped Time Series Through Trace Minimization

A new forecast reconciliation approach is proposed that incorporates the information from a full covariance matrix of forecast errors to obtain a set of coherent forecasts; it minimizes the mean squared error of the coherent forecasts across the entire collection of time series under the assumption of unbiasedness.
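The reconciliation itself is a single linear map. A numpy sketch of the MinT formula, y_tilde = S (S' W^{-1} S)^{-1} S' W^{-1} y_hat, on a toy two-series hierarchy:

```python
import numpy as np

def mint_reconcile(y_hat, S, W):
    """Trace-minimization (MinT) reconciliation of base forecasts.

    y_hat: base forecasts for all levels (length m); S: (m x b) summing
    matrix mapping the b bottom-level series to every aggregate; W: (m x m)
    covariance of base forecast errors. Returns the minimum-trace unbiased
    linear reconciliation.
    """
    Winv = np.linalg.inv(W)
    P = np.linalg.solve(S.T @ Winv @ S, S.T @ Winv)
    return S @ (P @ y_hat)

# Two bottom series and their total: y_total = y_A + y_B
S = np.array([[1.0, 1.0],    # total
              [1.0, 0.0],    # series A
              [0.0, 1.0]])   # series B
y_hat = np.array([10.5, 4.0, 5.0])    # incoherent: 4 + 5 != 10.5
W = np.diag([1.0, 0.5, 0.5])          # e.g. estimated error covariance
print(mint_reconcile(y_hat, S, W))    # coherent: total = A + B
```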

Scalable Variational Bayesian Kernel Selection for Sparse Gaussian Process Regression

Although it is computationally challenging to jointly optimize a large number of hyperparameters when many kernels are evaluated simultaneously by the VBKS algorithm, it is shown that the variational lower bound of the log-marginal likelihood can be decomposed into an additive form such that each term depends only on a disjoint subset of the variational variables and can thus be optimized independently.
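A much-simplified, non-variational analogue of kernel selection, assuming sklearn's dense GPs: score a small candidate set by optimized log marginal likelihood and normalize into kernel weights. VBKS instead works with sparse GPs and a variational bound whose additive decomposition lets each kernel's variables be optimized independently at scale.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    RBF, Matern, RationalQuadratic, WhiteKernel)

rng = np.random.default_rng(7)
X = np.sort(rng.uniform(0, 10, 60))[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

candidates = {
    "RBF": RBF() + WhiteKernel(),
    "Matern32": Matern(nu=1.5) + WhiteKernel(),
    "RQ": RationalQuadratic() + WhiteKernel(),
}
log_evidence = {}
for name, k in candidates.items():
    gp = GaussianProcessRegressor(kernel=k).fit(X, y)
    log_evidence[name] = gp.log_marginal_likelihood_value_

# Normalize into posterior-style kernel weights (uniform prior over kernels)
lse = np.logaddexp.reduce(list(log_evidence.values()))
weights = {n: np.exp(v - lse) for n, v in log_evidence.items()}
print(weights)
```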

Fast Direct Methods for Gaussian Processes

This work shows that, for the most commonly used covariance functions, the covariance matrix C can be hierarchically factored into a product of block low-rank updates of the identity matrix, yielding an O(n log² n) algorithm for inversion; the factorization also enables the evaluation of the determinant det(C), permitting the direct calculation of probabilities in high dimensions under fairly broad assumptions on the kernel defining C.
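For orientation, the two quantities the factorization accelerates are the linear solve C^{-1} y and log det C. A faithful hierarchical (HODLR) implementation is beyond a short sketch, so this shows the dense O(n³) Cholesky baseline that the O(n log² n) method replaces.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Dense baseline for the two quantities needed by a GP likelihood:
# C^{-1} y (for the solve) and log det C (for the normalization term).
n = 500
t = np.sort(np.random.default_rng(8).uniform(0, 10, n))
C = np.exp(-0.5 * (t[:, None] - t[None, :])**2) + 1e-6 * np.eye(n)
y = np.sin(t)

L, low = cho_factor(C, lower=True)
alpha = cho_solve((L, low), y)             # C^{-1} y
logdet = 2.0 * np.log(np.diag(L)).sum()    # log det C
print(logdet)
```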

Bayesian optimization for automated model selection

This work presents a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices, based on Bayesian optimization in model space, and constructs a novel kernel between models to explain a given dataset.