Time Series Forecasting with Gaussian Processes Needs Priors

@inproceedings{Corani2021TimeSF,
  title={Time Series Forecasting with Gaussian Processes Needs Priors},
  author={Giorgio Corani and Alessio Benavoli and Marco Zaffalon},
  booktitle={ECML/PKDD},
  year={2021}
}
Automatic forecasting is the task of receiving a time series and returning a forecast for the next time steps without any human intervention. Gaussian Processes (GPs) are a powerful tool for modeling time series, but so far there are no competitive approaches for automatic forecasting based on GPs. We propose practical solutions to two problems: automatic selection of the optimal kernel and reliable estimation of the hyperparameters. We propose a fixed composition of kernels, which contains…
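A minimal sketch of the fixed-kernel-composition idea, using scikit-learn as a stand-in. Note that scikit-learn fits hyperparameters by maximizing the marginal likelihood (ML-II), whereas the paper places priors on the hyperparameters and uses MAP estimation; the specific kernel sum below is illustrative, not the paper's exact composition.

```python
# Sketch: GP forecasting with a fixed composite kernel (trend + seasonality
# + noise). Illustrative only; the paper's method additionally uses
# hyperparameter priors with MAP estimation, which sklearn does not provide.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    RBF, ExpSineSquared, WhiteKernel, ConstantKernel as C)

rng = np.random.default_rng(0)
t = np.arange(120, dtype=float)
y = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(t.size)

kernel = (C(1.0) * RBF(length_scale=50.0)          # smooth long-term trend
          + C(1.0) * ExpSineSquared(length_scale=1.0, periodicity=12.0,
                                    periodicity_bounds="fixed")  # seasonality
          + WhiteKernel(noise_level=0.1))          # observation noise

gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t[:, None], y)

t_future = np.arange(120, 132, dtype=float)[:, None]
mean, std = gp.predict(t_future, return_std=True)  # forecast with uncertainty
```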

Correlated Product of Experts for Sparse Gaussian Process Regression

TLDR
This paper focuses on GP regression tasks and proposes a new approach that aggregates predictions from several local, correlated experts; it handles general kernel functions and multiple variables, and its time and space complexity is linear in the number of experts and data samples, making the approach highly scalable.
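A minimal sketch of the product-of-experts idea this entry builds on: independent local GP experts whose predictive Gaussians are combined by precision weighting. The between-expert correlations that the paper models are omitted here.

```python
# Sketch: naive product-of-experts aggregation of local GPs.
# The paper's contribution (modeling correlations between experts)
# is deliberately left out of this simplified version.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(0, 10, 600))[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(600)

kernel = RBF(1.0) + WhiteKernel(0.01)
experts = [GaussianProcessRegressor(kernel=kernel).fit(Xc, yc)
           for Xc, yc in zip(np.array_split(X, 4), np.array_split(y, 4))]

X_test = np.linspace(0, 10, 50)[:, None]
mus, stds = zip(*(e.predict(X_test, return_std=True) for e in experts))
precisions = np.array([1.0 / s**2 for s in stds])  # per-expert precision
prec = precisions.sum(axis=0)                      # combined precision
mean = (precisions * np.array(mus)).sum(axis=0) / prec
var = 1.0 / prec                                   # combined variance
```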

State Space Approximation of Gaussian Processes for Time Series Forecasting

TLDR
The SS representation is applied to time series forecasting, showing that SS models provide performance comparable with that of the full GP and better than state-of-the-art models (ARIMA, ETS).
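A sketch of the state-space idea, assuming a Matérn-3/2 kernel on a regular grid: the GP is rewritten as a linear-Gaussian state-space model and Kalman-filtered in O(n) rather than the O(n³) of exact GP inference. The hyperparameter values are illustrative and held fixed.

```python
# Sketch: Matern-3/2 GP as a 2-dimensional state-space model,
# filtered and forecast with a Kalman recursion.
import numpy as np
from scipy.linalg import expm

sigma2, ell, noise, dt = 1.0, 2.0, 0.1, 1.0
lam = np.sqrt(3.0) / ell
F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])  # SDE drift matrix
H = np.array([[1.0, 0.0]])                         # observe first state
Pinf = np.diag([sigma2, lam**2 * sigma2])          # stationary covariance
A = expm(F * dt)                                   # discrete transition
Q = Pinf - A @ Pinf @ A.T                          # discrete process noise

def kalman_filter(y):
    m, P = np.zeros(2), Pinf.copy()
    for yt in y:
        m, P = A @ m, A @ P @ A.T + Q              # predict
        S = H @ P @ H.T + noise                    # innovation variance
        K = P @ H.T / S                            # Kalman gain
        m = m + (K * (yt - H @ m)).ravel()         # update mean
        P = P - K @ H @ P                          # update covariance
    return m, P

rng = np.random.default_rng(2)
y = np.sin(0.3 * np.arange(100)) + np.sqrt(noise) * rng.standard_normal(100)
m, P = kalman_filter(y)
for _ in range(5):                                 # 5-step-ahead forecast:
    m, P = A @ m, A @ P @ A.T + Q                  # iterate prediction only
print((H @ m).item(), (H @ P @ H.T + noise).item())  # forecast mean, variance
```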

DynaConF: Dynamic Forecasting of Non-Stationary Time-Series

Deep learning models have shown impressive results in a variety of time series forecasting tasks, where modeling the conditional distribution of the future given the past is the essence. However, …

Time Series Forecasting Using Manifold Learning

TLDR
A three-tier numerical framework based on manifold learning is presented for forecasting high-dimensional time series, with comparisons against the Principal Component Analysis algorithm as well as the naive random walk model and MVAR and GPR models trained and applied directly in the high-dimensional space.
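A sketch of the reduce-forecast-lift pattern the entry describes, with PCA standing in for the manifold-learning embedding and a per-coordinate AR(1) least-squares fit standing in for the MVAR/GPR forecasters.

```python
# Sketch: (1) embed high-dimensional series into a low-dimensional space,
# (2) forecast in the latent space, (3) lift the forecast back.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
T, D = 300, 50
latent = np.cumsum(rng.standard_normal((T, 2)), axis=0)  # 2 hidden factors
mixing = rng.standard_normal((2, D))
series = latent @ mixing + 0.05 * rng.standard_normal((T, D))

pca = PCA(n_components=2)
Z = pca.fit_transform(series)          # tier 1: embed to low dimension

# Tier 2: AR(1) per latent coordinate via least squares, 1-step forecast.
coef = (Z[1:] * Z[:-1]).sum(axis=0) / (Z[:-1] ** 2).sum(axis=0)
z_next = coef * Z[-1]

# Tier 3: lift the latent forecast back to the original space.
x_next = pca.inverse_transform(z_next[None, :])
```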

References

Forecasting Time Series With Complex Seasonal Patterns Using Exponential Smoothing

An innovations state space modeling framework is introduced for forecasting complex seasonal time series such as those with multiple seasonal periods, high-frequency seasonality, non-integer seasonality, and dual-calendar effects.
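This framework (the basis of TBATS-style models) represents complex seasonality trigonometrically. A much-simplified stand-in for that idea, not the innovations state space model itself: regression on Fourier terms for two seasonal periods, one of them non-integer.

```python
# Sketch: multiple (and non-integer) seasonalities via Fourier-term regression.
import numpy as np

def fourier_terms(t, period, K):
    """sin/cos features of orders 1..K for a (possibly non-integer) period."""
    return np.column_stack([f(2 * np.pi * k * t / period)
                            for k in range(1, K + 1)
                            for f in (np.sin, np.cos)])

rng = np.random.default_rng(4)
t = np.arange(400, dtype=float)
y = (np.sin(2 * np.pi * t / 7) + 0.5 * np.sin(2 * np.pi * t / 30.44)
     + 0.2 * rng.standard_normal(t.size))

X = np.column_stack([np.ones_like(t),
                     fourier_terms(t, 7.0, 3),     # weekly seasonality
                     fourier_terms(t, 30.44, 3)])  # non-integer "monthly"
beta, *_ = np.linalg.lstsq(X, y, rcond=None)       # least-squares fit

t_new = np.arange(400, 430, dtype=float)
X_new = np.column_stack([np.ones_like(t_new),
                         fourier_terms(t_new, 7.0, 3),
                         fourier_terms(t_new, 30.44, 3)])
forecast = X_new @ beta
```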

Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series

TLDR
A novel method for Gaussian process modeling in one dimension is presented whose computational requirements scale linearly with the size of the data set; it is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond.
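This method restricts kernels to sums of exponential terms so that inference costs O(n); it is implemented in the celerite/celerite2 packages. A hedged usage sketch, assuming celerite2 is installed; the parameter values are illustrative.

```python
# Sketch: O(n) GP inference with celerite2 (assumed installed).
import numpy as np
import celerite2
from celerite2 import terms

rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0, 100, 500))
y = np.sin(2 * np.pi * t / 20) + 0.1 * rng.standard_normal(t.size)

# A stochastically driven damped harmonic oscillator term.
kernel = terms.SHOTerm(sigma=1.0, rho=20.0, Q=1.0)
gp = celerite2.GaussianProcess(kernel, mean=0.0)
gp.compute(t, yerr=0.1)                    # O(n) factorization
print(gp.log_likelihood(y))                # O(n) marginal likelihood

t_pred = np.linspace(0, 110, 200)
mu, var = gp.predict(y, t=t_pred, return_var=True)
```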

Automatic Time Series Forecasting: The forecast Package for R

TLDR
Two automatic forecasting algorithms implemented in the forecast package for R are described, based on the innovations state space models that underlie exponential smoothing methods.
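The forecast package is R code; a hedged Python analog of the same idea (a swapped-in stand-in, not the package's algorithm), using statsmodels: fit a small grid of ARIMA orders and keep the model with the lowest AIC.

```python
# Sketch: AIC-driven automatic ARIMA order selection, in the spirit of
# auto.arima. Simplified stand-in; the R package's search is more elaborate.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
y = np.cumsum(rng.standard_normal(200))   # a random-walk-like series

best, best_order = None, None
for p in range(3):
    for q in range(3):
        try:
            fit = ARIMA(y, order=(p, 1, q)).fit()
        except Exception:
            continue                      # skip non-converging fits
        if best is None or fit.aic < best.aic:
            best, best_order = fit, (p, 1, q)

print(best_order, best.aic)
forecast = best.forecast(steps=10)        # 10-step-ahead point forecast
```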

Gaussian Processes for Machine Learning

TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
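The book's Cholesky-based recipe for exact GP regression (in the spirit of its Algorithm 2.1) is compact enough to sketch directly; the RBF kernel and noise level here are illustrative.

```python
# Sketch: exact GP regression via Cholesky factorization, returning the
# predictive mean, predictive covariance, and log marginal likelihood.
import numpy as np

def gp_regression(X, y, X_star, kernel, noise):
    K = kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)                       # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    K_s = kernel(X, X_star)
    mean = K_s.T @ alpha                            # predictive mean
    v = np.linalg.solve(L, K_s)
    cov = kernel(X_star, X_star) - v.T @ v          # predictive covariance
    lml = (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
           - 0.5 * len(X) * np.log(2 * np.pi))      # log marginal likelihood
    return mean, cov, lml

def rbf(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X).ravel()
mean, cov, lml = gp_regression(X, y, np.array([[5.5]]), rbf, 1e-4)
```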

Optimal Forecast Reconciliation for Hierarchical and Grouped Time Series Through Trace Minimization

TLDR
A new forecast reconciliation approach is proposed that incorporates the information from a full covariance matrix of forecast errors when obtaining a set of coherent forecasts; it minimizes the mean squared error of the coherent forecasts across the entire collection of time series under the assumption of unbiasedness.
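The MinT estimator reduces to one formula: with summing matrix S, base forecasts ŷ, and forecast-error covariance W, the coherent forecasts are S(SᵀW⁻¹S)⁻¹SᵀW⁻¹ŷ. A toy two-level hierarchy (total = A + B), with W assumed diagonal purely for illustration:

```python
# Sketch: trace-minimization (MinT) reconciliation on a toy hierarchy.
import numpy as np

S = np.array([[1.0, 1.0],    # total
              [1.0, 0.0],    # series A
              [0.0, 1.0]])   # series B
yhat = np.array([105.0, 60.0, 40.0])   # incoherent base forecasts
W = np.diag([2.0, 1.0, 1.0])           # assumed error covariance

Winv = np.linalg.inv(W)
P = np.linalg.solve(S.T @ Winv @ S, S.T @ Winv)
y_tilde = S @ P @ yhat                 # coherent: total == A + B
print(y_tilde, y_tilde[0] - y_tilde[1:].sum())  # residual ~ 0
```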

Scalable Variational Bayesian Kernel Selection for Sparse Gaussian Process Regression

TLDR
Though it is computationally challenging to jointly optimize a large number of hyperparameters when many kernels are evaluated simultaneously by the VBKS algorithm, it is shown that the variational lower bound of the log marginal likelihood can be decomposed into an additive form such that each additive term depends only on a disjoint subset of the variational variables and can thus be optimized independently.

Fast Direct Methods for Gaussian Processes

TLDR
This work shows that, for the most commonly used covariance functions, the matrix C can be hierarchically factored into a product of block low-rank updates of the identity matrix, yielding an O(n log² n) algorithm for inversion; the factorization also enables evaluation of the determinant det(C), permitting the direct calculation of probabilities in high dimensions under fairly broad assumptions on the kernel defining K.

Bayesian optimization for automated model selection

TLDR
This work presents a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices, based on Bayesian optimization in model space, and constructs a novel kernel between models to explain a given dataset.
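A much-simplified stand-in for the paper's search (greedy scoring rather than Bayesian optimization in model space): fit a few candidate kernels and keep the one with the highest fitted log marginal likelihood.

```python
# Sketch: discrete kernel selection by log marginal likelihood.
# The paper instead searches an infinite kernel space with Bayesian
# optimization and a kernel between models; this greedy loop only
# illustrates the scoring criterion.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    RBF, ExpSineSquared, RationalQuadratic, WhiteKernel)

rng = np.random.default_rng(7)
X = np.linspace(0, 10, 80)[:, None]
y = np.sin(2 * np.pi * X.ravel() / 3.0) + 0.1 * rng.standard_normal(80)

candidates = {
    "rbf": RBF(1.0) + WhiteKernel(0.1),
    "periodic": ExpSineSquared(1.0, 3.0) + WhiteKernel(0.1),
    "rq": RationalQuadratic(1.0, 1.0) + WhiteKernel(0.1),
}
scores = {}
for name, k in candidates.items():
    gp = GaussianProcessRegressor(kernel=k).fit(X, y)
    scores[name] = gp.log_marginal_likelihood_value_  # fitted LML
print(max(scores, key=scores.get), scores)
```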