VQ-AR: Vector Quantized Autoregressive Probabilistic Time Series Forecasting

Kashif Rasul, Young-Jin Park, Max Nihlén Ramström, KyungHyun Kim
Time series models aim to predict the future accurately from the past; the resulting forecasts drive important downstream tasks such as business decision making. In practice, deep learning-based time series models come in many forms, but at a high level they learn some continuous representation of the past and use it to output point or probabilistic forecasts. In this paper, we introduce a novel autoregressive architecture, VQ-AR, which instead learns a discrete set of representations that…
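The abstract's core idea, replacing a continuous representation with a discrete set of codes, is vector quantization: each continuous vector is snapped to its nearest entry in a learned codebook. A minimal sketch of that lookup step (the codebook here is random for illustration; the paper's actual training procedure and codebook size are not specified in this excerpt):

```python
import numpy as np

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))  # K=8 code vectors of dimension 4 (illustrative sizes)

def quantize(z, codebook):
    """Map each continuous representation to its nearest codebook entry."""
    # squared Euclidean distance from every input vector to every code
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d.argmin(axis=1)          # discrete token per input vector
    return codebook[idx], idx       # quantized vectors and their indices

z = rng.normal(size=(5, 4))         # 5 continuous representations
zq, idx = quantize(z, codebook)
```

An autoregressive model can then predict the sequence of discrete indices `idx` rather than the continuous vectors themselves.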

The Effectiveness of Discretization in Forecasting: An Empirical Study on Neural Time Series Models

It is found that binning almost always improves performance (compared to using normalized real-valued inputs), but that the particular type of binning chosen is of lesser importance.
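The binning described here replaces normalized real-valued inputs with discrete bin indices. A minimal sketch of one common variant, quantile binning, where bin edges are placed at empirical quantiles so each bin holds roughly the same number of observations (the bin count and edge placement are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
series = rng.normal(size=100)       # toy real-valued time series
n_bins = 10

# interior bin edges at the 10th, 20th, ..., 90th percentiles
edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])

# each value becomes an integer token in {0, ..., n_bins - 1}
tokens = np.digitize(series, edges)

# to map tokens back to values, e.g. use each bin's mean
centers = np.array([series[tokens == b].mean() for b in range(n_bins)])
```

The model then consumes `tokens` as categorical inputs; the quoted finding is that which binning scheme is used matters less than binning at all.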

Deep State Space Models for Time Series Forecasting

A novel approach to probabilistic time series forecasting that combines state space models with deep learning by parametrizing a per-time-series linear state space model with a jointly-learned recurrent neural network, which compares favorably to the state-of-the-art.

Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting

Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting

TimeGrad, an autoregressive model for multivariate probabilistic time series forecasting which samples from the data distribution at each time step by estimating its gradient, is proposed.

Modeling Long- and Short-Term Temporal Patterns with Deep Neural Networks

A novel deep learning framework, the Long- and Short-term Time-series Network (LSTNet), addresses the open challenge of multivariate time series forecasting, using a Convolutional Neural Network and a Recurrent Neural Network to extract short-term local dependency patterns among variables and to discover long-term patterns in time series trends.

Normalizing Kalman Filters for Multivariate Time Series Analysis

This paper tackles the modelling of large, complex and multivariate time series panels in a probabilistic setting by augmenting state space models with normalizing flows, and demonstrates competitiveness against state-of-the-art deep learning methods on the tasks of forecasting real world data and handling varying levels of missing data.

High-Dimensional Multivariate Forecasting with Low-Rank Gaussian Copula Processes

This work proposes to combine an RNN-based time series model with a Gaussian copula process output model with a low-rank covariance structure to reduce the computational complexity and handle non-Gaussian marginal distributions.

Conformal Time-series Forecasting

This work extends the inductive conformal prediction framework to the time-series forecasting setup and proposes a lightweight uncertainty estimation procedure that yields uncertainty intervals with theoretical guarantees on frequentist coverage for any multi-horizon forecast predictor and any dataset.
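The inductive (split) conformal recipe summarized above can be sketched for a single horizon: hold out a calibration set, compute nonconformity scores for the base forecaster on it, and widen any new point forecast by the appropriate score quantile. The forecaster and noise model here are stand-ins, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)

# calibration set: true values and the base forecaster's predictions for them
y_cal = rng.normal(size=200)
yhat_cal = y_cal + rng.normal(scale=0.5, size=200)   # imperfect toy forecaster

# nonconformity scores: absolute residuals on the calibration set
scores = np.abs(y_cal - yhat_cal)

alpha = 0.1  # target 90% coverage
# finite-sample-corrected quantile level: ceil((n+1)(1-alpha)) / n
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)

# interval for a new point forecast
yhat_new = 0.3
lo, hi = yhat_new - q, yhat_new + q
```

Under exchangeability, the interval `[lo, hi]` covers the true value with probability at least `1 - alpha`, regardless of what model produced `yhat_new`; the multi-horizon extension in the paper calibrates per forecast step.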

A Multi-Horizon Quantile Recurrent Forecaster

We propose a framework for general probabilistic multi-step time series regression. Specifically, we exploit the expressiveness and temporal nature of Recurrent Neural Networks, the nonparametric…