DynaConF: Dynamic Forecasting of Non-Stationary Time-Series

@article{Liu2022DynaConFDF,
  title={DynaConF: Dynamic Forecasting of Non-Stationary Time-Series},
  author={Siqi Liu and Andreas M. Lehrmann},
  journal={ArXiv},
  year={2022},
  volume={abs/2209.08411}
}
Deep learning models have shown impressive results in a variety of time series forecasting tasks, where modeling the conditional distribution of the future given the past is the essence. However, when this conditional distribution is non-stationary, it poses challenges for these models to learn consistently and to predict accurately. In this work, we propose a new method to model non-stationary conditional distributions over time by clearly decoupling stationary conditional distribution…
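
For intuition only, here is a minimal PyTorch sketch of the decoupling idea described in the abstract: a time-invariant conditional module paired with a separately evolving latent state. The module names, shapes, and the GRU-based state update are illustrative assumptions, not the DynaConF architecture from the paper.

```python
# Hedged sketch of the decoupling idea only; all names and shapes are
# assumptions for illustration, not the authors' exact DynaConF model.
import torch
import torch.nn as nn

class StationaryConditional(nn.Module):
    """Time-invariant map from (past window, latent state) to a forecast distribution."""
    def __init__(self, window: int, latent_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(window + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # mean and log-scale of a Gaussian over the next value
        )

    def forward(self, past, z):
        mu, log_sigma = self.net(torch.cat([past, z], dim=-1)).chunk(2, dim=-1)
        return torch.distributions.Normal(mu, log_sigma.exp())

class NonStationaryDynamics(nn.Module):
    """Latent state that drifts over time, capturing the non-stationary part."""
    def __init__(self, latent_dim: int):
        super().__init__()
        self.cell = nn.GRUCell(1, latent_dim)

    def forward(self, z, y_last):
        return self.cell(y_last, z)

# Usage: roll the dynamics forward while the conditional module stays fixed.
window, latent_dim = 24, 8
cond = StationaryConditional(window, latent_dim)
dyn = NonStationaryDynamics(latent_dim)
y = torch.randn(1, 200)                      # one toy series
z = torch.zeros(1, latent_dim)
losses = []
for t in range(window, 199):
    dist = cond(y[:, t - window:t], z)       # stationary conditional p(y_t | past, z_t)
    losses.append(-dist.log_prob(y[:, t:t + 1]).mean())
    z = dyn(z, y[:, t:t + 1])                # non-stationary state update
torch.stack(losses).mean().backward()        # joint training of both modules
```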


References

Showing 1-10 of 50 references

Multi-variate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows

This work models the multivariate temporal dynamics of time series with an autoregressive deep learning model in which the data distribution is represented by a conditioned normalizing flow, improving over the state of the art on standard metrics for many real-world data sets with several thousand interacting time series.
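
As a toy illustration of a conditioned normalizing flow (not the richer flow used in that paper, e.g. a RealNVP/MAF-style model), the following sketch conditions a single affine transform on the hidden state of an autoregressive RNN; all sizes and names are assumptions.

```python
# Minimal conditioned (affine) normalizing flow; a toy stand-in for the
# deeper flows used in the cited work.
import torch
import torch.nn as nn
from torch.distributions import Normal, TransformedDistribution
from torch.distributions.transforms import AffineTransform

class ConditionedFlow(nn.Module):
    def __init__(self, cond_dim: int, data_dim: int):
        super().__init__()
        self.conditioner = nn.Linear(cond_dim, 2 * data_dim)  # shift and log-scale
        self.data_dim = data_dim

    def forward(self, h):
        shift, log_scale = self.conditioner(h).chunk(2, dim=-1)
        base = Normal(torch.zeros(self.data_dim), torch.ones(self.data_dim))
        return TransformedDistribution(base, [AffineTransform(shift, log_scale.exp())])

# Condition the flow on the hidden state of an autoregressive RNN.
rnn = nn.GRU(input_size=4, hidden_size=16, batch_first=True)
flow = ConditionedFlow(cond_dim=16, data_dim=4)
x = torch.randn(8, 50, 4)                    # batch of 8 series, 50 steps, 4 dims
h, _ = rnn(x[:, :-1])                        # encode the past
dist = flow(h[:, -1])                        # p(x_t | past) for the last step
nll = -dist.log_prob(x[:, -1]).mean()        # maximize likelihood during training
```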

Deep State Space Models for Time Series Forecasting

A probabilistic time series forecasting approach that combines state space models with deep learning by parametrizing a per-time-series linear state space model with a jointly learned recurrent neural network; it compares favorably to the state of the art.
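
A rough sketch of that idea, with the state space reduced to a scalar level for brevity: an RNN maps covariates to per-step SSM parameters, and a Kalman filter computes the likelihood. The names and the scalar simplification are assumptions for illustration, not the cited model.

```python
# Hedged sketch: an RNN emits the parameters of a scalar linear-Gaussian SSM,
# and a Kalman filter turns them into a training objective.
import torch
import torch.nn as nn

rnn = nn.GRU(input_size=3, hidden_size=32, batch_first=True)
head = nn.Linear(32, 3)  # transition a_t, state-noise log q_t, obs-noise log r_t

def filter_series(covariates, y):
    """One-dimensional Kalman filter whose parameters are produced by the RNN."""
    h, _ = rnn(covariates)
    a, log_q, log_r = head(h).unbind(dim=-1)
    q, r = log_q.exp(), log_r.exp()
    m = torch.zeros(y.shape[0])      # filtered state mean
    p = torch.ones(y.shape[0])       # filtered state variance
    nll = 0.0
    for t in range(y.shape[1]):
        m_pred = a[:, t] * m                    # predict
        p_pred = a[:, t] ** 2 * p + q[:, t]
        s = p_pred + r[:, t]                    # innovation variance
        nll = nll - torch.distributions.Normal(m_pred, s.sqrt()).log_prob(y[:, t]).mean()
        k = p_pred / s                          # Kalman gain, then update
        m = m_pred + k * (y[:, t] - m_pred)
        p = (1 - k) * p_pred
    return nll / y.shape[1]

covariates, y = torch.randn(8, 50, 3), torch.randn(8, 50)
loss = filter_series(covariates, y)   # minimize to train rnn and head jointly
```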

Probabilistic Transformer For Time Series Analysis

Deep probabilistic methods that combine state-space models (SSMs) with transformer architectures are proposed; they use an attention mechanism to model non-Markovian dynamics in the latent space and avoid recurrent neural networks entirely.

DeepAR: Probabilistic Forecasting with Autoregressive Recurrent Networks

Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting

A temporal latent auto-encoder method that enables nonlinear factorization of multivariate time series, learned end-to-end with a deep latent-space forecasting model, and achieves state-of-the-art performance on many popular multivariate datasets.
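
A small sketch of that factorization pattern under assumed sizes: encode the high-dimensional cross-section into a low-dimensional latent factor, forecast in latent space with an RNN, and decode back. This is the general pattern only, not the cited architecture.

```python
# Hedged sketch of nonlinear factorization + latent-space forecasting;
# all modules and sizes are illustrative assumptions.
import torch
import torch.nn as nn

n_series, latent = 200, 8
encoder = nn.Sequential(nn.Linear(n_series, 64), nn.ReLU(), nn.Linear(64, latent))
decoder = nn.Sequential(nn.Linear(latent, 64), nn.ReLU(), nn.Linear(64, n_series))
forecaster = nn.GRU(latent, latent, batch_first=True)

x = torch.randn(16, 48, n_series)            # (batch, time, series)
z = encoder(x)                               # per-step latent factors
z_pred, _ = forecaster(z[:, :-1])            # predict the next latent from the past
x_pred = decoder(z_pred[:, -1])              # reconstruct the next cross-section
loss = (x_pred - x[:, -1]).pow(2).mean()     # end-to-end training objective
```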

Deep Rao-Blackwellised Particle Filters for Time Series Forecasting

This work addresses efficient inference and learning in switching Gaussian linear dynamical systems via a Rao-Blackwellised particle filter and a corresponding Monte Carlo objective; an auxiliary-variable approach with a decoder-type neural network allows for more complex non-linear emission models and multivariate observations.

Time-series forecasting with deep learning: a survey

This article surveys common encoder and decoder designs used in both one-step-ahead and multi-horizon time-series forecasting, describing how each model incorporates temporal information into its predictions.

High-Dimensional Multivariate Forecasting with Low-Rank Gaussian Copula Processes

This work proposes to combine an RNN-based time series model with a Gaussian copula process output model that has a low-rank covariance structure, reducing the computational complexity and handling non-Gaussian marginal distributions.
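
The low-rank covariance part can be sketched with PyTorch's LowRankMultivariateNormal; the copula transformation for non-Gaussian marginals is omitted here, and all sizes are assumptions.

```python
# Hedged sketch of the low-rank covariance output only: an RNN emits the mean,
# a rank-r factor, and a diagonal term of a Gaussian over all series.
import torch
import torch.nn as nn
from torch.distributions import LowRankMultivariateNormal

n_series, rank, hidden = 100, 5, 64
rnn = nn.GRU(input_size=n_series, hidden_size=hidden, batch_first=True)
head = nn.Linear(hidden, n_series * (rank + 2))  # mean, factor, diagonal per series

x = torch.randn(4, 30, n_series)                 # batch of 4 windows, 30 steps
h, _ = rnn(x[:, :-1])                            # encode the past
out = head(h[:, -1]).view(4, n_series, rank + 2)
loc = out[..., 0]
cov_factor = out[..., 1:rank + 1]                # (4, n_series, rank)
cov_diag = out[..., -1].exp() + 1e-4             # keep the diagonal positive
dist = LowRankMultivariateNormal(loc, cov_factor, cov_diag)
nll = -dist.log_prob(x[:, -1]).mean()            # roughly O(n * rank^2) instead of O(n^3)
```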

N-BEATS: Neural basis expansion analysis for interpretable time series forecasting

The proposed deep neural architecture based on backward and forward residual links and a very deep stack of fully-connected layers has a number of desirable properties, being interpretable, applicable without modification to a wide array of target domains, and fast to train.
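
A minimal sketch of that doubly residual (backcast/forecast) stacking with generic fully-connected blocks; the interpretable trend and seasonality bases of the full model are omitted, and all sizes are illustrative assumptions.

```python
# Hedged sketch of N-BEATS-style backward/forward residual stacking.
import torch
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, backcast: int, forecast: int, hidden: int = 256):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(backcast, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, backcast + forecast),
        )
        self.backcast = backcast

    def forward(self, x):
        out = self.fc(x)
        return out[:, :self.backcast], out[:, self.backcast:]  # backcast, forecast

class NBeatsSketch(nn.Module):
    def __init__(self, backcast: int, forecast: int, n_blocks: int = 4):
        super().__init__()
        self.blocks = nn.ModuleList(Block(backcast, forecast) for _ in range(n_blocks))

    def forward(self, x):
        forecast = 0.0
        for block in self.blocks:
            back, fore = block(x)
            x = x - back                # backward residual: remove what this block explained
            forecast = forecast + fore  # forward residual: accumulate partial forecasts
        return forecast

model = NBeatsSketch(backcast=96, forecast=24)
y_hat = model(torch.randn(32, 96))      # (32, 24) point forecasts
```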

Modeling Long- and Short-Term Temporal Patterns with Deep Neural Networks

A deep learning framework, the Long- and Short-term Time-series Network (LSTNet), addresses multivariate time series forecasting by using a convolutional neural network to extract short-term local dependency patterns among variables and a recurrent neural network to discover long-term patterns in time series trends.
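
A very small sketch of that convolution-plus-recurrence pattern; the skip-recurrent and autoregressive highway components of the original LSTNet are omitted, and all sizes are assumptions.

```python
# Hedged sketch: 1-D convolution for short-term local patterns,
# a GRU over the convolutional features for longer-term ones.
import torch
import torch.nn as nn

class TinyLSTNet(nn.Module):
    def __init__(self, n_series: int, conv_channels: int = 32, hidden: int = 64):
        super().__init__()
        self.conv = nn.Conv1d(n_series, conv_channels, kernel_size=6)
        self.gru = nn.GRU(conv_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_series)

    def forward(self, x):                              # x: (batch, time, n_series)
        c = torch.relu(self.conv(x.transpose(1, 2)))   # (batch, channels, time')
        _, h = self.gru(c.transpose(1, 2))             # recur over the conv features
        return self.head(h[-1])                        # next-step forecast per series

model = TinyLSTNet(n_series=8)
y_hat = model(torch.randn(16, 168, 8))   # e.g. a week of hourly data -> (16, 8)
```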