Corpus ID: 246652511

TACTiS: Transformer-Attentional Copulas for Time Series

@inproceedings{Drouin2022TACTiSTC,
  title={TACTiS: Transformer-Attentional Copulas for Time Series},
  author={Alexandre Drouin and {\'E}. Marcotte and Nicolas Chapados},
  booktitle={ICML},
  year={2022}
}
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance. However, the practical utility of such estimates is limited by how accurately they quantify predictive uncertainty. In this work, we address the problem of estimating the joint predictive distribution of high-dimensional multivariate time series. We propose a versatile method, based on the transformer architecture, that estimates joint distributions using an attention-based… 
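The abstract rests on the copula decomposition of a joint distribution: dependence is modeled separately from the marginals. A minimal numpy/scipy sketch of that idea with a plain Gaussian copula (illustrative only — not the paper's attention-based copula) looks like this:

```python
import numpy as np
from scipy import stats

# A copula separates the joint dependence structure from the marginal
# distributions: sample correlated uniforms, then push them through the
# inverse CDFs of whatever marginals each series requires.
rng = np.random.default_rng(0)

# Dependence: correlated standard normals mapped through Phi -> uniforms.
corr = np.array([[1.0, 0.8], [0.8, 1.0]])
z = rng.multivariate_normal(mean=np.zeros(2), cov=corr, size=5000)
u = stats.norm.cdf(z)                      # copula samples in [0, 1]^2

# Marginals: arbitrary per-series distributions via inverse-CDF sampling.
x1 = stats.gamma(a=2.0).ppf(u[:, 0])      # skewed marginal
x2 = stats.t(df=4).ppf(u[:, 1])           # fat-tailed marginal

# The joint sample (x1, x2) keeps the Gaussian-copula dependence even
# though the marginals are non-Gaussian; rank correlation survives the
# monotone marginal transforms.
rho, _ = stats.spearmanr(x1, x2)
print(round(rho, 2))
```

TACTiS replaces the fixed Gaussian dependence above with a copula parameterized by transformer attention, which is what makes the joint distribution flexible and learnable.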
Inference and Sampling for Archimax Copulas
TLDR
The proposed methods are, to the best of our knowledge, the first to allow highly flexible and scalable inference and sampling algorithms, enabling the increased use of Archimax copulas in practical settings.

References

SHOWING 1-10 OF 42 REFERENCES
Probabilistic Transformer For Time Series Analysis
TLDR
Deep probabilistic methods are proposed that combine state-space models (SSMs) with transformer architectures, using an attention mechanism to model non-Markovian dynamics in the latent space while avoiding recurrent neural networks entirely.
Stanza: A Nonlinear State Space Model for Probabilistic Inference in Non-Stationary Time Series
TLDR
Stanza strikes a balance between forecasting accuracy and probabilistic, interpretable inference for highly structured time series, matching deep LSTMs on real-world datasets, particularly for multi-step-ahead forecasting.
Multi-Time Attention Networks for Irregularly Sampled Time Series
TLDR
This work is motivated by the analysis of physiological time series data in electronic health records, which are sparse, irregularly sampled, and multivariate, and proposes a new deep learning framework that is called Multi-Time Attention Networks.
Multi-variate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows
TLDR
This work models the multivariate temporal dynamics of time series via an autoregressive deep learning model in which the data distribution is represented by a conditioned normalizing flow, improving over the state of the art on standard metrics for many real-world datasets with several thousand interacting time series.
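The core mechanism in that entry — a flow whose parameters are produced from a conditioning context — can be sketched with a single conditioned affine transform. The linear `conditioner` below is a hypothetical stand-in for the paper's RNN embedding of the history; this is an illustrative sketch, not the paper's architecture:

```python
import numpy as np

# Toy conditioned affine flow: z ~ N(0, I), x = mu(c) + exp(s(c)) * z,
# where mu and s are functions of a conditioning vector c.
rng = np.random.default_rng(1)

def conditioner(c, W_mu, W_s):
    """Hypothetical linear conditioner: context -> (shift, log-scale)."""
    return W_mu @ c, W_s @ c

dim, ctx = 3, 4
W_mu = rng.normal(size=(dim, ctx)) * 0.1
W_s = rng.normal(size=(dim, ctx)) * 0.1
c = rng.normal(size=ctx)                # e.g. an encoding of past values

mu, log_s = conditioner(c, W_mu, W_s)

# Sampling: push base noise through the conditioned affine map.
z = rng.normal(size=dim)
x = mu + np.exp(log_s) * z

# Density: change of variables gives
# log p(x | c) = log N(z; 0, I) - sum(log_s).
log_pz = -0.5 * (z @ z + dim * np.log(2 * np.pi))
log_px = log_pz - log_s.sum()
```

Stacking many such conditioned transforms (with nonlinear conditioners) is what gives a normalizing flow its expressiveness; the affine case keeps the inverse and Jacobian trivially computable.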
N-BEATS: Neural basis expansion analysis for interpretable time series forecasting
TLDR
The proposed deep neural architecture based on backward and forward residual links and a very deep stack of fully-connected layers has a number of desirable properties, being interpretable, applicable without modification to a wide array of target domains, and fast to train.
High-Dimensional Multivariate Forecasting with Low-Rank Gaussian Copula Processes
TLDR
This work proposes to combine an RNN-based time series model with a Gaussian copula process output model that uses a low-rank covariance structure to reduce computational complexity and handle non-Gaussian marginal distributions.
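The low-rank trick in that entry is what makes high-dimensional Gaussian outputs tractable: with covariance Σ = diag(d) + V Vᵀ of rank r, sampling costs O(Nr) per draw rather than O(N²), since the full matrix never needs to be formed. A minimal sketch (dimensions and values are illustrative, not from the paper):

```python
import numpy as np

# Low-rank-plus-diagonal Gaussian: Sigma = diag(d) + V V^T.
# Samples are built as x = mu + V @ eps_r + sqrt(d) * eps_n, which never
# materializes the N x N covariance matrix.
rng = np.random.default_rng(2)
N, r = 1000, 5

mu = np.zeros(N)
d = rng.uniform(0.5, 1.5, size=N)      # per-dimension noise variances
V = rng.normal(size=(N, r)) * 0.3      # low-rank factor

def sample(n):
    eps_r = rng.normal(size=(n, r))    # rank-r latent noise
    eps_n = rng.normal(size=(n, N))    # independent per-dimension noise
    return mu + eps_r @ V.T + np.sqrt(d) * eps_n

xs = sample(20000)

# Sanity check: the empirical covariance approaches diag(d) + V V^T.
emp = np.cov(xs, rowvar=False)
target = np.diag(d) + V @ V.T
```

The same structure also makes the Gaussian log-likelihood cheap via the Woodbury identity, which is why it pairs well with copula-transformed (non-Gaussian) marginals.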
Temporal pattern attention for multivariate time series forecasting
TLDR
This paper proposes using a set of filters to extract time-invariant temporal patterns, similar to transforming time series data into its “frequency domain”, and proposes a novel attention mechanism to select relevant time series, and uses its frequency domain information for multivariate forecasting.
PyTorchTS, 2021a. URL https://github.com/zalandoresearch/pytorch-ts
Copula-based semiparametric models for multivariate time series
TLDR
The authors extend the copula-based univariate time series modeling approach of Chen & Fan to multivariate contexts, and discuss parameter estimation and goodness-of-fit testing for their model, with emphasis on meta-elliptical and Archimedean copulas.