Probabilistic time series forecasts with autoregressive transformation models

@article{Rgamer2021ProbabilisticTS,
  title={Probabilistic time series forecasts with autoregressive transformation models},
  author={David R{\"u}gamer and Philipp F. M. Baumann and Thomas Kneib and Torsten Hothorn},
  journal={Statistics and Computing},
  year={2023},
  volume={33}
}
Probabilistic forecasting of time series is important in many applications and research fields. To draw conclusions from a probabilistic forecast, we must ensure that the model class used to approximate the true forecasting distribution is expressive enough. Yet characteristics of the model itself, such as its uncertainty or its feature-outcome relationship, are no less important. This paper proposes Autoregressive Transformation Models (ATMs), a model class inspired…
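The transformation-model idea behind ATMs can be illustrated with a minimal sketch (not code from the paper; the monotone transformation h and the linear shift are hand-chosen for illustration): the conditional CDF is modeled as F(y|x) = Φ(h(y) − βx), and the density follows by the change-of-variables formula.

```python
import numpy as np
from scipy.stats import norm

# Hand-chosen strictly increasing transformation h, standing in for the
# flexible basis expansions used in transformation models.
def h(y):
    return y + 0.1 * y**3

def h_prime(y):
    return 1.0 + 0.3 * y**2

def conditional_cdf(y, x, beta=0.5):
    # F(y | x) = Phi(h(y) - beta * x): covariates shift on the latent scale.
    return norm.cdf(h(y) - beta * x)

def conditional_density(y, x, beta=0.5):
    # Change of variables: f(y | x) = phi(h(y) - beta * x) * h'(y).
    return norm.pdf(h(y) - beta * x) * h_prime(y)

# Because h is strictly increasing, the density integrates to one.
ys = np.linspace(-6.0, 6.0, 20001)
mass = np.sum(conditional_density(ys, x=1.0)) * (ys[1] - ys[0])
```

Any strictly increasing h yields a valid distribution, which is what makes the class expressive while keeping the covariate effect β interpretable on the latent scale.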

Estimating Conditional Distributions with Neural Networks using R package deeptrafo

The R package deeptrafo is presented for fitting flexible regression models for conditional distributions, using a TensorFlow backend and supporting numerous additive model components, such as neural networks, penalties, and smoothing splines, for deep conditional transformation models (DCTMs).

Distributional Gradient Boosting Machines

A probabilistic boosting framework for regression tasks that models and predicts the entire conditional distribution of a univariate response variable as a function of covariates and achieves state-of-the-art forecast accuracy.
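The entry above describes boosting the entire conditional distribution; a simpler, widely available relative of that idea is quantile gradient boosting, sketched here with scikit-learn (an assumption of this sketch, not the framework the entry refers to):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 4, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=500)

# One booster per quantile; together they trace out the conditional
# distribution of y given x rather than only its mean.
quantiles = [0.1, 0.5, 0.9]
models = {
    q: GradientBoostingRegressor(loss="quantile", alpha=q,
                                 n_estimators=100, random_state=0).fit(X, y)
    for q in quantiles
}

x_new = np.array([[2.0]])
preds = {q: models[q].predict(x_new)[0] for q in quantiles}
```

Predicting several quantiles gives calibrated interval forecasts without assuming a parametric family, at the cost of fitting one model per quantile.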

mixdistreg: An R Package for Fitting Mixture of Experts Distributional Regression with Adaptive First-order Methods

This paper presents a high-level description of the R software package mixdistreg to fit mixture of experts distributional regression models, which comprises various approaches as special cases, including mixture density networks and mixture regression approaches.
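For intuition, a finite mixture whose gating weights, component means, and scales all depend on a covariate can be written down directly (hand-chosen toy forms, purely illustrative, not the mixdistreg model):

```python
import numpy as np

def mixture_density(y, x):
    # Gate, means, and scales are all simple hand-chosen functions of x.
    logits = np.array([x, -x])
    w = np.exp(logits - logits.max())
    w = w / w.sum()                           # softmax gating weights
    means = np.array([-1.0 + 0.5 * x, 2.0])
    sds = np.array([0.5, 1.0 + 0.1 * x**2])
    comps = np.exp(-0.5 * ((y - means) / sds) ** 2) / (sds * np.sqrt(2 * np.pi))
    return float(w @ comps)                   # weighted sum of densities

# The mixture is a valid density for every covariate value.
ys = np.linspace(-15.0, 15.0, 6001)
mass = np.sum([mixture_density(y, x=1.0) for y in ys]) * (ys[1] - ys[0])
```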

deepregression: a Flexible Neural Network Framework for Semi-Structured Deep Distributional Regression

The implementation of semi-structured deep distributional regression is described, a flexible framework to learn conditional distributions based on the combination of additive regression models and deep networks, which allows for state-of-the-art predictive performance while simultaneously retaining the indispensable interpretability of classical statistical models.
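The combination the entry describes, an interpretable additive part plus a deep part feeding one shared predictor, can be sketched in a few lines (the shapes and toy network here are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def nn(X, W1, W2):
    # One hidden layer, standing in for the deep part of the predictor.
    return np.tanh(X @ W1) @ W2

X = rng.normal(size=(8, 3))           # 8 observations, 3 features
beta = np.array([1.0, -2.0, 0.5])     # interpretable structured coefficients
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4,))

# Semi-structured predictor: eta(x) = x @ beta + nn(x).
eta = X @ beta + nn(X, W1, W2)
```

The structured coefficients stay directly readable while the network absorbs whatever the linear part cannot explain; the framework's contribution is making both parts trainable jointly.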

Why Did This Model Forecast This Future? Closed-Form Temporal Saliency Towards Causal Explanations of Probabilistic Forecasts

This work proposes to express the saliency of an observed window in terms of the differential entropy of the resulting predicted future distribution, and obtains a closed-form solution for the saliency map for commonly used density functions in probabilistic forecasting.
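For a Gaussian predictive distribution the differential entropy used as such a saliency score has the closed form H = ½·log(2πeσ²) (the Gaussian is just one of the density families covered; this sketch is for intuition only):

```python
import numpy as np

def gaussian_entropy(sigma):
    """Differential entropy of N(mu, sigma^2); it does not depend on mu."""
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)

# A sharper (lower-variance) predicted future has lower entropy, so an input
# window whose perturbation widens the forecast distribution is more salient.
h_sharp = gaussian_entropy(0.5)
h_wide = gaussian_entropy(2.0)
```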

References (showing 1-10 of 67)

Normalizing Flows for Probabilistic Modeling and Inference

This review places special emphasis on the fundamental principles of flow design, and discusses foundational topics such as expressive power and computational trade-offs, and summarizes the use of flows for tasks such as generative modeling, approximate inference, and supervised learning.
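The change-of-variables mechanics at the heart of normalizing flows fit in a one-dimensional sketch (the sinh pushforward is a hand-picked invertible map, not a flow architecture from the review):

```python
import numpy as np
from scipy.stats import norm

# Push a standard normal base variable z through g(z) = sinh(z).
# The inverse is f(x) = arcsinh(x), and the change-of-variables formula gives
#   p_x(x) = p_z(f(x)) * |f'(x)|,   with f'(x) = 1 / sqrt(1 + x^2).
def flow_density(x):
    return norm.pdf(np.arcsinh(x)) / np.sqrt(1.0 + x**2)

# Sampling uses the forward map; density evaluation uses the inverse.
samples = np.sinh(np.random.default_rng(0).standard_normal(1000))

xs = np.linspace(-50.0, 50.0, 200001)
mass = np.sum(flow_density(xs)) * (xs[1] - xs[0])
```

The pushforward has heavier-than-Gaussian tails, which is exactly the expressive-power point: the flow's density family is determined by the transform, not by the base distribution.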

Conditional transformation models

The ultimate goal of regression analysis is to obtain information about the conditional distribution of a response given a set of explanatory variables. This goal is, however, seldom achieved because …

A general asymptotic theory for time‐series models

This paper develops a general asymptotic theory for the estimation of strictly stationary and ergodic time-series models. Under simple conditions that are straightforward to check, we establish the …

Deep interpretable ensembles

This work proposes a novel transformation ensemble that aggregates probabilistic predictions with the guarantee to preserve interpretability and to be, on average, uniformly better than its ensemble members; it also demonstrates how transformation ensembles quantify both aleatoric and epistemic uncertainty and produce minimax optimal predictions under certain conditions.

Implicit Copulas: An Overview

GluonTS: Probabilistic and Neural Time Series Modeling in Python

We introduce the Gluon Time Series Toolkit (GluonTS), a Python library for deep learning based time series modeling for ubiquitous tasks, such as forecasting and anomaly detection. GluonTS simplifies …

Deep Conditional Transformation Models

The class of deep conditional transformation models is introduced, which unifies existing approaches and allows learning both interpretable (non-)linear model terms and more complex predictors in one holistic neural network.

Neural Mixture Distributional Regression

This work presents neural mixture distributional regression (NMDR), a holistic framework to estimate complex finite mixtures of distributional regressions defined by flexible additive predictors, making use of optimizers well established in deep learning.