Corpus ID: 233033415

deepregression: a Flexible Neural Network Framework for Semi-Structured Deep Distributional Regression

@article{Rgamer2021deepregressionAF,
  title={deepregression: a Flexible Neural Network Framework for Semi-Structured Deep Distributional Regression},
  author={D. R{\"u}gamer and Ruolin Shen and Christina Bukas and Lisa Barros de Andrade e Sousa and Dominik Thalmeier and Nadja Klein and Chris Kolb and Florian Pfisterer and Philipp Kopper and B. Bischl and Christian L. M{\"u}ller},
  journal={ArXiv},
  year={2021},
  volume={abs/2104.02705}
}
In this paper we describe the implementation of semi-structured deep distributional regression, a flexible framework to learn conditional distributions based on the combination of additive regression models and deep networks. Our implementation encompasses (1) a modular neural network building system based on the deep learning library TensorFlow for the fusion of various statistical and deep learning approaches, (2) an orthogonalization cell to allow for an interpretable combination of… 
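
As a rough illustration of the formula interface described above, the following R sketch shows how a distributional regression model combining structured terms and a deep sub-network might be specified. Argument names (list_of_formulas, list_of_deep_models, family) follow the paper's description of the API, but details may differ across package versions; the toy data, network architecture, and hyperparameters are purely illustrative.

```r
library(deepregression)
library(keras)

# illustrative deep sub-network, defined as a keras-style function
deep_model <- function(x) {
  x %>%
    layer_dense(units = 32, activation = "relu") %>%
    layer_dense(units = 1)
}

# toy data (purely illustrative)
set.seed(1)
n <- 500
data <- data.frame(x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n))
y <- rnorm(n)

# mean (loc): intercept + smooth term in x1 + deep term in x2, x3;
# scale: modeled as a constant
mod <- deepregression(
  y = y,
  data = data,
  list_of_formulas = list(loc = ~ 1 + s(x1) + deep(x2, x3), scale = ~ 1),
  list_of_deep_models = list(deep = deep_model),
  family = "normal"
)

mod %>% fit(epochs = 100, validation_split = 0.2)
```

The orthogonalization cell in (2) can be understood as projecting the deep network's latent outputs onto the orthogonal complement of the column space of the structured design matrix, so that the interpretable structured terms retain their meaning. A minimal sketch of that projection in plain R (not the package's internal implementation):

```r
# project deep outputs U onto the orthogonal complement of span(X)
orthogonalize <- function(U, X) {
  P <- X %*% solve(crossprod(X), t(X))  # hat matrix of the structured design X
  (diag(nrow(X)) - P) %*% U             # (I - P) U removes the part explainable by X
}
```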

Citations

Additive Higher-Order Factorization Machines
It is proved both theoretically and empirically that the derived higher-order tensor product spline model scales notably better than existing approaches; meaningful penalization schemes are derived and further theoretical aspects are discussed.
DeepPAMM: Deep Piecewise Exponential Additive Mixed Models for Complex Hazard Structures in Survival Analysis
DeepPAMM, a versatile deep learning framework that is well-founded from a statistical point of view yet flexible enough to model complex hazard structures, is proposed and shown to be competitive with other machine learning approaches in predictive performance while maintaining interpretability.
M5 Competition Uncertainty: Overdispersion, distributional forecasting, GAMLSS and beyond
F. Ziel, International Journal of Forecasting, 2021
Transforming Autoregression: Interpretable and Expressive Time Series Forecast
Autoregressive Transformation Models (ATMs) are proposed, a model class inspired by research directions such as normalizing flows and autoregressive models, which unites expressive distributional forecasts under a semi-parametric distribution assumption with an interpretable model specification and allows for uncertainty quantification based on (asymptotic) maximum likelihood theory.

References

Showing 1-10 of 47 references
A Unified Network Architecture for Semi-Structured Deep Distributional Regression
We propose a unified network architecture for deep distributional regression in which entire distributions can be learned in a general framework of interpretable regression models and deep neural networks.
Deep Conditional Transformation Models
The class of deep conditional transformation models is introduced, which unifies existing approaches and allows learning both interpretable (non-)linear model terms and more complex predictors in one holistic neural network.
Neural Mixture Distributional Regression
This work presents neural mixture distributional regression (NMDR), a holistic framework for estimating complex finite mixtures of distributional regressions defined by flexible additive predictors, which makes use of optimizers well established in deep learning.
Semi-Structured Deep Piecewise Exponential Models
A versatile framework for survival analysis is proposed that combines advanced concepts from statistics with deep learning, enabling the simultaneous estimation of inherently interpretable structured regression inputs as well as deep neural network components that can process additional unstructured data sources.
Laplace Redux - Effortless Bayesian Deep Learning
This work reviews the range of variants of the Laplace approximation (LA), introduces an easy-to-use software library for PyTorch offering user-friendly access to all major versions of the LA, and demonstrates that the LA is competitive with more popular alternatives in terms of performance while excelling in terms of computational cost.
LassoNet: Neural Networks with Feature Sparsity
This work introduces LassoNet, a neural network framework with global feature selection that uses a modified objective function with constraints, thereby integrating feature selection directly with parameter learning, and delivers an entire regularization path of solutions spanning a range of feature sparsity.
Bayesian Deep Learning and a Probabilistic Perspective of Generalization
It is shown that deep ensembles provide an effective mechanism for approximate Bayesian marginalization, and a related approach is proposed that further improves the predictive distribution by marginalizing within basins of attraction, without significant overhead.
Equivalences Between Sparse Models and Neural Networks
We present some observations about neural networks that are, on the one hand, the result of fairly trivial algebraic manipulations, and on the other hand, potentially noteworthy and deserving of attention.
A Simple Baseline for Bayesian Uncertainty in Deep Learning
It is demonstrated that SWAG performs well on a wide variety of tasks, including out-of-sample detection, calibration, and transfer learning, in comparison to many popular alternatives including MC dropout, KFAC Laplace, SGLD, and temperature scaling.
Bayesian Layers: A Module for Neural Network Uncertainty
We describe Bayesian Layers, a module designed for fast experimentation with neural network uncertainty. It extends neural network libraries with drop-in replacements for common layers. This enables…