Probabilistic Forecasting with Generative Networks via Scoring Rule Minimization
@inproceedings{Pacchiardi2021ProbabilisticFW,
  title={Probabilistic Forecasting with Generative Networks via Scoring Rule Minimization},
  author={Lorenzo Pacchiardi and Rilwan Adewoyin and Peter Dominik Dueben and Ritabrata Dutta},
  year={2021}
}
Generative networks are often trained to minimize a statistical divergence between the reference distribution and the generative one in an adversarial setting. Some works instead trained generative networks to minimize Scoring Rules, functions that assess how well the generative distribution matches each training sample individually. We show how the Scoring Rule formulation easily extends to the so-called prequential (predictive-sequential) score, whose minimization allows performing…
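To make the scoring-rule training idea above concrete, here is a minimal sketch (not the authors' implementation; the architecture, dimensions, and names below are illustrative assumptions) of training a conditional generative network by minimizing the empirical energy score, a commonly used strictly proper scoring rule:

```python
# Minimal sketch (assumed architecture, not the paper's code): train a
# conditional generative network by minimizing the empirical energy score.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a context x and latent noise z to one forecast sample."""
    def __init__(self, x_dim, z_dim, y_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, y_dim),
        )

    def forward(self, x, z):
        return self.net(torch.cat([x, z], dim=-1))

def energy_score(samples, y):
    """Empirical energy score ES(P, y) ~ E||X - y|| - 0.5 * E||X - X'||.
    samples: (m, batch, y_dim) forecast draws, y: (batch, y_dim) observations."""
    m = samples.shape[0]
    term1 = (samples - y.unsqueeze(0)).norm(dim=-1).mean(0)     # E||X - y||
    diffs = samples.unsqueeze(0) - samples.unsqueeze(1)         # all pairs of draws
    term2 = diffs.norm(dim=-1).sum((0, 1)) / (m * (m - 1))      # unbiased E||X - X'||
    return (term1 - 0.5 * term2).mean()

# Toy usage on random data: m latent draws give m forecast samples per input.
x_dim, z_dim, y_dim, m = 4, 8, 2, 10
gen = Generator(x_dim, z_dim, y_dim)
opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
x = torch.randn(32, x_dim)                 # conditioning inputs
y = torch.randn(32, y_dim)                 # observed outcomes
z = torch.randn(m, 32, z_dim)              # m latent draws per input
samples = gen(x.expand(m, -1, -1), z)      # (m, batch, y_dim) forecast samples
opt.zero_grad()
loss = energy_score(samples, y)
loss.backward()
opt.step()
```

Unlike adversarial training, no discriminator is involved: the loss compares several forecast draws to each observation directly, which is the sense in which the score assesses the generative distribution against each training sample individually.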
References
Showing 1-10 of 56 references
Generative Adversarial Nets
- Computer Science, NIPS
- 2014
We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a…
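For context, the two-player game from this paper can be written as the minimax objective (with data distribution $p_{\text{data}}$ and noise prior $p_z$):

$$\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))].$$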
Generative Moment Matching Networks
- Computer Science, ICML
- 2015
This work formulates a method that generates an independent sample via a single feedforward pass through a multilayer perceptron, as in the recently proposed generative adversarial networks, using MMD to learn to generate codes that can then be decoded to produce samples.
Machine Learning for Stochastic Parameterization: Generative Adversarial Networks in the Lorenz '96 Model
- Environmental Science, Computer Science, Journal of Advances in Modeling Earth Systems
- 2020
This study develops a stochastic parameterization using the generative adversarial network (GAN) machine learning framework and finds that, in general, those models which produce skillful forecasts are also associated with the best climate simulations.
Training generative neural networks via Maximum Mean Discrepancy optimization
- Computer Science, Mathematics, UAI
- 2015
This work considers training a deep neural network to generate samples from an unknown distribution given i.i.d. data, frames learning as an optimization problem minimizing a two-sample test statistic, and proves bounds on the generalization error incurred by optimizing the empirical MMD.
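A minimal sketch of the two-sample-statistic idea (illustrative only; the Gaussian kernel, bandwidth, and batch sizes are assumptions, not this paper's setup): the empirical squared MMD between real and generated batches, used directly as a differentiable training loss.

```python
# Minimal sketch (not the paper's code): biased empirical squared MMD with a
# Gaussian kernel between a data batch and a generated batch.
import torch

def gaussian_kernel(a, b, bandwidth=1.0):
    d2 = torch.cdist(a, b) ** 2                      # pairwise squared distances
    return torch.exp(-d2 / (2 * bandwidth ** 2))

def mmd2(real, fake, bandwidth=1.0):
    k_rr = gaussian_kernel(real, real, bandwidth).mean()
    k_ff = gaussian_kernel(fake, fake, bandwidth).mean()
    k_rf = gaussian_kernel(real, fake, bandwidth).mean()
    return k_rr + k_ff - 2 * k_rf                    # biased V-statistic estimate

real = torch.randn(64, 2)                            # "data" batch
fake = torch.randn(64, 2, requires_grad=True)        # stand-in for generator output
loss = mmd2(real, fake)
loss.backward()                                      # gradients would flow into the generator
```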
f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization
- Computer Science, NIPS
- 2016
It is shown that any f-divergence can be used for training generative neural samplers and the benefits of various choices of divergence functions on training complexity and the quality of the obtained generative models are discussed.
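The variational bound underlying this approach expresses any f-divergence through the convex conjugate $f^*$ of its generator function $f$:

$$D_f(P \,\|\, Q) \;\ge\; \sup_{T} \; \mathbb{E}_{x \sim P}[T(x)] - \mathbb{E}_{x \sim Q}[f^*(T(x))],$$

so the variational function $T$ (the "discriminator") maximizes this bound while the generator minimizes it.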
If You Like It, GAN It. Probabilistic Multivariate Times Series Forecast With GAN
- Computer Science, Engineering Proceedings
- 2021
ProbCast, a novel probabilistic model for multivariate time-series forecasting, is presented, together with a proposed framework for transforming a deterministic model into a probabilistic one with improved performance.
Improved Training of Wasserstein GANs
- Computer Science, NIPS
- 2017
This work proposes an alternative to clipping weights: penalize the norm of the gradient of the critic with respect to its input, which performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
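A minimal sketch of that gradient-penalty term (illustrative, for flat feature vectors; the toy critic is an assumption, while the penalty weight 10 follows the commonly used default):

```python
# Minimal sketch (not the paper's code): WGAN-GP gradient penalty, which
# penalizes the critic's gradient norm at points interpolated between
# real and generated samples.
import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    eps = torch.rand(real.size(0), 1, device=real.device)        # per-sample mixing weight
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)  # random interpolates
    grad = torch.autograd.grad(critic(x_hat).sum(), x_hat, create_graph=True)[0]
    return lam * ((grad.norm(2, dim=-1) - 1) ** 2).mean()         # (||grad D(x_hat)|| - 1)^2

# Toy usage with a stand-in critic and random "real"/"fake" batches.
critic = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.ReLU(), torch.nn.Linear(16, 1))
real, fake = torch.randn(32, 2), torch.randn(32, 2)
gp = gradient_penalty(critic, real, fake)   # added to the critic loss during training
```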
A generative adversarial network approach to (ensemble) weather prediction
- Environmental Science, Neural Networks
- 2021
BayesFlow: Learning Complex Stochastic Models With Invertible Neural Networks
- Computer Science, IEEE Transactions on Neural Networks and Learning Systems
- 2022
It is argued that BayesFlow provides a general framework for building amortized Bayesian parameter estimation machines for any forward model from which data can be simulated and is applicable to modeling scenarios where standard inference techniques with handcrafted summary statistics fail.
On GANs and GMMs
- Computer Science, NeurIPS
- 2018
This paper presents a simple method to evaluate generative models based on the relative proportions of samples that fall into predetermined bins, and shows that GMMs not only generate realistic samples but also capture the full distribution, which GANs fail to do.
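A rough sketch of the bin-proportion idea (the clustering-based binning, cluster count, and function names here are assumptions for illustration, not the paper's exact procedure): define bins from the real data, then compare how often real versus generated samples land in each bin.

```python
# Rough sketch (assumed procedure): bins defined by k-means on real data;
# large gaps between bin proportions suggest mode dropping or mismatch.
import numpy as np
from sklearn.cluster import KMeans

def bin_proportions(real, generated, n_bins=20, seed=0):
    km = KMeans(n_clusters=n_bins, random_state=seed, n_init=10).fit(real)
    p_real = np.bincount(km.predict(real), minlength=n_bins) / len(real)
    p_gen = np.bincount(km.predict(generated), minlength=n_bins) / len(generated)
    return p_real, p_gen

real = np.random.randn(1000, 2)
gen = np.random.randn(1000, 2) * 0.5            # under-dispersed "generator"
p_real, p_gen = bin_proportions(real, gen)
print(np.abs(p_real - p_gen).sum())             # total gap in bin proportions
```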