Corpus ID: 326772

GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium

@inproceedings{Heusel2017GANsTB,
  title={GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium},
  author={Martin Heusel and Hubert Ramsauer and Thomas Unterthiner and Bernhard Nessler and Sepp Hochreiter},
  booktitle={NIPS},
  year={2017}
}
Generative Adversarial Networks (GANs) excel at creating realistic images with complex models for which maximum likelihood is infeasible. [...] Using the theory of stochastic approximation, we prove that the TTUR converges under mild assumptions to a stationary local Nash equilibrium. The convergence carries over to the popular Adam optimization, for which we prove that it follows the dynamics of a heavy ball with friction and thus prefers flat minima in the objective landscape.
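The recipe the abstract describes lends itself to a short illustration: the two time-scale update rule (TTUR) gives the discriminator and the generator separate Adam optimizers with different learning rates, so the discriminator can run on a faster time scale than the generator. The sketch below is a minimal PyTorch rendering of that idea, assuming toy linear models and illustrative learning rates (4e-4 / 1e-4) that are not taken from the paper; it is not the authors' reference implementation.

```python
# Minimal TTUR sketch in PyTorch (assumed framework, not the authors' code):
# two Adam optimizers with distinct learning rates, i.e. two time scales.
import torch
from torch import nn, optim

# Hypothetical toy models; any generator/discriminator architecture would do.
G = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 2))
D = nn.Sequential(nn.Linear(2, 128), nn.ReLU(), nn.Linear(128, 1))

# Two time scales: the discriminator gets the larger learning rate here.
# These values are illustrative, not the paper's reported settings.
opt_D = optim.Adam(D.parameters(), lr=4e-4, betas=(0.5, 0.999))
opt_G = optim.Adam(G.parameters(), lr=1e-4, betas=(0.5, 0.999))

bce = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    batch_size = real_batch.size(0)
    z = torch.randn(batch_size, 64)

    # Discriminator update (fast time scale).
    opt_D.zero_grad()
    fake = G(z).detach()
    loss_D = (bce(D(real_batch), torch.ones(batch_size, 1))
              + bce(D(fake), torch.zeros(batch_size, 1)))
    loss_D.backward()
    opt_D.step()

    # Generator update (slow time scale).
    opt_G.zero_grad()
    loss_G = bce(D(G(z)), torch.ones(batch_size, 1))
    loss_G.backward()
    opt_G.step()
    return loss_D.item(), loss_G.item()
```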
Citations

On the Convergence and Robustness of Training GANs with Regularized Optimal Transport
Coulomb GANs: Provably Optimal Nash Equilibria via Potential Fields
An Online Learning Approach to Generative Adversarial Networks
Solving Approximate Wasserstein GANs to Stationarity
Variational Bayesian GAN
BEGAN v3: Avoiding Mode Collapse in GANs Using Variational Inference
First Order Generative Adversarial Networks
Online Adaptative Curriculum Learning for GANs
Towards a Better Understanding and Regularization of GAN Training Dynamics
...

References

Showing 1-10 of 72 references
Gradient descent GAN optimization is locally stable
An Online Learning Approach to Generative Adversarial Networks
Improved Training of Wasserstein GANs
Fisher GAN
Towards Understanding the Dynamics of Generative Adversarial Networks
Boundary-Seeking Generative Adversarial Networks
Approximation and Convergence Properties of Generative Adversarial Learning
Generative Adversarial Nets
MMD GAN: Towards Deeper Understanding of Moment Matching Network
AdaGAN: Boosting Generative Models