Corpus ID: 60440543

(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

@article{Mallasto2019qPG,
  title={(q, p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs},
  author={Anton Mallasto and J. Frellsen and W. Boomsma and A. Feragen},
  journal={ArXiv},
  year={2019},
  volume={abs/1902.03642}
}
  • Anton Mallasto, J. Frellsen, W. Boomsma, A. Feragen
  • Published 2019
  • Computer Science, Mathematics
  • ArXiv
  • Generative Adversarial Networks (GANs) have made a major impact in computer vision and machine learning as generative models. Wasserstein GANs (WGANs) brought Optimal Transport (OT) theory into GANs by minimizing the $1$-Wasserstein distance between model and data distributions as their objective function. Since then, WGANs have gained considerable interest due to their stability and theoretical framework. We contribute to the WGAN literature by introducing the family of $(q,p)$-Wasserstein GANs…
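
As a minimal notation sketch for the truncated abstract (standard optimal-transport definitions, not text quoted from the paper): the $(q,p)$-Wasserstein distance is the $p$-Wasserstein distance computed with the $\ell_q$ norm as ground metric,
$$
W_p^{(q)}(\mu, \nu) = \Big( \inf_{\gamma \in \Gamma(\mu, \nu)} \int \lVert x - y \rVert_q^{\,p} \, d\gamma(x, y) \Big)^{1/p},
$$
where $\Gamma(\mu, \nu)$ denotes the set of couplings of $\mu$ and $\nu$. The WGAN objective mentioned in the abstract is the $p = 1$ case, estimated in practice through its Kantorovich-Rubinstein dual
$$
W_1(\mu, \nu) = \sup_{\lVert f \rVert_{\mathrm{Lip}} \le 1} \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{y \sim \nu}[f(y)],
$$
with the critic approximating a potential $f$ that is $1$-Lipschitz with respect to the ground metric.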
    3 Citations
    • How Well Do WGANs Estimate the Wasserstein Metric? (5 citations)
    • Entropy-Regularized 2-Wasserstein Distance between Gaussian Measures (6 citations)
    • Learning normalizing flows from Entropy-Kantorovich potentials (2 citations)

    References

    Showing 1-10 of 27 references
    • Wasserstein Divergence for GANs (49 citations)
    • Generative Modeling Using the Sliced Wasserstein Distance (76 citations)
    • Improved Training of Wasserstein GANs (3,935 citations)
    • GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium (2,361 citations; highly influential)
    • Wasserstein Auto-Encoders (402 citations)
    • Improved Techniques for Training GANs (4,055 citations)
    • Sinkhorn-AutoDiff: Tractable Wasserstein Learning of Generative Models (48 citations)
    • Generative Adversarial Nets (20,850 citations)
    • Learning Generative Models with Sinkhorn Divergences (233 citations)
    • Spectral Normalization for Generative Adversarial Networks (1,757 citations)