• Computer Science
  • Published in NAACL-HLT 2019

Evaluating Text GANs as Language Models

@inproceedings{Tevet2018EvaluatingTG,
  title={Evaluating Text GANs as Language Models},
  author={Guy Tevet and Gavriel Habib and Vered Shwartz and Jonathan Berant},
  booktitle={NAACL-HLT},
  year={2019}
}
Generative Adversarial Networks (GANs) are a promising approach for text generation that, unlike traditional language models (LMs), does not suffer from the problem of "exposure bias". [...] We apply our approximation procedure to several GAN-based models and show that they currently perform substantially worse than state-of-the-art LMs. Our evaluation procedure promotes better understanding of the relation between GANs and LMs, and can accelerate progress in GAN-based text generation.
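To make the evaluation idea concrete, below is a minimal sketch of the kind of procedure the abstract describes: approximate the generator's next-token distribution by repeated sampling, then score held-out text with a standard LM metric (perplexity). This is not the paper's implementation; the generator interface (sample_next_token), the toy vocabulary, and the smoothing choice are hypothetical stand-ins for any text GAN that can be sampled token by token but does not expose explicit probabilities.

# Sketch: approximate a GAN generator's token-level distribution by sampling,
# then evaluate it with perplexity on held-out text.
import math
import random
from collections import Counter

VOCAB = ["the", "cat", "sat", "on", "mat", "<eos>"]

def sample_next_token(prefix):
    """Hypothetical stand-in for a text GAN generator: samples one token
    given a prefix. A real GAN generator would be sampled here instead."""
    if prefix and prefix[-1] == "the":
        return random.choice(["cat", "mat", "mat"])
    return random.choice(VOCAB)

def approx_next_token_dist(prefix, num_samples=1000, alpha=0.1):
    """Monte Carlo approximation of P(w | prefix), with add-alpha smoothing
    so tokens never sampled still receive non-zero probability."""
    counts = Counter(sample_next_token(prefix) for _ in range(num_samples))
    total = num_samples + alpha * len(VOCAB)
    return {w: (counts[w] + alpha) / total for w in VOCAB}

def perplexity(sentences, num_samples=1000):
    """Perplexity of held-out sentences under the approximated distribution."""
    log_prob, num_tokens = 0.0, 0
    for sent in sentences:
        for t, token in enumerate(sent):
            dist = approx_next_token_dist(sent[:t], num_samples)
            log_prob += math.log(dist[token])
            num_tokens += 1
    return math.exp(-log_prob / num_tokens)

if __name__ == "__main__":
    held_out = [["the", "cat", "sat", "<eos>"], ["the", "mat", "<eos>"]]
    print(f"approximate perplexity: {perplexity(held_out):.2f}")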

Citations

Publications citing this paper (10 in total):

ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
  • Cites background and methods

Quantifying Exposure Bias
  • 2019
  • Cites methods

The Curious Case of Neural Text Degeneration
  • Cites background

Training Language GANs from Scratch
  • Cites background

A Tutorial on Deep Latent Variable Models of Natural Language
  • Cites background

References

Publications referenced by this paper (32 in total):

SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient
  • Highly influential

Adversarial Generation of Natural Language
  • Highly influential

GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium
  • Highly influential

Improved Training of Wasserstein GANs
  • Highly influential

Long Text Generation via Adversarial Training with Leaked Information
  • Highly influential

Language GANs Falling Short
