Collaborative Training of GANs in Continuous and Discrete Spaces for Text Generation

@article{Kim2020CollaborativeTO,
  title={Collaborative Training of GANs in Continuous and Discrete Spaces for Text Generation},
  author={Yanghoon Kim and Seungpil Won and Seunghyun Yoon and K. Jung},
  journal={IEEE Access},
  year={2020},
  volume={8},
  pages={226515--226523}
}
Applying generative adversarial networks (GANs) to text-related tasks is challenging due to the discrete nature of language. One line of research resolves this issue by employing reinforcement learning (RL) and optimizing the next-word sampling policy directly in a discrete action space. Such methods compute rewards from complete sentences and avoid the error accumulation caused by exposure bias. Other approaches employ approximation techniques that map the text to continuous representation in…
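
The RL-based line of work mentioned in the abstract can be illustrated with a minimal policy-gradient (REINFORCE) loop: the generator samples a complete sentence, a discriminator scores the whole sequence, and that score is used as the reward for every sampled token, so no gradient needs to flow through the discrete sampling step. The sketch below is an illustrative assumption rather than the paper's model; the toy vocabulary, network sizes, and module names are hypothetical.

# Hedged sketch of an RL-style text GAN update (not the paper's architecture).
import torch
import torch.nn as nn

VOCAB, EMB, HID, MAX_LEN, BOS = 50, 32, 64, 10, 0  # toy hyperparameters

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def sample(self, batch):
        """Sample sentences token by token; return tokens and their log-probs."""
        tok = torch.full((batch, 1), BOS, dtype=torch.long)
        h, toks, logps = None, [], []
        for _ in range(MAX_LEN):
            o, h = self.rnn(self.emb(tok), h)
            dist = torch.distributions.Categorical(logits=self.out(o[:, -1]))
            tok = dist.sample().unsqueeze(1)
            toks.append(tok)
            logps.append(dist.log_prob(tok.squeeze(1)))
        return torch.cat(toks, 1), torch.stack(logps, 1)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, 1)

    def forward(self, sent):
        _, h = self.rnn(self.emb(sent))
        return torch.sigmoid(self.out(h[-1])).squeeze(1)  # sentence-level reward in (0, 1)

gen, disc = Generator(), Discriminator()
opt = torch.optim.Adam(gen.parameters(), lr=1e-3)

# One REINFORCE step: the reward is computed from the complete sampled sentence,
# and it weights the log-probability of every token in that sentence.
sent, logps = gen.sample(batch=16)
reward = disc(sent).detach()              # (batch,)
loss = -(logps.sum(1) * reward).mean()    # policy-gradient surrogate loss
opt.zero_grad(); loss.backward(); opt.step()

In practice such training is usually interleaved with discriminator updates on real versus sampled sentences and often warm-started with maximum-likelihood pretraining of the generator; those steps are omitted here to keep the sketch focused on how the sentence-level reward replaces a differentiable path through discrete sampling.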
