Corpus ID: 76665205

Consistent Dialogue Generation with Self-supervised Feature Learning

@article{Zhang2019ConsistentDG,
  title={Consistent Dialogue Generation with Self-supervised Feature Learning},
  author={Yizhe Zhang and Xiang Gao and Sungjin Lee and Chris Brockett and Michel Galley and Jianfeng Gao and W. Dolan},
  journal={ArXiv},
  year={2019},
  volume={abs/1903.05759}
}
Generating responses that are consistent with the dialogue context is one of the central challenges in building engaging conversational agents. [...] Unlike past work that requires external supervision such as user identities, which are often unavailable or classified as sensitive information, our approach trains topic and persona feature extractors in a self-supervised way by utilizing the natural structure of dialogue data.
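The abstract only names the self-supervised idea at a high level. The sketch below is a minimal illustration of one way such training can be set up, not the authors' implementation: an utterance encoder is trained to predict whether two utterances come from the same conversation, using only the natural structure of the dialogue data instead of user identities or other external labels. All names (`UtteranceEncoder`, `SameDialoguePredictor`, `make_pair`), the architecture, and the hyperparameters are assumptions made for the example.

```python
# Minimal sketch (assumed, not the paper's released code): learn an utterance feature
# extractor self-supervised by classifying whether two utterances share a dialogue.
import random
import torch
import torch.nn as nn

class UtteranceEncoder(nn.Module):
    """Embeds a pre-tokenized utterance into a fixed-size feature vector."""
    def __init__(self, vocab_size: int, dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> (batch, dim); use the final hidden state as the feature.
        _, h = self.gru(self.embed(token_ids))
        return h.squeeze(0)

class SameDialoguePredictor(nn.Module):
    """Scores whether two utterance features come from the same conversation."""
    def __init__(self, dim: int = 128):
        super().__init__()
        self.encoder = UtteranceEncoder(vocab_size=10_000, dim=dim)
        self.classifier = nn.Linear(2 * dim, 1)

    def forward(self, utt_a: torch.Tensor, utt_b: torch.Tensor) -> torch.Tensor:
        feat_a, feat_b = self.encoder(utt_a), self.encoder(utt_b)
        return self.classifier(torch.cat([feat_a, feat_b], dim=-1)).squeeze(-1)

def make_pair(dialogues):
    """Build one (utt_a, utt_b, label) training pair; label 1.0 if both come from the same dialogue."""
    if random.random() < 0.5:
        d = random.choice(dialogues)
        a, b = random.sample(d, 2)
        label = 1.0
    else:
        d1, d2 = random.sample(dialogues, 2)
        a, b = random.choice(d1), random.choice(d2)
        label = 0.0
    return a, b, label

if __name__ == "__main__":
    # Toy "dialogues": each is a list of utterances, each utterance a tensor of token ids.
    dialogues = [[torch.randint(0, 10_000, (12,)) for _ in range(4)] for _ in range(20)]
    model = SameDialoguePredictor()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()
    for step in range(100):
        batch = [make_pair(dialogues) for _ in range(16)]
        utt_a = torch.stack([a for a, _, _ in batch])
        utt_b = torch.stack([b for _, b, _ in batch])
        labels = torch.tensor([y for _, _, y in batch])
        loss = loss_fn(model(utt_a, utt_b), labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # After training, model.encoder can be reused as a topic/persona-style feature extractor.
```

The same recipe could be varied, for example pairing utterances by speaker turn rather than by conversation, which is closer in spirit to a persona signal; the key point the abstract makes is that the supervision comes from the dialogue structure itself.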
