Towards Unsupervised Language Understanding and Generation by Joint Dual Learning

@article{Su2020TowardsUL,
  title={Towards Unsupervised Language Understanding and Generation by Joint Dual Learning},
  author={Shang-Yu Su and Chao-Wei Huang and Y. Chen},
  journal={arXiv preprint arXiv:2004.14710},
  year={2020}
}
In modular dialogue systems, natural language understanding (NLU) and natural language generation (NLG) are two critical components: NLU extracts the semantics from given texts, while NLG constructs natural language sentences from input semantic representations. However, the dual property between understanding and generation has rarely been explored. Prior work was the first attempt to utilize the duality between NLU and NLG to improve performance via…
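The duality described in the abstract can be pictured as two inverse mappings that close a cycle: NLU maps a sentence to a semantic frame, and NLG maps the frame back to a sentence. The sketch below is purely illustrative (not the authors' models): it uses toy rule-based `nlu` and `nlg` functions over a hypothetical restaurant-domain frame to show the reconstruction cycle that, in joint dual learning, supplies a training signal without labeled pairs.

```python
def nlu(text):
    """Toy NLU: map a sentence to a semantic frame (intent + slots)."""
    frame = {"intent": "inform", "slots": {}}
    words = text.lower().rstrip(".").split()
    if "italian" in words:
        frame["slots"]["food"] = "italian"
    if "cheap" in words:
        frame["slots"]["price"] = "cheap"
    return frame

def nlg(frame):
    """Toy NLG: map a semantic frame back to a sentence via a template."""
    parts = []
    if frame["slots"].get("price"):
        parts.append(frame["slots"]["price"])
    if frame["slots"].get("food"):
        parts.append(frame["slots"]["food"])
    return "looking for a " + " ".join(parts) + " restaurant ."

def dual_cycle(text):
    """Primal pass (NLU) followed by dual pass (NLG).

    In the joint dual learning setup, the agreement between the input and
    this reconstruction is what trains both components jointly.
    """
    return nlg(nlu(text))

print(dual_cycle("looking for a cheap italian restaurant ."))
# → looking for a cheap italian restaurant .
```

Here the cycle reconstructs the input exactly; with learned neural NLU/NLG models, the reconstruction error would instead serve as an unsupervised training objective.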
8 Citations
Improving Limited Labeled Dialogue State Tracking with Self-Supervision
Neural Data-to-Text Generation with LM-based Text Augmentation
LearnDA: Learnable Knowledge-Guided Data Augmentation for Event Causality Identification
Dual Inference for Improving Language Understanding and Generation
Dual Learning
Dual Learning for Machine Translation and Beyond
Dual Learning for Semi-Supervised Natural Language Understanding
