Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection
@article{Hardalov2020EnrichedPT,
  title   = {Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection},
  author  = {Momchil Hardalov and Ivan Koychev and Preslav Nakov},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2004.14848}
}
Detecting the user's intent and finding the corresponding slots among the utterance's words are important tasks in natural language understanding. Their interconnected nature makes joint modeling a standard part of training such models. Moreover, data scarcity and specialized vocabularies pose additional challenges. Recently, the advances in pre-trained language models, namely contextualized models such as ELMo and BERT, have revolutionized the field by tapping the potential of training…
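To make the joint-modeling setup concrete, below is a minimal sketch (not the authors' exact architecture) of joint intent detection and slot filling on top of a pre-trained BERT encoder. It assumes the Hugging Face `transformers` library; the class name, label-set sizes, and loss weighting are illustrative choices, not taken from the paper.

```python
# Sketch of a joint intent-detection + slot-filling model over BERT.
# Assumptions: Hugging Face `transformers` is installed; label sets are task-specific.
import torch
import torch.nn as nn
from transformers import BertModel

class JointIntentSlotModel(nn.Module):
    def __init__(self, num_intents, num_slot_labels, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.dropout = nn.Dropout(0.1)
        # Sentence-level head: intent classification from the pooled [CLS] representation.
        self.intent_head = nn.Linear(hidden, num_intents)
        # Token-level head: per-token slot tagging (e.g., BIO labels).
        self.slot_head = nn.Linear(hidden, num_slot_labels)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        sequence_output = self.dropout(outputs.last_hidden_state)  # (batch, seq_len, hidden)
        pooled_output = self.dropout(outputs.pooler_output)        # (batch, hidden)
        intent_logits = self.intent_head(pooled_output)            # (batch, num_intents)
        slot_logits = self.slot_head(sequence_output)              # (batch, seq_len, num_slot_labels)
        return intent_logits, slot_logits

def joint_loss(intent_logits, slot_logits, intent_labels, slot_labels, ignore_index=-100):
    # Joint training sums the sentence-level and token-level losses; the relative
    # weighting of the two terms is a modeling choice, fixed to 1:1 here.
    intent_loss = nn.functional.cross_entropy(intent_logits, intent_labels)
    slot_loss = nn.functional.cross_entropy(
        slot_logits.view(-1, slot_logits.size(-1)),
        slot_labels.view(-1),
        ignore_index=ignore_index,  # mask padding and special tokens
    )
    return intent_loss + slot_loss
```

Because both heads share the same encoder, gradients from intent and slot supervision update the same BERT parameters, which is what makes the two tasks mutually reinforcing in the joint setting the abstract refers to.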
References
- Joint Slot Filling and Intent Detection via Capsule Neural Networks. ACL, 2019.
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL-HLT, 2019.
- A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling. ACL, 2019.
- Multi-Domain Joint Semantic Frame Parsing Using Bi-Directional RNN-LSTM. INTERSPEECH, 2016.
- Joint Multiple Intent Detection and Slot Labeling for Goal-Oriented Dialog. NAACL-HLT, 2019.
- Slot-Gated Modeling for Joint Slot Filling and Intent Prediction. NAACL-HLT, 2018.
- Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling. INTERSPEECH, 2016.