
Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection

@article{Hardalov2020EnrichedPT,
  title={Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection},
  author={Momchil Hardalov and Ivan Koychev and Preslav Nakov},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.14848}
}
Abstract: Detecting the user's intent and finding the corresponding slots among the utterance's words are important tasks in natural language understanding. Their interconnected nature makes their joint modeling a standard part of training such models. Moreover, data scarceness and specialized vocabularies pose additional challenges. Recently, the advances in pre-trained language models, namely contextualized models such as ELMo and BERT, have revolutionized the field by tapping the potential of training…
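To make the joint formulation concrete, below is a minimal sketch of the common BERT-based setup the abstract alludes to: a single pre-trained encoder feeding two heads, one predicting the utterance-level intent from the pooled [CLS] representation and one tagging every token with a slot label. This is an illustrative assumption, not the paper's exact architecture; the class name JointIntentSlotModel, the label counts, and the loss in the closing comment are all hypothetical. It uses PyTorch and the Hugging Face transformers library.

import torch
from torch import nn
from transformers import BertModel, BertTokenizerFast

class JointIntentSlotModel(nn.Module):
    """One shared BERT encoder with two task heads (hypothetical sketch)."""

    def __init__(self, num_intents, num_slot_labels, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Intent head: sentence-level classification over the pooled [CLS] vector.
        self.intent_head = nn.Linear(hidden, num_intents)
        # Slot head: token-level classification (e.g. BIO slot tags) per subword.
        self.slot_head = nn.Linear(hidden, num_slot_labels)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_head(out.pooler_output)    # (batch, num_intents)
        slot_logits = self.slot_head(out.last_hidden_state)    # (batch, seq_len, num_slot_labels)
        return intent_logits, slot_logits

# Illustrative usage; the label counts below are placeholders, not dataset values.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = JointIntentSlotModel(num_intents=7, num_slot_labels=21)
batch = tokenizer(["book a flight from sofia to boston"], return_tensors="pt")
with torch.no_grad():
    intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])

# Joint fine-tuning typically minimizes the sum of the two cross-entropy losses:
#   loss = nn.functional.cross_entropy(intent_logits, gold_intents) \
#        + nn.functional.cross_entropy(slot_logits.transpose(1, 2), gold_slot_tags)

Fine-tuning the shared encoder under the summed loss is what couples the two tasks: gradients from both the sentence-level intent objective and the token-level slot objective update the same contextual representations.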
