Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding

@article{Mesnil2015UsingRN,
  title={Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding},
  author={G. Mesnil and Yann Dauphin and K. Yao and Yoshua Bengio and L. Deng and D. Hakkani-Tur and X. He and Larry Heck and G. Tur and Dong Yu and G. Zweig},
  journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing},
  year={2015},
  volume={23},
  pages={530--539}
}
Abstract

Semantic slot filling is one of the most challenging problems in spoken language understanding (SLU). In this paper, we propose to use recurrent neural networks (RNNs) for this task, and present several novel architectures designed to efficiently model past and future temporal dependencies. Specifically, we implemented and compared several important RNN architectures, including Elman, Jordan, and hybrid variants. To facilitate reproducibility, we implemented these networks with the publicly…
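For readers unfamiliar with the setup, slot filling assigns a label to every token of an utterance (e.g., tagging "boston" as a departure city), and an Elman RNN carries a hidden state forward across tokens so each prediction can condition on past context. The following is a minimal, hypothetical sketch of such a tagger, not the authors' implementation; it assumes PyTorch, and the class name, layer sizes, and label count are invented for illustration:

    # Minimal Elman-style RNN slot tagger (illustrative sketch, not the paper's code).
    import torch
    import torch.nn as nn

    class ElmanSlotTagger(nn.Module):
        """Predicts one slot label per input token.

        nn.RNN implements the Elman recurrence
        h_t = tanh(W x_t + U h_{t-1} + b), so each step sees past context.
        Passing bidirectional=True would also model future context,
        in the spirit of the bidirectional variants the abstract mentions.
        """
        def __init__(self, vocab_size, embed_dim, hidden_dim, num_slots):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, num_slots)

        def forward(self, token_ids):           # token_ids: (batch, seq_len)
            x = self.embed(token_ids)           # (batch, seq_len, embed_dim)
            h, _ = self.rnn(x)                  # (batch, seq_len, hidden_dim)
            return self.out(h)                  # per-token slot logits

    # Toy usage: tag a 4-token utterance (all sizes are arbitrary).
    model = ElmanSlotTagger(vocab_size=1000, embed_dim=50, hidden_dim=100, num_slots=7)
    tokens = torch.randint(0, 1000, (1, 4))
    logits = model(tokens)                      # shape: (1, 4, 7)
    print(logits.argmax(-1))                    # predicted slot label per token

A Jordan-style variant differs in that the recurrence feeds back the previous output (label) distribution rather than the previous hidden state; the hybrid variants in the paper combine both feedback paths.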
Citations

  • Label-Dependency Coding in Simple Recurrent Networks for Spoken Language Understanding (11 citations)
  • ClockWork-RNN Based Architectures for Slot Filling
  • Recurrent Neural Network Structured Output Prediction for Spoken Language Understanding (40 citations)
  • Modeling with Recurrent Neural Networks for Open Vocabulary Slots (2 citations)
  • A Comparison of Deep Learning Methods for Language Understanding (2 citations)
  • Label-Dependencies Aware Recurrent Neural Networks (11 citations)
  • Joint Online Spoken Language Understanding and Language Modeling With Recurrent Neural Networks (53 citations)
  • A Joint Model of Intent Determination and Slot Filling for Spoken Language Understanding (110 citations)
