A Fast Attention Network for Joint Intent Detection and Slot Filling on Edge Devices

@article{Huang2022AFA,
  title={A Fast Attention Network for Joint Intent Detection and Slot Filling on Edge Devices},
  author={Liang Huang and Senjie Liang and Feiyang Ye and Nan Gao},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.07646}
}
Intent detection and slot filling are two main tasks in natural language understanding and play an essential role in task-oriented dialogue systems. The joint learning of both tasks can improve inference accuracy and is popular in recent works. However, most joint models ignore the inference latency and cannot meet the need to deploy dialogue systems at the edge. In this paper, we propose a Fast Attention Network (FAN) for joint intent detection and slot filling tasks, guaranteeing both accuracy…
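The abstract describes the standard joint-modeling setup: one shared encoder feeds both an utterance-level intent classifier and a token-level slot tagger, so the two tasks inform each other during training. Below is a minimal, hedged sketch of that generic pattern, not FAN's actual fast-attention architecture; all layer choices and sizes are illustrative assumptions.

# Minimal sketch (not the paper's FAN implementation): a shared encoder
# with an utterance-level intent head and a token-level slot head.
# Hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

class JointIntentSlotModel(nn.Module):
    def __init__(self, vocab_size, num_intents, num_slots, d_model=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Shared contextual encoder; FAN itself replaces this with a
        # fast attention module to cut inference latency at the edge.
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.intent_head = nn.Linear(d_model, num_intents)  # utterance-level
        self.slot_head = nn.Linear(d_model, num_slots)      # token-level

    def forward(self, token_ids):
        hidden = self.encoder(self.embed(token_ids))          # (B, T, d_model)
        intent_logits = self.intent_head(hidden.mean(dim=1))  # pool over tokens
        slot_logits = self.slot_head(hidden)                  # per-token tags
        return intent_logits, slot_logits

# Joint training sums both task losses so the shared encoder serves both tasks.
model = JointIntentSlotModel(vocab_size=10000, num_intents=7, num_slots=72)
tokens = torch.randint(0, 10000, (2, 16))
intent_logits, slot_logits = model(tokens)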


References

Showing 1-10 of 45 references
Joint Intent Detection and Slot Filling with Wheel-Graph Attention Networks
TLDR: Proposes a new joint model with a wheel-graph attention network (Wheel-GAT) that directly models interrelated connections for single-intent detection and slot filling, and demonstrates that the approach is superior to multiple baselines on the ATIS and SNIPS datasets.
A Co-Interactive Transformer for Joint Slot Filling and Intent Detection
TLDR: Proposes a co-interactive module that captures the cross-impact between the two related tasks by building a bidirectional connection, so that slot and intent representations can attend to the corresponding mutual information.
A Self-Attentive Model with Gate Mechanism for Spoken Language Understanding
TLDR: Proposes a novel self-attentive model with a gate mechanism that fully utilizes the semantic correlation between slot and intent, outperforming other popular methods by a large margin in both intent detection error rate and slot filling F1-score.
A Stack-Propagation Framework with Token-Level Intent Detection for Spoken Language Understanding
TLDR: Proposes a novel framework for SLU that better incorporates intent information to further guide slot filling, achieving state-of-the-art performance and outperforming previous methods by a large margin.
A Label-Aware BERT Attention Network for Zero-Shot Multi-Intent Detection in Spoken Language Understanding
TLDR: Proposes a Label-Aware BERT Attention Network (LABAN) for zero-shot multi-intent detection and shows that it successfully extends to few- and zero-shot settings where some intent labels are unseen in the training data, by also taking into account the semantics of these unseen intent labels.
Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling
TLDR: Proposes an attention-based neural network model for joint intent detection and slot filling, both of which are critical steps for many speech understanding and dialog systems.
A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling
TLDR: Proposes a novel bi-directional interrelated model for joint intent detection and slot filling, introducing an SF-ID network that establishes direct connections between the two tasks so that they reinforce each other.
A Joint Learning Framework With BERT for Spoken Language Understanding
TLDR: Proposes a novel encoder-decoder, multi-task learning model that jointly trains intent classification and slot filling and outperforms state-of-the-art approaches.
A Joint Model of Intent Determination and Slot Filling for Spoken Language Understanding
TLDR: Proposes a joint model based on the idea that the intent and semantic slots of a sentence are correlated, and shows that it outperforms state-of-the-art approaches on both tasks.
BERT for Joint Intent Classification and Slot Filling
TLDR: Proposes a joint intent classification and slot filling model based on BERT that achieves significant improvements in intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to attention-based recurrent neural network models and slot-gated models (a minimal sketch of this pattern follows the list).
...
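For the BERT entry above, the widely used recipe is to classify the intent from the [CLS] position and tag slots from the per-token outputs. A minimal sketch under that assumption, using the HuggingFace transformers API; the model name and head sizes are illustrative, and this is not the referenced paper's released code.

# Hedged sketch of the BERT joint recipe: the [CLS] output drives intent
# classification, per-token outputs drive slot tagging. Illustrative only.
import torch.nn as nn
from transformers import AutoModel

class BertJointModel(nn.Module):
    def __init__(self, num_intents, num_slots, name="bert-base-uncased"):
        super().__init__()
        self.bert = AutoModel.from_pretrained(name)
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        self.intent_head = nn.Linear(hidden, num_intents)
        self.slot_head = nn.Linear(hidden, num_slots)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] token representation
        return self.intent_head(cls), self.slot_head(out.last_hidden_state)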