Corpus ID: 8824994

Knowledge as a Teacher: Knowledge-Guided Structural Attention Networks

@article{Chen2016KnowledgeAA,
  title={Knowledge as a Teacher: Knowledge-Guided Structural Attention Networks},
  author={Yun-Nung Chen and Dilek Z. Hakkani-T{\"u}r and G{\"o}khan T{\"u}r and Asli Çelikyilmaz and Jianfeng Gao and Li Deng},
  journal={ArXiv},
  year={2016},
  volume={abs/1609.03286}
}
Natural language understanding (NLU) is a core component of a spoken dialogue system. Recently, recurrent neural networks (RNNs) have obtained strong results on NLU due to their superior ability to preserve sequential information over time. Traditionally, the NLU module tags semantic slots for utterances considering their flat structures, as the underlying RNN structure is a linear chain. However, natural language exhibits linguistic properties that provide rich, structured information for better…
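
The abstract contrasts flat, linear-chain RNN slot tagging with models that exploit richer linguistic structure. As a rough illustration of that idea (a minimal sketch, not the paper's K-SAN architecture), the following PyTorch snippet tags slots with a BiLSTM over the flat token sequence and lets each token attend over a few knowledge-derived substructures such as dependency subtrees; the class name, the bag-of-embeddings substructure encoder, and all dimensions are illustrative assumptions.

    # Minimal sketch (not the authors' code): a BiLSTM slot tagger whose
    # per-token states attend over knowledge-derived substructures, so prior
    # structural knowledge can re-weight the flat sequence representation.
    import torch
    import torch.nn as nn

    class KnowledgeGuidedTagger(nn.Module):
        def __init__(self, vocab_size, num_slots, emb_dim=32, hid_dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Linear-chain encoder: the "flat structure" the abstract mentions.
            self.rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                               bidirectional=True)
            # Each substructure (e.g., a dependency subtree) is encoded here
            # as a bag of its token embeddings -- an illustrative choice.
            self.know_proj = nn.Linear(emb_dim, 2 * hid_dim)
            self.tag = nn.Linear(4 * hid_dim, num_slots)

        def forward(self, tokens, substructures):
            # tokens: (batch, seq_len); substructures: (batch, k, sub_len)
            h, _ = self.rnn(self.embed(tokens))               # (b, s, 2*hid)
            sub = self.know_proj(
                self.embed(substructures).mean(dim=2))        # (b, k, 2*hid)
            # Attention of each token over the k substructures.
            scores = torch.einsum("bsh,bkh->bsk", h, sub)
            context = torch.einsum("bsk,bkh->bsh",
                                   torch.softmax(scores, dim=-1), sub)
            # Per-token slot logits from the RNN state plus knowledge context.
            return self.tag(torch.cat([h, context], dim=-1))

    # Toy usage: a 5-token utterance with 3 candidate substructures.
    model = KnowledgeGuidedTagger(vocab_size=100, num_slots=10)
    out = model(torch.randint(0, 100, (1, 5)), torch.randint(0, 100, (1, 3, 4)))
    print(out.shape)  # torch.Size([1, 5, 10])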


    Citations

    Publications citing this paper.
    SHOWING 1-10 OF 25 CITATIONS

    Syntax or semantics? Knowledge-guided joint semantic frame parsing

    Contextual and Structural Language Understanding in Dialogues

    • Yun-Nung Chen
    • 2017
    HIGHLY INFLUENCED


    End-to-end joint learning of natural language understanding and dialogue manager

    Domain Transfer for Deep Natural Language Generation from Abstract Meaning Representations

    • Nina Dethlefs
    • Computer Science
    • IEEE Computational Intelligence Magazine
    • 2017

    Knowledge-aware Attentive Neural Network for Ranking Question Answer Pairs



    CITATION STATISTICS

    • 1 Highly Influenced Citation

    References

    Publications referenced by this paper.
    SHOWING 1-10 OF 51 REFERENCES

    Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding


    Spoken language understanding using long short-term memory neural networks


    Memory Networks
