Corpus ID: 211204927

Guiding attention in Sequence-to-sequence models for Dialogue Act prediction

@article{Colombo2020GuidingAI,
  title={Guiding attention in Sequence-to-sequence models for Dialogue Act prediction},
  author={Pierre Colombo and Emile Chapuis and Matteo Manica and Emmanuel Vignon and Giovanna Varni and C. Clavel},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.08801}
}
  • Pierre Colombo, Emile Chapuis, Matteo Manica, Emmanuel Vignon, Giovanna Varni, C. Clavel
  • Published in AAAI 2020
  • Computer Science
  • ArXiv
  • The task of predicting dialog acts (DA) based on conversational dialog is a key component in the development of conversational agents. Accurately predicting DAs requires a precise modeling of both the conversation and the global tag dependencies. We leverage seq2seq approaches widely adopted in Neural Machine Translation (NMT) to improve the modelling of tag sequentiality. Seq2seq models are known to learn complex global dependencies while currently proposed approaches using linear conditional…
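
  Below is a minimal sketch of the kind of seq2seq dialogue-act tagger the abstract describes, written in PyTorch under illustrative assumptions: a hierarchical GRU encoder over words and utterances, and a GRU decoder that emits one DA tag per utterance while conditioning on the previously predicted tag. The class names (HierarchicalEncoder, TagDecoder), all dimensions, and greedy decoding are assumptions made for this sketch; the paper's guided attention mechanism and beam search are not reproduced here.

    import torch
    import torch.nn as nn


    class HierarchicalEncoder(nn.Module):
        """Word-level GRU builds one vector per utterance; an utterance-level
        GRU then builds a conversation-level context for each utterance."""

        def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.utt_rnn = nn.GRU(hid_dim, hid_dim, batch_first=True)

        def forward(self, dialog):
            # dialog: (batch, n_utterances, n_tokens) of word ids
            b, n_utt, n_tok = dialog.shape
            words = self.embed(dialog.view(b * n_utt, n_tok))
            _, utt_vec = self.word_rnn(words)                # (1, b*n_utt, hid)
            utt_vec = utt_vec.squeeze(0).view(b, n_utt, -1)  # one vector per utterance
            ctx, last = self.utt_rnn(utt_vec)                # conversation-level context
            return ctx, last


    class TagDecoder(nn.Module):
        """Emits one DA tag per utterance, conditioning each step on the
        previously predicted tag so tag-to-tag dependencies are modelled
        globally (greedy decoding stands in for beam search here)."""

        def __init__(self, n_tags, hid_dim=128, tag_emb=32):
            super().__init__()
            self.sos = n_tags                                 # extra <sos> tag id
            self.tag_embed = nn.Embedding(n_tags + 1, tag_emb)
            self.rnn = nn.GRU(tag_emb + hid_dim, hid_dim, batch_first=True)
            self.out = nn.Linear(hid_dim, n_tags)

        def forward(self, ctx, hidden):
            b, n_utt, _ = ctx.shape
            prev = torch.full((b,), self.sos, dtype=torch.long, device=ctx.device)
            logits = []
            for t in range(n_utt):
                step_in = torch.cat([self.tag_embed(prev), ctx[:, t]], dim=-1)
                out, hidden = self.rnn(step_in.unsqueeze(1), hidden)
                step_logits = self.out(out.squeeze(1))
                logits.append(step_logits)
                prev = step_logits.argmax(dim=-1)
            return torch.stack(logits, dim=1)                 # (b, n_utt, n_tags)


    if __name__ == "__main__":
        enc, dec = HierarchicalEncoder(vocab_size=1000), TagDecoder(n_tags=42)
        dialog = torch.randint(1, 1000, (2, 5, 12))           # 2 dialogues, 5 utterances
        ctx, last = enc(dialog)
        print(dec(ctx, last).shape)                           # torch.Size([2, 5, 42])

  Training such a model would typically use teacher forcing on gold tags with a per-utterance cross-entropy loss; the paper's guided attention and train-time beam search would sit on top of a skeleton like this.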

    Citations

    The Need to Move beyond Triples

    References

    Seq2seq Dependency Parsing

    Sequence to Sequence Learning with Neural Networks

    Deep Learning

    Adam: A Method for Stochastic Optimization

    Learning phrase representations using RNN encoder-decoder for statistical machine translation

    • K. Cho, B. van Merrienboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, Y. Bengio
    • 2014

    Dialog act modelling for conversational speech
