Deep Communicating Agents for Abstractive Summarization

@inproceedings{Celikyilmaz2018DeepCA,
  title={Deep Communicating Agents for Abstractive Summarization},
  author={Asli Çelikyilmaz and Antoine Bosselut and Xiaodong He and Yejin Choi},
  booktitle={NAACL-HLT},
  year={2018}
}
We present deep communicating agents in an encoder-decoder architecture to address the challenges of representing a long document for abstractive summarization. With deep communicating agents, the task of encoding a long text is divided across multiple collaborating agents, each in charge of a subsection of the input text. These encoders are connected to a single decoder, trained end-to-end using reinforcement learning to generate a focused and coherent summary. Empirical results demonstrate…
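
The abstract describes the architecture at a high level: the document is split into subsections, each encoded by a separate agent, the agents exchange information, and a single decoder attends over their combined outputs. The sketch below illustrates that idea in PyTorch. It is a minimal, hypothetical reconstruction for intuition only: the module names, layer sizes, chunking scheme, and mean-based message passing are assumptions rather than the paper's actual formulation, and the decoder and training loop are omitted.

# Minimal sketch of communicating encoder agents (assumed PyTorch code, not the
# paper's implementation). Each agent encodes one chunk of the document, receives
# a simple "message" (the mean of the other agents' summary vectors), and the
# concatenated outputs form the memory a single decoder would attend over.
import torch
import torch.nn as nn

class AgentEncoder(nn.Module):
    """Encodes one chunk and mixes in a message from peer agents."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # Projects [own state; peers' summary] back to the state dimension.
        self.msg_proj = nn.Linear(4 * hid_dim, 2 * hid_dim)

    def encode(self, chunk_ids):
        out, _ = self.lstm(self.embed(chunk_ids))            # (B, T, 2H)
        return out

    def communicate(self, own_states, peer_summary):
        # Broadcast the peers' summary to every time step and fuse it in.
        peer = peer_summary.unsqueeze(1).expand(-1, own_states.size(1), -1)
        return torch.tanh(self.msg_proj(torch.cat([own_states, peer], dim=-1)))

class CommunicatingEncoders(nn.Module):
    """One agent per document chunk, with a single round of message passing."""
    def __init__(self, vocab_size, num_agents=3, hid_dim=256):
        super().__init__()
        self.agents = nn.ModuleList(
            [AgentEncoder(vocab_size, hid_dim=hid_dim) for _ in range(num_agents)]
        )

    def forward(self, chunks):
        # chunks: list of (B, T_i) token-id tensors, one chunk per agent.
        states = [agent.encode(chunk) for agent, chunk in zip(self.agents, chunks)]
        summaries = [s.mean(dim=1) for s in states]          # (B, 2H) per agent
        mixed = []
        for i, (agent, own) in enumerate(zip(self.agents, states)):
            peers = [s for j, s in enumerate(summaries) if j != i]
            peer_summary = torch.stack(peers).mean(dim=0)    # averaged peer message
            mixed.append(agent.communicate(own, peer_summary))
        # A single attentional decoder would attend over this shared memory.
        return torch.cat(mixed, dim=1)                       # (B, sum T_i, 2H)

if __name__ == "__main__":
    vocab = 1000
    encoders = CommunicatingEncoders(vocab)
    chunks = [torch.randint(0, vocab, (2, 20)) for _ in range(3)]
    memory = encoders(chunks)
    print(memory.shape)   # torch.Size([2, 60, 512])

Per the abstract, the full system connects these encoders to a single decoder and is trained end-to-end with reinforcement learning; that decoder and training objective are not reproduced in the sketch above.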

Citations

Publications citing this paper.
Showing 5 of 58 citations.

A Robust Abstractive System for Cross-Lingual Summarization
View 12 excerpts · Cites methods and background · Highly influenced

Automatic Structured Text Summarization with Concept Maps
View 6 excerpts · Cites methods and background · Highly influenced

Multi-Agent Discussion Mechanism for Natural Language Generation
View 6 excerpts · Cites methods and background · Highly influenced

Bottom-Up Abstractive Summarization
View 4 excerpts · Cites results and background · Highly influenced

Bidirectional Context-Aware Hierarchical Attention Network for Document Understanding
Jean-Baptiste Remy, Antoine Jean-Pierre Tixier, Michalis Vazirgiannis · ArXiv · 2019
View 3 excerpts · Cites background · Highly influenced

CITATION STATISTICS

  • 13 highly influenced citations

  • Averaged 29 citations per year from 2018 through 2019

  • 214% increase in citations per year in 2019 over 2018
