Deep Communicating Agents for Abstractive Summarization

@inproceedings{Celikyilmaz2018DeepCA,
  title={Deep Communicating Agents for Abstractive Summarization},
  author={Asli Çelikyilmaz and Antoine Bosselut and Xiaodong He and Yejin Choi},
  booktitle={NAACL-HLT},
  year={2018}
}
We present deep communicating agents in an encoder-decoder architecture to address the challenges of representing a long document for abstractive summarization. With deep communicating agents, the task of encoding a long text is divided across multiple collaborating agents, each in charge of a subsection of the input text. These encoders are connected to a single decoder, trained end-to-end using reinforcement learning to generate a focused and coherent summary. Empirical results demonstrate…
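The architecture described above divides the encoding of a long document across several agents and lets a single attentive decoder consume all of their outputs. Below is a minimal sketch of that idea in PyTorch; the class and parameter names (AgentEncoder, CommunicatingSummarizer, a single mean-message communication round) are illustrative assumptions, not the authors' implementation, which additionally uses hierarchical agent encoders, multiple communication layers, and reinforcement-learning training on top of word-level losses.

import torch
import torch.nn as nn


class AgentEncoder(nn.Module):
    """One agent: encodes its chunk of the document (names are illustrative)."""

    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Mixes this agent's token states with the message received from the others.
        self.message_proj = nn.Linear(2 * hid_dim, hid_dim)

    def forward(self, chunk_ids):
        outputs, (h, _) = self.lstm(self.embed(chunk_ids))
        return outputs, h[-1]  # per-token states and a chunk summary vector


class CommunicatingSummarizer(nn.Module):
    """Multiple agent encoders, one communication round, one attentive decoder."""

    def __init__(self, vocab_size, num_agents=3, hid_dim=256):
        super().__init__()
        self.agents = nn.ModuleList(
            [AgentEncoder(vocab_size, hid_dim=hid_dim) for _ in range(num_agents)]
        )
        self.dec_embed = nn.Embedding(vocab_size, hid_dim)
        self.decoder = nn.LSTM(hid_dim, hid_dim, batch_first=True)
        self.attn = nn.Linear(hid_dim, hid_dim)
        self.out = nn.Linear(2 * hid_dim, vocab_size)

    def forward(self, chunks, prev_summary_ids):
        # 1) Each agent encodes its own subsection of the input independently.
        encoded = [agent(chunk) for agent, chunk in zip(self.agents, chunks)]
        token_states = [tok for tok, _ in encoded]
        summaries = torch.stack([s for _, s in encoded], dim=1)      # (B, A, H)

        # 2) One communication round: every agent receives the mean of the
        #    other agents' summary vectors and updates its token states.
        total = summaries.sum(dim=1, keepdim=True)
        messages = (total - summaries) / max(len(self.agents) - 1, 1)
        updated = []
        for i, (tok, agent) in enumerate(zip(token_states, self.agents)):
            msg = messages[:, i].unsqueeze(1).expand_as(tok)
            updated.append(torch.tanh(agent.message_proj(torch.cat([tok, msg], dim=-1))))
        memory = torch.cat(updated, dim=1)                           # (B, sum_T, H)

        # 3) A single decoder attends over all agents' outputs at every step.
        dec_states, _ = self.decoder(self.dec_embed(prev_summary_ids))
        scores = torch.bmm(self.attn(dec_states), memory.transpose(1, 2))
        context = torch.bmm(torch.softmax(scores, dim=-1), memory)
        return self.out(torch.cat([dec_states, context], dim=-1))    # (B, T_dec, V)


# Toy usage: three agents, batch of two documents split into 40-token chunks.
model = CommunicatingSummarizer(vocab_size=5000)
chunks = [torch.randint(0, 5000, (2, 40)) for _ in range(3)]
prev = torch.randint(0, 5000, (2, 10))   # shifted summary tokens (teacher forcing)
logits = model(chunks, prev)             # (2, 10, 5000)

In this sketch the logits would be trained with a standard cross-entropy loss; the paper's reinforcement-learning objective (e.g. rewarding sampled summaries against references) would sit on top of the same forward pass.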
