A Neural Attention Model for Abstractive Sentence Summarization

@inproceedings{Rush2015ANA,
  title={A Neural Attention Model for Abstractive Sentence Summarization},
  author={Alexander M. Rush and Sumit Chopra and Jason Weston},
  booktitle={EMNLP},
  year={2015}
}
Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows…
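
The abstract describes a local attention-based encoder paired with a feed-forward neural language model that scores each next summary word conditioned on the input sentence and the previous summary words. Below is a minimal NumPy sketch of that idea; the dimensions, weight names, and helper functions (attention_encoder, next_word_logits) are illustrative assumptions, and the paper's local smoothing of input embeddings is omitted for brevity. It is a sketch of the general technique, not the authors' implementation.

import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical sizes (not the paper's experimental settings).
V_SIZE, D, H, C = 5000, 64, 128, 5   # vocab, embedding dim, hidden dim, context window

rng = np.random.default_rng(0)
E = rng.normal(scale=0.1, size=(V_SIZE, D))     # input-word embeddings
F = rng.normal(scale=0.1, size=(V_SIZE, D))     # summary-context embeddings
P = rng.normal(scale=0.1, size=(D, C * D))      # attention interaction weights
U = rng.normal(scale=0.1, size=(H, C * D))      # language-model hidden weights
Vout = rng.normal(scale=0.1, size=(V_SIZE, H))  # hidden state -> vocabulary scores
W = rng.normal(scale=0.1, size=(V_SIZE, D))     # encoder output -> vocabulary scores

def attention_encoder(x_ids, yc_ids):
    """Weighted average of input-word embeddings, with weights
    conditioned on the embedded summary context."""
    x_emb = E[x_ids]                  # (M, D) embedded input sentence
    yc_emb = F[yc_ids].reshape(-1)    # (C*D,) embedded context window
    scores = x_emb @ (P @ yc_emb)     # (M,) one score per input position
    p = softmax(scores)               # attention distribution over input words
    return p @ x_emb                  # (D,) context vector

def next_word_logits(x_ids, yc_ids):
    """Combine the context language-model term with the attention
    encoder term to score every candidate next summary word."""
    yc_emb = F[yc_ids].reshape(-1)    # (C*D,)
    h = np.tanh(U @ yc_emb)           # (H,) feed-forward LM hidden state
    return Vout @ h + W @ attention_encoder(x_ids, yc_ids)

# Toy usage: score the next summary word for random token ids.
x_ids = rng.integers(0, V_SIZE, size=20)   # input sentence token ids
yc_ids = rng.integers(0, V_SIZE, size=C)   # previous C summary token ids
probs = softmax(next_word_logits(x_ids, yc_ids))
print(probs.argmax(), probs.max())

Generating a full summary would repeatedly apply this next-word distribution (e.g. with beam search), shifting the newest word into the context window at each step.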