A Neural Attention Model for Abstractive Sentence Summarization


Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
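The abstract's core idea, generating each summary word from an attention-weighted encoding of the input sentence, can be illustrated with a minimal sketch. This is not the authors' implementation; all shapes, names, and the simple mean-pooled context are assumptions for illustration only.

```python
# Minimal sketch of attention-based next-word scoring, in the spirit of a
# local attention summarization model (hypothetical names and shapes; not
# the paper's actual architecture).
import numpy as np

rng = np.random.default_rng(0)

V, D = 10, 4          # vocabulary size, embedding dimension (assumed)
x_ids = [2, 5, 7]     # input sentence as word ids (toy example)
E = rng.normal(size=(V, D))        # word embeddings (assumed shared)
W_out = rng.normal(size=(V, D))    # output projection (assumed)

def softmax(z):
    z = z - z.max()                # numerical stability
    e = np.exp(z)
    return e / e.sum()

def next_word_scores(context_ids):
    """Score each vocabulary word given the decoder context and the input."""
    X = E[x_ids]                       # input embeddings, shape (len(x), D)
    c = E[context_ids].mean(axis=0)    # crude context summary (assumption)
    attn = softmax(X @ c)              # attention weights over input words
    enc = attn @ X                     # attention-weighted input encoding
    return softmax(W_out @ (enc + c))  # distribution over the next word

p = next_word_scores([1, 3])           # probabilities for the next summary word
```

A full decoder would repeat this step, appending the chosen word to the context, and would be trained end-to-end on (sentence, summary) pairs as the abstract describes.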



7 Figures and Tables


Semantic Scholar estimates that this publication has 329 citations based on the available data.


Cite this paper

@inproceedings{Rush2015ANA,
  title     = {A Neural Attention Model for Abstractive Sentence Summarization},
  author    = {Alexander M. Rush and Sumit Chopra and Jason Weston},
  booktitle = {EMNLP},
  year      = {2015}
}