A Neural Attention Model for Abstractive Sentence Summarization


Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
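The core mechanism the abstract describes — generating each summary word conditioned on the input via attention — can be illustrated with a minimal sketch. This is not the paper's actual model (which uses learned embeddings, a smoothed local attention window, and a neural language model); it is a hypothetical toy showing only the attention step: scoring each input position against a decoder query, normalizing with a softmax, and forming a context vector as the weighted sum of encoder states. All names and the toy vectors are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_context(enc_states, query):
    """Dot-product attention: weight each input position by its
    similarity to the query, then return the weighted sum (context)."""
    scores = enc_states @ query          # one score per input word, shape (T,)
    weights = softmax(scores)            # attention distribution over the input
    context = weights @ enc_states       # context vector, shape (d,)
    return context, weights

# Toy example: 4 input-word representations of dimension 3,
# and a query summarizing the summary-so-far (values are made up).
enc_states = np.array([[1., 0., 0.],
                       [0., 1., 0.],
                       [0., 0., 1.],
                       [1., 1., 0.]])
query = np.array([1., 0., 0.])

context, weights = attention_context(enc_states, query)
```

In the full model, the context vector would feed a softmax over the output vocabulary to score the next summary word, and the whole pipeline is trained end-to-end on (sentence, headline) pairs.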
