A Neural Attention Model for Abstractive Sentence Summarization

@article{Rush2015ANA,
  title={A Neural Attention Model for Abstractive Sentence Summarization},
  author={Alexander M. Rush and Sumit Chopra and Jason Weston},
  journal={ArXiv},
  year={2015},
  volume={abs/1509.00685}
}
Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. [...] Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
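The abstract describes a decoder that attends over the input sentence and generates each summary word from an attention-weighted context. Below is a minimal numpy sketch of one such decoding step, under loose assumptions: the function name `attention_step`, the weight matrices `W_att`/`W_out`, and the random toy dimensions are all illustrative, not taken from the paper's actual architecture.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_step(enc_states, dec_state, W_att, W_out):
    """One decoding step: attend over encoder states, emit a word distribution.

    enc_states: (n_words, d) hypothetical encoder vectors for the input sentence
    dec_state:  (d,) hypothetical decoder/context state for the current position
    """
    scores = enc_states @ (W_att @ dec_state)   # alignment score per input word
    alpha = softmax(scores)                     # local attention weights
    context = alpha @ enc_states                # attention-weighted input summary
    logits = W_out @ np.concatenate([context, dec_state])
    return softmax(logits), alpha               # next-word distribution, weights

# Toy demo with random parameters (illustrative only).
rng = np.random.default_rng(0)
d, n_words, vocab = 8, 5, 20
enc = rng.normal(size=(n_words, d))
dec = rng.normal(size=d)
W_att = rng.normal(size=(d, d))
W_out = rng.normal(size=(vocab, 2 * d))
probs, alpha = attention_step(enc, dec, W_att, W_out)
```

Generating a summary would repeat this step, feeding each emitted word back into the decoder state; because every operation is differentiable, the whole pipeline can be trained end-to-end, as the abstract notes.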

