Corpus ID: 4524968

Neural Extractive Summarization with Side Information

@article{Narayan2017NeuralES,
  title={Neural Extractive Summarization with Side Information},
  author={Shashi Narayan and Nikos Papasarantopoulos and Mirella Lapata and Shay B. Cohen},
  journal={ArXiv},
  year={2017},
  volume={abs/1704.04530}
}
Most extractive summarization methods focus on the main body of the document from which sentences need to be extracted. However, the gist of the document may lie in side information, such as the title and image captions, which are often available for newswire articles. We propose to explore side information in the context of single-document extractive summarization. We develop a framework for single-document summarization composed of a hierarchical document encoder and an attention-based extractor…
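To make the architecture sketched in the abstract concrete, below is a minimal PyTorch sketch of a hierarchical document encoder with an attention-based extractor that attends over side information such as the title and image captions. It is an illustration only: the module names, dimensions, and the exact form of the attention are assumptions of this sketch, not the authors' released implementation.

```python
# Hypothetical sketch of a hierarchical extractive summarizer that attends over
# side information (e.g. title and image captions). Module and variable names
# are illustrative, not taken from the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SideInfoExtractor(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Sentence-level encoder: word sequence -> sentence vector.
        self.sent_rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # Document-level encoder: sentence vectors -> contextualised sentence states.
        self.doc_rnn = nn.LSTM(2 * hid_dim, hid_dim, batch_first=True, bidirectional=True)
        # Attention over side-information vectors (title, captions).
        self.attn = nn.Linear(2 * hid_dim, 2 * hid_dim, bias=False)
        # Binary "extract this sentence?" scorer.
        self.score = nn.Linear(4 * hid_dim, 1)

    def encode_sentences(self, token_ids):
        # token_ids: (num_sents, max_words) -> (num_sents, 2 * hid_dim)
        emb = self.embed(token_ids)
        _, (h, _) = self.sent_rnn(emb)
        return torch.cat([h[0], h[1]], dim=-1)

    def forward(self, doc_tokens, side_tokens):
        # doc_tokens: (num_sents, max_words); side_tokens: (num_side, max_words)
        sent_vecs = self.encode_sentences(doc_tokens)          # (S, 2H)
        side_vecs = self.encode_sentences(side_tokens)         # (K, 2H)
        doc_states, _ = self.doc_rnn(sent_vecs.unsqueeze(0))   # (1, S, 2H)
        doc_states = doc_states.squeeze(0)                     # (S, 2H)
        # Each document sentence attends over the side information.
        scores = self.attn(doc_states) @ side_vecs.t()         # (S, K)
        weights = F.softmax(scores, dim=-1)
        side_context = weights @ side_vecs                     # (S, 2H)
        # Probability of extracting each sentence.
        logits = self.score(torch.cat([doc_states, side_context], dim=-1))
        return torch.sigmoid(logits).squeeze(-1)               # (S,)


if __name__ == "__main__":
    model = SideInfoExtractor(vocab_size=1000)
    doc = torch.randint(1, 1000, (6, 20))    # 6 document sentences, 20 tokens each
    side = torch.randint(1, 1000, (3, 10))   # e.g. title + 2 captions
    print(model(doc, side))                  # one extraction probability per sentence
```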
Sentence Selective Neural Extractive Summarization with Reinforcement Learning
  • Laifu Chen, M. Nguyen
  • Computer Science
  • 2019 11th International Conference on Knowledge and Systems Engineering (KSE)
  • 2019
TLDR
A sentence-level selective encoding mechanism to select important features before extracting sentences is developed, and a novel reinforcement learning based training algorithm is used to extend the sequence model.
Ranking Sentences for Extractive Summarization with Reinforcement Learning
TLDR
This paper conceptualizes extractive summarization as a sentence ranking task and proposes a novel training algorithm which globally optimizes the ROUGE evaluation metric through a reinforcement learning objective.
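The entry above trains an extractive ranker with a reinforcement-learning objective that optimizes ROUGE globally. The toy sketch below shows the general shape of such training with REINFORCE; the reward is a crude unigram-F1 stand-in for ROUGE, and the scorer, helper names, and sampling scheme are illustrative assumptions rather than the paper's algorithm.

```python
# Toy REINFORCE sketch: sample an extract, reward it at the summary level,
# and update the sentence scorer with a policy gradient. The reward is a
# unigram-F1 stand-in for ROUGE; all names are illustrative.
import torch
import torch.nn as nn


def unigram_f1(candidate, reference):
    """Crude stand-in for ROUGE: unigram overlap F1 between two strings."""
    cand, ref = set(candidate.lower().split()), set(reference.lower().split())
    if not cand or not ref:
        return 0.0
    overlap = len(cand & ref)
    if overlap == 0:
        return 0.0
    p, r = overlap / len(cand), overlap / len(ref)
    return 2 * p * r / (p + r)


class SentenceScorer(nn.Module):
    """Scores precomputed sentence vectors; higher score = more likely to extract."""
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1))

    def forward(self, sent_vecs):                # (num_sents, dim)
        return self.mlp(sent_vecs).squeeze(-1)   # (num_sents,)


def reinforce_step(scorer, optimizer, sent_vecs, sentences, reference, k=3):
    """Sample up to k sentences, reward the sampled summary, apply policy gradient."""
    probs = torch.sigmoid(scorer(sent_vecs))
    dist = torch.distributions.Bernoulli(probs=probs)
    actions = dist.sample()                      # 1 = extract the sentence
    picked = [s for s, a in zip(sentences, actions) if a.item() == 1][:k]
    reward = unigram_f1(" ".join(picked), reference)
    # REINFORCE: push up the log-prob of the sampled extraction, scaled by reward.
    loss = -dist.log_prob(actions).sum() * reward
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return reward


if __name__ == "__main__":
    sentences = ["the cat sat on the mat", "stocks fell sharply today",
                 "the market dropped amid fears", "weather was mild"]
    reference = "stock market dropped sharply amid fears"
    vecs = torch.randn(len(sentences), 32)       # stand-in sentence encodings
    scorer = SentenceScorer(32)
    opt = torch.optim.Adam(scorer.parameters(), lr=1e-3)
    for _ in range(5):
        print("reward:", reinforce_step(scorer, opt, vecs, sentences, reference))
```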
Memory-Based Extractive Summarization
TLDR
A memory-based extractive summarization (MES) model, mainly constructed from a memory generalization component and a sentence extractor, is proposed; it can store more information about features extracted from sentences, relationships between sentences, and implications of the document, thus giving richer representations for selecting sentences for the summary.
An Extraction-Abstraction Hybrid Approach for Long Document Summarization
TLDR
A hybrid of extractive and abstractive methods is proposed to tackle the long-document automatic summarization task; its ROUGE score exceeds that of the state-of-the-art model in the original NLPCC2017 Shared Task 3.
Don't Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization
We introduce extreme summarization, a new single-document summarization task which aims at creating a short, one-sentence news summary answering the question “What is the article about?”. We argue…
Reading Like HER: Human Reading Inspired Extractive Summarization
TLDR
This work re-examines the problem of extractive text summarization for long documents as a contextual-bandit problem and solves it with policy gradient, adopting a convolutional neural network to encode the gist of paragraphs for rough reading and a decision-making policy with an adapted termination mechanism for careful reading.
Transformer-based Model for Single Documents Neural Summarization
TLDR
A framework that encodes the source text first with a transformer and then with a sequence-to-sequence (seq2seq) model; the two components are found to complement each other adequately, making for a richer encoded vector representation.
Keyphrase Guided Beam Search for Neural Abstractive Text Summarization
TLDR
This work devises a novel convolutional recurrent neural network-based encoder to obtain a better latent representation of the source text and proposes exploiting keyphrases to guide summary selection in a modified beam search process, thus contributing to a closer semantic relevance between the source text and the generated summary.
Inducing Document Structure for Aspect-based Summarization
TLDR
It is shown that the model can leverage the learnt document structure to produce both abstractive and extractive aspect-based summaries, and that this structure is particularly advantageous for summarizing long documents.
Neural sentence fusion for diversity driven abstractive multi-document summarization
TLDR
This work designs complementary models for two different tasks, sentence clustering and neural sentence fusion, and applies them to implement a full abstractive multi-document summarization system that simultaneously considers importance, coverage, and diversity under a desired length limit.

References

Showing 1–10 of 52 references
Neural Summarization by Extracting Sentences and Words
TLDR
This work develops a general framework for single-document summarization composed of a hierarchical document encoder and an attention-based extractor that allows for different classes of summarization models which can extract sentences or words.
SummaRuNNer: A Recurrent Neural Network Based Sequence Model for Extractive Summarization of Documents
We present SummaRuNNer, a Recurrent Neural Network (RNN) based sequence model for extractive summarization of documents and show that it achieves performance better than or comparable to…
A Neural Attention Model for Abstractive Sentence Summarization
TLDR
This work proposes a fully data-driven approach to abstractive sentence summarization by utilizing a local attention-based model that generates each word of the summary conditioned on the input sentence.
Event-Based Extractive Summarization
TLDR
The experimental results indicate that not only do the event-based features offer an improvement in summary quality over words as features, but that this effect is more pronounced for more sophisticated summarization methods that avoid redundancy in the output.
Enhancing Single-Document Summarization by Combining RankNet and Third-Party Sources
We present a new approach to automatic summarization based on neural nets, called NetSum. We extract a set of features from each sentence that helps identify its importance in the document. We apply…
Abstractive Document Summarization with a Graph-Based Attentional Neural Model
TLDR
A novel graph-based attention mechanism in the sequence-to-sequence framework addresses the saliency factor of summarization, which has been overlooked by prior work, and is competitive with state-of-the-art extractive methods.
Extractive Summarization using Continuous Vector Space Models
TLDR
This paper proposes continuous vector representations as semantically aware representations of sentences and as a basis for measuring similarity, and evaluates different compositions for sentence representation on a standard dataset using the ROUGE evaluation measures.
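As a small illustration of the idea in the entry above (continuous sentence representations as a basis for similarity), the sketch below mean-pools word vectors into sentence vectors and compares them with cosine similarity; the random embedding table and all names are placeholders, not the compositions evaluated in the paper.

```python
# Minimal illustration of sentence similarity in a continuous vector space:
# each sentence is the mean of its word vectors, and similarity is the cosine
# between sentence vectors. The embedding table here is random, so the scores
# only reflect shared words; with trained embeddings, semantically related
# sentences would score higher.
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 50
embeddings = {}  # word -> vector, filled lazily with random vectors


def word_vector(word):
    if word not in embeddings:
        embeddings[word] = rng.normal(size=EMBED_DIM)
    return embeddings[word]


def sentence_vector(sentence):
    """Mean-pooled word vectors: a simple composition into a sentence vector."""
    return np.mean([word_vector(w) for w in sentence.lower().split()], axis=0)


def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))


if __name__ == "__main__":
    a = sentence_vector("the market fell sharply on monday")
    b = sentence_vector("stocks dropped sharply early in the week")
    c = sentence_vector("the recipe calls for two eggs")
    print("cosine(a, b):", round(cosine(a, b), 3))
    print("cosine(a, c):", round(cosine(a, c), 3))
```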
Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
TLDR
This work proposes several novel models that address critical problems in summarization that are not adequately modeled by the basic architecture, such as modeling keywords, capturing the hierarchy of sentence-to-word structure, and emitting words that are rare or unseen at training time.
Optimizing Sentence Modeling and Selection for Document Summarization
TLDR
This paper attempts to build a strong summarizer, DivSelect+CNNLM, by presenting new algorithms to optimize sentence modeling and sentence selection; it proposes CNNLM, a novel neural network language model (NNLM) based on a convolutional neural network (CNN), to project sentences into dense distributed representations, and then models sentence redundancy by cosine similarity.
Classify or Select: Neural Architectures for Extractive Document Summarization
TLDR
Two novel and contrasting Recurrent Neural Network (RNN) based architectures for extractive summarization of documents are presented, and the models under both architectures jointly capture the notions of salience and redundancy of sentences.