Jointly Extracting and Compressing Documents with Summary State Representations

@article{Mendes2019JointlyEA,
  title={Jointly Extracting and Compressing Documents with Summary State Representations},
  author={Afonso Mendes and Shashi Narayan and Sebasti{\~a}o Miranda and Zita Marinho and Andr{\'e} F. T. Martins and Shay B. Cohen},
  journal={ArXiv},
  year={2019},
  volume={abs/1904.02020}
}
We present a new neural model for text summarization that first extracts sentences from a document and then compresses them. The proposed model offers a balance that sidesteps the difficulties in abstractive methods while generating more concise summaries than extractive methods. In addition, our model dynamically determines the length of the output summary based on the gold summaries it observes during training and does not require length constraints typical to extractive summarization. …
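The extract-then-compress pipeline described in the abstract can be sketched in a few lines. This is a toy illustration only: the keyword-overlap scorer, the stop threshold, and the word-deletion compressor are hypothetical stand-ins for the paper's learned neural components.

```python
def score_sentence(sentence, keywords):
    """Toy salience score: fraction of the sentence's words that are keywords."""
    words = sentence.lower().split()
    return sum(w.strip(".,") in keywords for w in words) / max(len(words), 1)

def summarize(document, keywords, stop_threshold=0.2):
    """Extract salient sentences, then compress each one by word deletion.

    `stop_threshold` stands in for the model's learned decision to stop
    emitting sentences, i.e. the dynamically determined summary length.
    """
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    ranked = sorted(sentences, key=lambda s: score_sentence(s, keywords),
                    reverse=True)
    summary = []
    for sent in ranked:
        if score_sentence(sent, keywords) < stop_threshold:
            break  # dynamic stopping: no fixed sentence budget
        # Compression step: keep keywords and very short function words.
        kept = [w for w in sent.split()
                if w.strip(".,").lower() in keywords or len(w) <= 3]
        summary.append(" ".join(kept))
    return ". ".join(summary)

doc = ("The model extracts sentences. It then compresses them. "
       "Unrelated filler text appears here")
print(summarize(doc, {"model", "extracts", "sentences", "compresses"}))
```

In the actual model both stages are trained jointly end-to-end; the heuristics here merely make the two-stage control flow concrete.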
Neural Extractive Text Summarization with Syntactic Compression
TLDR
This work presents a neural model for single-document summarization based on joint extraction and syntactic compression that outperforms an off-the-shelf compression module; human evaluation shows that the model’s output generally remains grammatical.
On Extractive and Abstractive Neural Document Summarization with Transformer Language Models
TLDR
A simple extractive step is performed before generating a summary, which is then used to condition the transformer language model on relevant information before being tasked with generating a summary.
What is this Article About? Extreme Summarization with Topic-Aware Convolutional Neural Networks
We introduce extreme summarization, a new single-document summarization task which aims at creating a short, one-sentence news summary answering the question “What is the article about?”. We argue …
What is this Article about? Extreme Summarization with Topic-aware Convolutional Neural Networks
TLDR
A novel abstractive model is proposed which is conditioned on the article's topics and based entirely on convolutional neural networks, outperforming an oracle extractive system and state-of-the-art abstractive approaches when evaluated automatically and by humans on the extreme summarization dataset.
EASE: Extractive-Abstractive Summarization End-to-End using the Information Bottleneck Principle
Current abstractive summarization systems outperform their extractive counterparts, but their widespread adoption is inhibited by the inherent lack of interpretability. Extractive summarization …
EASE: Extractive-Abstractive Summarization with Explanations
TLDR
This work presents an explainable summarization system based on the Information Bottleneck principle that is jointly trained for extraction and abstraction in an end-to-end fashion and shows that explanations from this framework are more relevant than simple baselines, without substantially sacrificing the quality of the generated summary.
Multiplex Graph Neural Network for Extractive Text Summarization
TLDR
This work proposes a novel Multiplex Graph Convolutional Network (MultiGCN) to jointly model different types of relationships among sentences and words and proposes a Multi-GraS model for extractive text summarization.
Copy or Rewrite: Hybrid Summarization with Hierarchical Reinforcement Learning
TLDR
This work proposes HySum, a hybrid framework for summarization that can flexibly switch between copying a sentence and rewriting a sentence according to the degree of redundancy, along with an end-to-end reinforcement method that bridges the extraction module and the rewriting module and enhances the cooperation between them.
Neural Extractive Summarization with Hierarchical Attentive Heterogeneous Graph Network
TLDR
This paper proposes HAHSum (shorthand for Hierarchical Attentive Heterogeneous Graph for Text Summarization), which models different levels of information, including words and sentences, and spotlights redundancy dependencies between sentences.
Extractive Summarization as Text Matching
TLDR
This paper formulates the extractive summarization task as a semantic text matching problem, in which a source document and candidate summaries are matched in a semantic space, yielding a semantic matching framework.
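The matching formulation above can be made concrete with a minimal sketch: embed the document and each candidate summary in a shared space and return the closest candidate. Bag-of-words count vectors are an illustrative stand-in for the learned semantic encoders; this is not the cited paper's actual model.

```python
import math
from collections import Counter

def embed(text):
    """Bag-of-words count vector (stand-in for a learned semantic encoder)."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(c * v[w] for w, c in u.items())
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def best_candidate(document, candidates):
    """Return the candidate summary closest to the document in the shared space."""
    doc_vec = embed(document)
    return max(candidates, key=lambda c: cosine(embed(c), doc_vec))

print(best_candidate("cats chase mice in the garden",
                     ["cats chase mice", "dogs bark loudly"]))
```

The key design point the sketch captures is that candidates compete at the summary level, rather than sentences being scored independently.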

References

Showing 1–10 of 66 references
Jointly Learning to Extract and Compress
TLDR
A joint model of sentence extraction and compression for multi-document summarization is presented; its jointly extracted and compressed summaries outperform both unlearned baselines and the authors' learned extraction-only system on both ROUGE and Pyramid, without a drop in judged linguistic quality.
Don’t Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization
TLDR
A novel abstractive model is proposed which is conditioned on the article’s topics and based entirely on convolutional neural networks, outperforming an oracle extractive system and state-of-the-art abstractive approaches when evaluated automatically and by humans.
Classify or Select: Neural Architectures for Extractive Document Summarization
TLDR
Two novel and contrasting Recurrent Neural Network (RNN) based architectures for extractive summarization of documents are presented, and the models under both architectures jointly capture the notions of salience and redundancy of sentences.
Neural Summarization by Extracting Sentences and Words
TLDR
This work develops a general framework for single-document summarization composed of a hierarchical document encoder and an attention-based extractor that allows for different classes of summarization models which can extract sentences or words.
Abstractive Document Summarization with a Graph-Based Attentional Neural Model
TLDR
A novel graph-based attention mechanism in the sequence-to-sequence framework is proposed to address the saliency factor of summarization, which has been overlooked by prior works; the resulting model is competitive with state-of-the-art extractive methods.
Bottom-Up Abstractive Summarization
TLDR
This work explores the use of data-efficient content selectors to over-determine phrases in a source document that should be part of the summary, and shows that this approach improves the ability to compress text, while still generating fluent summaries.
Fast Abstractive Summarization with Reinforce-Selected Sentence Rewriting
TLDR
An accurate and fast summarization model that first selects salient sentences and then rewrites them abstractively to generate a concise overall summary is proposed, which achieves the new state-of-the-art on all metrics on the CNN/Daily Mail dataset, as well as significantly higher abstractiveness scores.
Neural Latent Extractive Document Summarization
TLDR
A latent variable extractive model, where sentences are viewed as latent variables and sentences with activated variables are used to infer gold summaries, which outperforms a strong extractive baseline trained on rule-based labels and performs competitively with several recent models.
A Deep Reinforced Model for Abstractive Summarization
TLDR
A neural network model with a novel intra-attention that attends over the input and continuously generated output separately, and a new training method that combines standard supervised word prediction and reinforcement learning (RL) that produces higher quality summaries.
Ranking Sentences for Extractive Summarization with Reinforcement Learning
TLDR
This paper conceptualize extractive summarization as a sentence ranking task and proposes a novel training algorithm which globally optimizes the ROUGE evaluation metric through a reinforcement learning objective. Expand