Corpus ID: 13748058

Towards a Neural Network Approach to Abstractive Multi-Document Summarization

@article{Zhang2018TowardsAN,
  title={Towards a Neural Network Approach to Abstractive Multi-Document Summarization},
  author={Jianmin Zhang and Jiwei Tan and Xiaojun Wan},
  journal={ArXiv},
  year={2018},
  volume={abs/1804.09010}
}
Till now, neural abstractive summarization methods have achieved great success for single-document summarization (SDS). [...] Key Method: Our approach only makes use of a small number of multi-document summaries for fine-tuning. Experimental results on two benchmark DUC datasets demonstrate that our approach can outperform a variety of baseline neural models.
Neural sentence fusion for diversity driven abstractive multi-document summarization
TLDR
This work designs complementary models for two tasks, sentence clustering and neural sentence fusion, and applies them to implement a full abstractive multi-document summarization system which simultaneously considers importance, coverage, and diversity under a desired length limit.
Abstractive Multi-Document Summarization via Joint Learning with Single-Document Summarization
TLDR
A unified model for single-document and multi-document summarizations is built by fully sharing the encoder and decoder and utilizing a decoding controller to aggregate the decoder’s outputs for multiple input documents.
Multi-News: A Large-Scale Multi-Document Summarization Dataset and Abstractive Hierarchical Model
TLDR
This work introduces Multi-News, the first large-scale MDS news dataset, and proposes an end-to-end model which incorporates a traditional extractive summarization model with a standard SDS model and achieves competitive results on MDS datasets.
Leveraging Graph to Improve Abstractive Multi-Document Summarization
TLDR
A neural abstractive multi-document summarization (MDS) model is developed that leverages well-known graph representations of documents to more effectively process multiple input documents and produce abstractive summaries.
Multi-Granularity Interaction Network for Extractive and Abstractive Multi-Document Summarization
TLDR
This paper employs attention mechanisms to interact between different granularity of semantic representations, which helps to capture multi-granularity key information and improves the performance of both abstractive and extractive summarization.
Massive Multi-Document Summarization of Product Reviews with Weak Supervision
TLDR
This work proposes a schema for summarizing a massive set of reviews on top of a standard summarization algorithm and shows that an initial implementation of the schema significantly improves over several baselines in ROUGE scores, and exhibits strong coherence in a manual linguistic quality assessment.
Adapting the Neural Encoder-Decoder Framework from Single to Multi-Document Summarization
TLDR
An initial investigation into a novel adaptation method that exploits the maximal marginal relevance (MMR) method to select representative sentences from multi-document input, and leverages an abstractive encoder-decoder model to fuse disparate sentences into an abstractive summary.
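The maximal marginal relevance selection step mentioned above can be sketched as follows. This is a minimal, generic MMR implementation, not the paper's code; the scoring functions and parameter names (`relevance`, `similarity`, `lam`) are illustrative assumptions.

```python
def mmr_select(sentences, relevance, similarity, k, lam=0.7):
    """Greedily pick k sentences, balancing salience against redundancy.

    relevance[i]     -- salience score of sentence i
    similarity[i][j] -- redundancy between sentences i and j
    lam              -- trade-off: high lam favors relevance, low lam diversity
    """
    selected = []
    candidates = list(range(len(sentences)))
    while candidates and len(selected) < k:
        def mmr_score(i):
            # Penalize by the most similar already-selected sentence.
            redundancy = max((similarity[i][j] for j in selected), default=0.0)
            return lam * relevance[i] - (1 - lam) * redundancy
        best = max(candidates, key=mmr_score)
        selected.append(best)
        candidates.remove(best)
    return [sentences[i] for i in selected]
```

With a low `lam`, a highly relevant but redundant sentence is skipped in favor of a novel one, which is the diversity behavior MMR is used for here.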
Entity-Aware Abstractive Multi-Document Summarization
  • Hao Zhou, Weidong Ren, Gongshen Liu, Bo Su, Wei Lu
  • Computer Science
  • FINDINGS
  • 2021
Entities and their mentions convey significant semantic information in documents. In multi-document summarization, the same entity may appear across different documents. Capturing such cross-document [...]
Document-Based Question Answering Improves Query-Focused Multi-document Summarization
TLDR
A novel adaptation method to improve QMDS by using the relatively large datasets from DQA, which consists of a sentence encoder, a query filter and a document encoder that can model the sentence salience and query relevance well.
Abstractive Multi-Document Summarization Based on Semantic Link Network
  • Wei Li, H. Zhuge
  • Computer Science
  • IEEE Transactions on Knowledge and Data Engineering
  • 2021
TLDR
Experiments on benchmark datasets show that the proposed summarization approach significantly outperforms relevant state-of-the-art baselines and the Semantic Link Network plays an important role in representing and understanding documents.

References

Showing 1–10 of 34 references
Abstractive Document Summarization with a Graph-Based Attentional Neural Model
TLDR
A novel graph-based attention mechanism in the sequence-to-sequence framework addresses the saliency factor of summarization, which has been overlooked by prior works, and is competitive with state-of-the-art extractive methods.
Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
TLDR
This work proposes several novel models that address critical problems in summarization that are not adequately modeled by the basic architecture, such as modeling key-words, capturing the hierarchy of sentence-to-word structure, and emitting words that are rare or unseen at training time.
A Neural Attention Model for Abstractive Sentence Summarization
TLDR
This work proposes a fully data-driven approach to abstractive sentence summarization by utilizing a local attention-based model that generates each word of the summary conditioned on the input sentence.
A Deep Reinforced Model for Abstractive Summarization
TLDR
A neural network model with a novel intra-attention that attends over the input and continuously generated output separately, and a new training method combining standard supervised word prediction and reinforcement learning (RL), which produces higher-quality summaries.
Get To The Point: Summarization with Pointer-Generator Networks
TLDR
A novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways, using a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator.
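The copy mechanism described above mixes two distributions: with probability `p_gen` a word is generated from the vocabulary, otherwise it is copied from the source according to the attention weights. A minimal sketch of that mixture (variable names and the extended-vocabulary indexing are illustrative assumptions, not the paper's code):

```python
import numpy as np

def final_distribution(p_gen, p_vocab, attention, source_ids, ext_vocab_size):
    """Mix the generation and copy distributions of a pointer-generator.

    p_gen          -- scalar in [0, 1]: probability of generating vs copying
    p_vocab        -- distribution over the fixed vocabulary
    attention      -- attention weights over source positions (sums to 1)
    source_ids     -- extended-vocabulary id of each source token
    ext_vocab_size -- fixed vocabulary plus out-of-vocabulary source words
    """
    dist = np.zeros(ext_vocab_size)
    dist[: len(p_vocab)] = p_gen * p_vocab        # generation path
    for pos, token_id in enumerate(source_ids):   # copy path via attention
        dist[token_id] += (1 - p_gen) * attention[pos]
    return dist
```

Because both inputs are probability distributions and the mixture weights sum to one, the result remains a valid distribution, and out-of-vocabulary source words receive mass only through the copy path.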
Using Supervised Bigram-based ILP for Extractive Summarization
TLDR
A bigram-based supervised method for extractive document summarization in the integer linear programming (ILP) framework that consistently outperforms the previous ILP method on different TAC data sets, and performs competitively compared to the best results in the TAC evaluations.
Query Focused Abstractive Summarization: Incorporating Query Relevance, Multi-Document Coverage, and Summary Length Constraints into seq2seq Models
TLDR
The method (Relevance Sensitive Attention for QFS) is compared to extractive baselines and to various ways of combining abstractive models on the DUC QFS datasets, with solid improvements in ROUGE performance.
Recent advances in document summarization
TLDR
Significant contributions made in recent years are emphasized, including progress on modern sentence extraction approaches that improve concept coverage, information diversity and content coherence, as well as attempts from summarization frameworks that integrate sentence compression, and more abstractive systems that are able to produce completely new sentences.
Abstractive Sentence Summarization with Attentive Recurrent Neural Networks
TLDR
A conditional recurrent neural network (RNN) which generates a summary of an input sentence, which significantly outperforms the recently proposed state-of-the-art method on the Gigaword corpus while performing competitively on the DUC-2004 shared task.
Towards Multidocument Summarization by Reformulation: Progress and Prospects
TLDR
The evaluation of system components shows that learning over multiple extracted linguistic features is more effective than information retrieval approaches at identifying similar text units for summarization, and that it is possible to generate a fluent summary that conveys similarities among documents even when full semantic interpretations of the input text are not available.