Transfer Learning for Abstractive Summarization at Controllable Budgets
@article{Sarkhel2020TransferLF,
  title   = {Transfer Learning for Abstractive Summarization at Controllable Budgets},
  author  = {Ritesh Sarkhel and Moniba Keymanesh and Arnab Nandi and Srinivasan Parthasarathy},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2002.07845}
}
Summarizing a document within an allocated budget while preserving its major concepts is a challenging task. It becomes even more difficult if the budget can take any arbitrary value and is not known beforehand. Most existing methods for abstractive summarization, including state-of-the-art neural networks, are data-intensive; if the number of available training samples is limited, they fail to construct high-quality summaries. We propose MLS, an end-to-end framework to generate…
2 Citations
MeetSum: Transforming Meeting Transcript Summarization using Transformers!
- Computer Science · ArXiv
- 2021
A qualitative analysis of the summaries generated by the improved Transformer-based Pointer Generator Network shows that these summaries are human-readable and indeed capture most of the important information from the transcripts.
Toward Domain-Guided Controllable Summarization of Privacy Policies
- Computer Science · NLLP@KDD
- 2020
This work proposes a hybrid approach to identify sections of privacy policies with a high privacy risk factor and incorporates these sections into summaries by selecting the riskiest content from different privacy topics using plain English reference summaries.
References
Showing 1-10 of 39 references
Fast Abstractive Summarization with Reinforce-Selected Sentence Rewriting
- Computer Science · ACL
- 2018
An accurate and fast summarization model that first selects salient sentences and then rewrites them abstractively to generate a concise overall summary is proposed, which achieves the new state-of-the-art on all metrics on the CNN/Daily Mail dataset, as well as significantly higher abstractiveness scores.
Jointly Learning to Extract and Compress
- Computer Science · ACL
- 2011
A joint model of sentence extraction and compression for multi-document summarization and its jointly extracted and compressed summaries outperform both unlearned baselines and the authors' learned extraction-only system on both ROUGE and Pyramid, without a drop in judged linguistic quality.
Abstractive Sentence Summarization with Attentive Recurrent Neural Networks
- Computer Science · NAACL
- 2016
A conditional recurrent neural network (RNN) that generates a summary of an input sentence; it significantly outperforms the recently proposed state-of-the-art method on the Gigaword corpus while performing competitively on the DUC-2004 shared task.
Get To The Point: Summarization with Pointer-Generator Networks
- Computer Science · ACL
- 2017
A novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways, using a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator.
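The copy mechanism described above can be illustrated with a minimal sketch (not the authors' code; function and variable names here are hypothetical): the final word distribution mixes the generator's vocabulary distribution with a copy distribution derived from attention over the source tokens, weighted by a generation probability.

```python
# Illustrative sketch of the pointer-generator mixture: blend a decoder
# vocabulary distribution with a copy distribution from source attention.
# All names are assumptions for illustration, not the paper's API.

def pointer_generator_dist(p_gen, vocab_dist, attention, source_tokens):
    """Mix generation and copying into one distribution over words.

    p_gen         -- scalar in [0, 1]: probability of generating
    vocab_dist    -- dict word -> probability from the decoder softmax
    attention     -- list of attention weights, one per source position
    source_tokens -- list of source words (parallel to `attention`)
    """
    # Generation path: scale the vocabulary distribution by p_gen.
    final = {w: p_gen * p for w, p in vocab_dist.items()}
    # Copy path: route the remaining mass to source words via attention.
    for a, w in zip(attention, source_tokens):
        final[w] = final.get(w, 0.0) + (1.0 - p_gen) * a
    return final
```

Note how out-of-vocabulary source words receive probability mass through the copy path even though the decoder softmax cannot produce them, which is what "aids accurate reproduction of information" in the summary above.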
A Neural Attention Model for Abstractive Sentence Summarization
- Computer Science · EMNLP
- 2015
This work proposes a fully data-driven approach to abstractive sentence summarization by utilizing a local attention-based model that generates each word of the summary conditioned on the input sentence.
Controllable Abstractive Summarization
- Computer Science · NMT@ACL
- 2018
A neural summarization model with a simple but effective mechanism to enable users to specify high level attributes in order to control the shape of the final summaries to better suit their needs.
Controlling Length in Abstractive Summarization Using a Convolutional Neural Network
- Computer Science · EMNLP
- 2018
This paper proposes an approach to constrain summary length by extending a convolutional sequence-to-sequence model, and shows that the approach generates high-quality summaries of user-defined length, consistently outperforming the baselines in terms of ROUGE score, length variation, and semantic similarity.
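One common way to condition a decoder on a length budget is to embed the requested length and add it to every decoder input embedding; the sketch below shows that idea in miniature. This is an assumption about the general technique, not necessarily the exact variant used in the paper above, and all names are hypothetical.

```python
# Minimal sketch of length conditioning for a budget-aware decoder:
# add an embedding of the desired output length to each decoder input
# embedding, so the model can learn to finish near the budget.
# Names and shapes are illustrative assumptions.

def condition_on_length(token_embeddings, length_embedding):
    """Add a length embedding to every decoder input embedding.

    token_embeddings -- list of vectors (lists of floats), one per token
    length_embedding -- one vector encoding the desired summary length
    """
    return [[t + l for t, l in zip(tok, length_embedding)]
            for tok in token_embeddings]
```

At inference time, varying the length embedding steers the same trained model toward shorter or longer outputs without retraining.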
Statistics-Based Summarization - Step One: Sentence Compression
- Computer Science · AAAI/IAAI
- 2000
This paper focuses on sentence compression, a simpler version of this larger challenge, and aims to achieve two goals simultaneously: the compressions should be grammatical, and they should retain the most important pieces of information.
Recent advances in document summarization
- Computer Science · Knowledge and Information Systems
- 2017
Significant contributions made in recent years are emphasized, including progress on modern sentence extraction approaches that improve concept coverage, information diversity and content coherence, as well as attempts from summarization frameworks that integrate sentence compression, and more abstractive systems that are able to produce completely new sentences.
Language Modeling with Gated Convolutional Networks
- Computer Science · ICML
- 2017
A finite-context approach through stacked convolutions is developed, which can be more efficient than recurrence since it allows parallelization over sequential tokens; this is the first time a non-recurrent approach is competitive with strong recurrent models on these large-scale language tasks.
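The core building block of that convolutional language model is the gated linear unit (GLU), where the layer output is an elementwise product of a linear transform and a sigmoid gate, h = (XW + b) ⊙ σ(XV + c). Below is a minimal single-vector sketch under assumed shapes, not the paper's implementation.

```python
import math

# Sketch of a gated linear unit (GLU) applied to one input vector:
# output = (x·W + b) * sigmoid(x·V + c), elementwise.
# Weight layouts and names are illustrative assumptions.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def glu(x, w, b, v, c):
    """Apply a gated linear unit to one input vector.

    x    -- input vector (list of floats)
    w, v -- weight matrices as lists of rows (output_dim x input_dim)
    b, c -- bias vectors for the linear path and the gate path
    """
    # Linear path: plain affine transform of the input.
    linear = [sum(wi * xi for wi, xi in zip(row, x)) + bi
              for row, bi in zip(w, b)]
    # Gate path: a second affine transform squashed to (0, 1).
    gate = [sigmoid(sum(vi * xi for vi, xi in zip(row, x)) + ci)
            for row, ci in zip(v, c)]
    # The gate modulates how much of each linear output passes through.
    return [l * g for l, g in zip(linear, gate)]
```

Because the gate is multiplicative rather than recurrent, a whole sequence of such units can be evaluated in parallel across positions, which is the efficiency advantage noted in the summary above.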