Hybrid MemNet for Extractive Summarization

@inproceedings{Singh2017HybridMF,
  title={Hybrid MemNet for Extractive Summarization},
  author={A. Singh and Manish Gupta and Vasudeva Varma},
  booktitle={Proceedings of the 2017 ACM on Conference on Information and Knowledge Management},
  year={2017}
}
Extractive text summarization has been an extensively researched problem in the field of natural language understanding. While conventional approaches rely mostly on manually compiled features to generate the summary, few attempts have been made at developing data-driven systems for extractive summarization. To this end, we present a fully data-driven end-to-end deep network, which we call Hybrid MemNet, for the single-document summarization task. The network learns the continuous unified …
Unity in Diversity: Learning Distributed Heterogeneous Sentence Representation for Extractive Summarization
This work develops a novel data-driven summarization system called HNet, which exploits the various semantic and compositional aspects latent in a sentence to capture document-independent features.
BoWLer: A neural approach to extractive text summarization
This work presents a simple yet effective approach for extractive summarization of news articles: an encoder-decoder architecture with a simple bag-of-words encoder for sentences, followed by an attention-based decoder for relevant sentence selection.
Iterative Document Representation Learning Towards Summarization with Polishing
In this paper, we introduce Iterative Text Summarization (ITS), an iteration-based model for supervised extractive text summarization, inspired by the observation that it is often necessary for a …
Summarization for LaySumm '20 and LongSumm '20
2020
This paper distinguishes between two types of summaries: a very short summary that captures the essence of the research paper in layman's terms, avoiding overly specific technical jargon, and a much longer, detailed summary aimed at providing specific insights into the various ideas touched upon in the paper.
Summaformers @ LaySumm 20, LongSumm 20
This paper distinguishes between two types of summaries: a very short summary that captures the essence of the research paper in layman's terms, and a much longer, detailed summary aimed at providing specific insights into the various ideas touched upon in the paper.
Abstractive Summarization of Reddit Posts with Multi-level Memory Networks
This work collects the Reddit TIFU dataset, consisting of 120K posts from the online discussion forum Reddit, and proposes a novel abstractive summarization model named Multi-level Memory Networks (MMN), equipped with multi-level memory to store the information of text from different levels of abstraction.
Scientific Document Summarization for LaySumm ’20 and LongSumm ’20
Automatic text summarization has been widely studied as an important task in natural language processing. Traditionally, various feature engineering and machine learning based systems have been …
Fusion of Intrinsic & Extrinsic Sentential Traits for Text Coherence Assessment
A data-driven, end-to-end neural coherence model that captures text coherence by exploiting the semantic and distributional aspects of the sentences in a document, incorporating both document-dependent and document-independent features of a sentence.
Improving Online Forums Summarization via Unifying Hierarchical Attention Networks with Convolutional Neural Networks
This study aims to create an automatic text summarizer for online forums, presenting a framework based on hierarchical attention networks that unifies Bidirectional Long Short-Term Memory (Bi-LSTM) and Convolutional Neural Networks to build sentence and thread representations for forum summarization.
Environment and Speaker Related Emotion Recognition in Conversations
This work obtained near state-of-the-art results in emotion recognition in conversations (ERC) on MELD at a fraction of the usual inference time, simply by incorporating information about the speakers and their environments, obtained via Latent Dirichlet Allocation (LDA), into DialoGPT, without using any additional labels.

References

Showing 1-10 of 23 references
Neural Summarization by Extracting Sentences and Words
This work develops a general framework for single-document summarization composed of a hierarchical document encoder and an attention-based extractor that allows for different classes of summarization models which can extract sentences or words.
SummaRuNNer: A Recurrent Neural Network Based Sequence Model for Extractive Summarization of Documents
We present SummaRuNNer, a Recurrent Neural Network (RNN) based sequence model for extractive summarization of documents, and show that it achieves performance better than or comparable to …
Classify or Select: Neural Architectures for Extractive Document Summarization
Two novel and contrasting Recurrent Neural Network (RNN) based architectures for extractive summarization of documents are presented; the models under both architectures jointly capture the notions of salience and redundancy of sentences.
Event-Based Extractive Summarization
The experimental results indicate not only that event-based features offer an improvement in summary quality over words as features, but that this effect is more pronounced for more sophisticated summarization methods that avoid redundancy in the output.
Optimizing Sentence Modeling and Selection for Document Summarization
This paper attempts to build a strong summarizer, DivSelect+CNNLM, by presenting new algorithms to optimize each of its components. It proposes CNNLM, a novel neural network language model (NNLM) based on convolutional neural networks (CNNs), to project sentences into dense distributed representations, then models sentence redundancy by cosine similarity.
Extractive Summarization using Continuous Vector Space Models
This paper proposes the use of continuous vector representations for semantically aware representations of sentences as a basis for measuring similarity, and evaluates different compositions for sentence representation on a standard dataset using the ROUGE evaluation measures.
A compositional context sensitive multi-document summarizer: exploring the factors that influence summarization
The research shows that a frequency-based summarizer can achieve performance comparable to that of state-of-the-art systems, but only with a good composition function; context sensitivity improves performance and significantly reduces repetition.
Topical Coherence for Graph-based Extractive Summarization
We present an approach for extractive single-document summarization. Our approach is based on a weighted graphical representation of documents obtained by topic modeling. We optimize importance, …
Language Independent Extractive Summarization
We demonstrate TextRank -- a system for unsupervised extractive summarization that relies on the application of iterative graph-based ranking algorithms to graphs encoding the cohesive structure of a …
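The graph-based ranking idea behind TextRank can be sketched as a PageRank-style power iteration over a weighted sentence-similarity graph. This is a simplified illustration with invented similarity weights, not the original system:

```python
def textrank(weights, damping=0.85, iters=50):
    """PageRank-style scores over a weighted sentence-similarity graph.
    weights[i][j] is the (symmetric) similarity between sentences i and j."""
    n = len(weights)
    scores = [1.0 / n] * n
    for _ in range(iters):
        new_scores = []
        for i in range(n):
            rank = 0.0
            for j in range(n):
                if j == i or weights[j][i] == 0:
                    continue
                # Neighbor j spreads its score proportionally to edge weights.
                out_sum = sum(weights[j][k] for k in range(n) if k != j)
                if out_sum:
                    rank += weights[j][i] / out_sum * scores[j]
            new_scores.append((1 - damping) / n + damping * rank)
        scores = new_scores
    return scores

# Toy 3-sentence graph: sentence 0 is strongly similar to both others.
sim = [[0.0, 0.7, 0.5],
       [0.7, 0.0, 0.1],
       [0.5, 0.1, 0.0]]
scores = textrank(sim)
print(max(range(3), key=scores.__getitem__))  # -> 0
```

The highest-scoring sentences are extracted as the summary; because the ranking uses only graph structure, no labeled training data is needed, which is what makes the approach unsupervised and language-independent.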
Automatic Generation of Story Highlights
Experimental results show that the model's output is comparable to human-written highlights in terms of both grammaticality and content.