Corpus ID: 8244856

AttSum: Joint Learning of Focusing and Summarization with Neural Attention

@inproceedings{Cao2016AttSumJL,
  title={AttSum: Joint Learning of Focusing and Summarization with Neural Attention},
  author={Ziqiang Cao and Wenjie Li and Sujian Li and Furu Wei and Yanran Li},
  booktitle={COLING},
  year={2016}
}
Query relevance ranking and sentence saliency ranking are the two main tasks in extractive query-focused summarization. Previous supervised summarization systems often perform the two tasks in isolation. However, since reference summaries are the trade-off between relevance and saliency, using them as supervision, neither of the two rankers could be trained well. This paper proposes a novel summarization system called AttSum, which tackles the two tasks jointly. It automatically learns…
Citations

Document-Based Question Answering Improves Query-Focused Multi-document Summarization
TLDR: A novel adaptation method to improve QMDS by using the relatively large datasets from DQA, which consists of a sentence encoder, a query filter and a document encoder that can model the sentence salience and query relevance well.
Query-Focused Summarization Enhanced with Sentence Attention Mechanism
TLDR: This paper proposes a method to generate a summary by introducing a sentence-unit vector in addition to the word-unit vector of the original document, and aims to generate summaries that consider the importance of each sentence unit and the relations between sentences by learning both a word-unit and a sentence-unit attention mechanism.
Jointly Learning Topics in Sentence Embedding for Document Summarization
TLDR: A novel sentence embedding framework that combines sentence representations, word-based content, and topic assignments to predict the representation of the next sentence and further considers the associations between neighboring sentences is developed.
A Comparative Study of Deep Learning Approaches for Query-Focused Extractive Multi-Document Summarization
  • Yuliska, Tetsuya Sakai
  • Computer Science
  • 2019 IEEE 2nd International Conference on Information and Computer Technologies (ICICT)
  • 2019
TLDR: This study is the first to compare deep learning techniques on extractive query-focused multi-document summarization and shows that Bi-LSTM with Max-pooling achieves the highest performance among the methods compared.
Long-Span Language Models for Query-Focused Unsupervised Extractive Text Summarization
TLDR: Intrinsic and extrinsic experiments show that the long-span models, applied in an integer linear programming (ILP) formulation of the MMR criterion, are the most effective against several state-of-the-art baseline methods from the literature.
Neural Related Work Summarization with a Joint Context-driven Attention Mechanism
TLDR: A neural data-driven summarizer is developed by leveraging the seq2seq paradigm, in which a joint context-driven attention mechanism is proposed to measure the contextual relevance within full texts and a heterogeneous bibliography graph simultaneously.
Conditional Self-Attention for Query-based Summarization
TLDR: Experiments show CSA consistently outperforms vanilla Transformer and previous models for the Qsumm problem, and variants of CSA defined by different types of attention are studied.
Transforming Wikipedia into Augmented Data for Query-Focused Summarization
TLDR: This paper uses Wikipedia to automatically collect a large query-focused summarization dataset (named WIKIREF) of more than 280,000 examples, and develops a query-focused summarization model based on BERT to extract summaries from the documents.
Unsupervised query-focused multi-document summarization based on transfer learning from sentence embedding models, BM25 model, and maximal marginal relevance criterion
TLDR: This paper proposes to leverage transfer learning from pre-trained sentence embedding models to represent documents’ sentences and users’ queries using embedding vectors that capture the semantic and the syntactic relationships between their constituents (words, phrases).
Abstractive Summarization Improved by WordNet-Based Extractive Sentences
TLDR: A dual-attentional seq2seq framework to generate summaries with consideration of the extracted information, which combines pointer-generator and coverage mechanisms to solve the problems of out-of-vocabulary words and duplicate words that exist in abstractive models.

References

Showing 1-10 of 45 references.
Optimizing Sentence Modeling and Selection for Document Summarization
TLDR: This paper builds a strong summarizer, DivSelect+CNNLM, by presenting new algorithms to optimize sentence modeling and selection, and proposes CNNLM, a novel neural network language model (NNLM) based on convolutional neural networks (CNN), to project sentences into dense distributed representations, then models sentence redundancy by cosine similarity.
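The cosine-similarity redundancy check mentioned in this entry can be sketched in a few lines; the vectors below are illustrative toy values, not the learned representations from the paper:

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense sentence vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0 or norm_v == 0:
        return 0.0
    return dot / (norm_u * norm_v)

# Two sentences with near-identical vectors score close to 1 and
# would be treated as redundant.
print(cosine([0.9, 0.1, 0.0], [0.8, 0.2, 0.0]))
```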
A Neural Attention Model for Abstractive Sentence Summarization
TLDR: This work proposes a fully data-driven approach to abstractive sentence summarization by utilizing a local attention-based model that generates each word of the summary conditioned on the input sentence.
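The attention weighting at the heart of such models can be illustrated generically. This is a minimal sketch of softmax attention pooling, not the paper's architecture; the scores and value vectors are invented for illustration:

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of attention scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(scores, values):
    # Weighted sum of value vectors under softmax attention weights.
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[d] for w, v in zip(weights, values))
            for d in range(dim)]

# Equal scores give a plain average of the two value vectors.
print(attend([0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]))
```

A decoder of this kind recomputes the scores at every output step, so each generated word attends to a different mix of input positions.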
Ranking with Recursive Neural Networks and Its Application to Multi-Document Summarization
We develop a Ranking framework upon Recursive Neural Networks (R2N2) to rank sentences for multi-document summarization. It formulates the sentence ranking task as a hierarchical regression process…
Learning Summary Prior Representation for Extractive Summarization
TLDR: A novel summary system called PriorSum is developed, which applies the enhanced convolutional neural networks to capture the summary prior features derived from length-variable phrases under a regression framework, concatenated with document-dependent features for sentence ranking.
Neural Summarization by Extracting Sentences and Words
TLDR: This work develops a general framework for single-document summarization composed of a hierarchical document encoder and an attention-based extractor that allows for different classes of summarization models which can extract sentences or words.
Applying regression models to query-focused multi-document summarization
TLDR: This paper applies a different kind of learning model, namely regression models, to query-focused multi-document summarization, and chooses Support Vector Regression to estimate the importance of a sentence in a document set to be summarized through a set of pre-defined features.
Using Supervised Bigram-based ILP for Extractive Summarization
TLDR: A bigram-based supervised method for extractive document summarization in the integer linear programming (ILP) framework that consistently outperforms the previous ILP method on different TAC data sets, and performs competitively compared to the best results in the TAC evaluations.
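The bigram-coverage objective behind such ILP formulations can be illustrated with a brute-force stand-in. Real systems solve an ILP; this exhaustive search only works for tiny toy inputs, and the sentences and bigram weights are invented for illustration:

```python
from itertools import combinations

def bigrams(sentence):
    toks = sentence.split()
    return set(zip(toks, toks[1:]))

def select_sentences(sentences, weights, budget):
    # Choose a subset of sentences within a word budget that maximizes
    # the total weight of covered bigrams (each bigram counted once).
    best, best_score = [], -1.0
    for r in range(len(sentences) + 1):
        for combo in combinations(range(len(sentences)), r):
            if sum(len(sentences[i].split()) for i in combo) > budget:
                continue
            covered = set()
            for i in combo:
                covered |= bigrams(sentences[i])
            score = sum(weights.get(b, 0.0) for b in covered)
            if score > best_score:
                best_score, best = score, list(combo)
    return best
```

Counting each bigram once is what makes the objective reward coverage rather than repetition, which is why the ILP framing discourages redundant sentence picks.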
IIIT Hyderabad at DUC 2007
TLDR: This year a term clustering approach was used to better estimate a sentence prior in the update summarization task, and it was found that the sentence prior’s performance is comparable with the top performing systems.
LCSTS: A Large Scale Chinese Short Text Summarization Dataset
TLDR: A large corpus of Chinese short text summarization data constructed from the Chinese microblogging website Sina Weibo is introduced, and a recurrent neural network is introduced for summary generation, achieving promising results.
The use of MMR, diversity-based reranking for reordering documents and producing summaries
TLDR: This paper presents a method for combining query-relevance with information-novelty in the context of text retrieval and summarization, and preliminary results indicate some benefits for MMR diversity ranking in document retrieval and in single document summarization.
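The MMR criterion, a greedy selection that trades off query relevance against redundancy with already-chosen sentences, can be sketched as follows. The Jaccard token-overlap similarity and the λ trade-off value are illustrative stand-ins for the similarity measures used in practice:

```python
def jaccard(a, b):
    # Token-overlap similarity; a simple stand-in for whatever
    # relevance measure a real MMR system would use.
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def mmr_rank(query, sentences, sim=jaccard, lam=0.5, k=2):
    # Greedily pick k sentences, each maximizing
    # lam * relevance(query) - (1 - lam) * redundancy(selected).
    selected, candidates = [], list(range(len(sentences)))
    while candidates and len(selected) < k:
        def score(i):
            rel = sim(query, sentences[i])
            red = max((sim(sentences[i], sentences[j]) for j in selected),
                      default=0.0)
            return lam * rel - (1 - lam) * red
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

With λ below 1, the second pick is penalized for overlapping the first, so a less relevant but more novel sentence can win; λ = 1 reduces to pure relevance ranking.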