Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization

Junpeng Liu, Yanyan Zou, Hainan Zhang, Hongshen Chen, Zhuoye Ding, Caixia Yuan, Xiaojie Wang
Unlike well-structured text such as news reports and encyclopedia articles, dialogue content comes from two or more interlocutors exchanging information with each other. In such a scenario, the topic of a conversation can shift as it progresses, and the key information for a given topic is often scattered across multiple utterances from different speakers, which makes abstractive dialogue summarization challenging. To capture the various topic information of a conversation and outline…
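The paper's exact topic-aware objective is not reproduced here, but contrastive fine-tuning of this kind is typically built on an InfoNCE-style loss that pulls an anchor representation toward a positive example and pushes it away from negatives. The sketch below is a minimal, generic illustration in plain Python; the function names, vectors, and temperature value are illustrative assumptions, not the paper's actual formulation.

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def info_nce(anchor, positive, negatives, temperature=0.1):
    # Generic InfoNCE-style contrastive loss (illustrative, not the
    # paper's objective): the anchor should be most similar to the
    # positive among the positive plus all negatives.
    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = [s / temperature for s in sims]
    m = max(logits)  # subtract the max for numerical stability
    denom = sum(math.exp(l - m) for l in logits)
    # Negative log-probability that the positive is selected.
    return -(logits[0] - m - math.log(denom))
```

When the positive is close to the anchor and the negatives are dissimilar, the loss is near zero; swapping the roles of positive and negative makes it large, which is the gradient signal contrastive fine-tuning exploits.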
CONFIT: Toward Faithful Dialogue Summarization with Linguistically-Informed Contrastive Fine-tuning
This work provides a typology of factual errors with annotated data to highlight the types of errors and to move beyond a binary understanding of factuality, and proposes a training strategy, called CONFIT, that improves the factual consistency and overall quality of summaries via a novel contrastive fine-tuning.

Multi-View Sequence-to-Sequence Models with Conversational Structure for Abstractive Dialogue Summarization
This work proposes a multi-view sequence-to-sequence model by first extracting conversational structures of unstructured daily chats from different views to represent conversations and then utilizing a multi-view decoder to incorporate the different views to generate dialogue summaries.
Topic-Aware Pointer-Generator Networks for Summarizing Spoken Conversations
This work proposes a topic-aware architecture to exploit the inherent hierarchical structure in conversations to further adapt the pointer-generator model, which significantly outperforms competitive baselines, achieves more efficient learning outcomes, and attains more robust performance.
Abstractive Dialogue Summarization with Sentence-Gated Modeling Optimized by Dialogue Acts
The proposed model significantly improves the abstractive summarization performance compared to the state-of-the-art baselines on the AMI meeting corpus, demonstrating the usefulness of the interactive signal provided by dialogue acts.
Keep Meeting Summaries on Topic: Abstractive Multi-Modal Meeting Summarization
An abstractive meeting summarizer is developed from both the video and audio of meeting recordings, and it significantly outperforms the state of the art on both BLEU and ROUGE measures.
Abstractive Meeting Summarization via Hierarchical Adaptive Segmental Network Learning
This paper proposes a hierarchical neural encoder based on adaptive recurrent networks to learn semantic representations of meeting conversations with adaptive conversation segmentation, and develops a reinforced decoder network to generate high-quality summaries for abstractive meeting summarization.
End-to-End Abstractive Summarization for Meetings
A novel end-to-end abstractive summary network that adapts to the meeting scenario is proposed, with a role vector to depict the difference among speakers and a hierarchical structure to accommodate long meeting transcripts.
SAMSum Corpus: A Human-annotated Dialogue Dataset for Abstractive Summarization
This study is the first attempt to introduce a high-quality chat-dialogue corpus manually annotated with abstractive summaries, which the research community can use for further studies; the results suggest that the challenging task of abstractive dialogue summarization requires dedicated models and non-standard quality measures.
Automatic Dialogue Summary Generation for Customer Service
This paper introduces auxiliary key-point sequences to address the dialogue summarization problem and proposes a novel Leader-Writer network that outperforms other models not only on BLEU and ROUGE-L scores but also in logic and integrity.
Reading Turn by Turn: Hierarchical Attention Architecture for Spoken Dialogue Comprehension
This work proposes a hierarchical attention neural network architecture, combining turn-level and word-level attention mechanisms, to improve spoken dialogue comprehension performance, and empirically shows that the proposed approach outperforms standard attention baselines, achieves more efficient learning outcomes, and is more robust to lengthy and out-of-distribution test samples.
A Deep Reinforced Model for Abstractive Summarization
A neural network model is presented with a novel intra-attention that attends over the input and the continuously generated output separately, together with a new training method that combines standard supervised word prediction with reinforcement learning (RL) to produce higher-quality summaries.