Topic-Aware Pointer-Generator Networks for Summarizing Spoken Conversations

@inproceedings{Liu2019TopicAwarePN,
  title={Topic-Aware Pointer-Generator Networks for Summarizing Spoken Conversations},
  author={Zhengyuan Liu and Angela Ng and Sheldon Lee Shao Guang and AiTi Aw and Nancy F. Chen},
  booktitle={2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU)},
  year={2019},
  pages={814--821}
}
Due to the lack of publicly available resources, conversation summarization has received far less attention than text summarization. As the purpose of conversations is to exchange information between at least two interlocutors, key information about a certain topic is often scattered and spanned across multiple utterances and turns from different speakers. This phenomenon is more pronounced during spoken conversations, where speech characteristics such as backchanneling and false-starts might…
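The pointer-generator backbone named in the title mixes a generator distribution over a fixed vocabulary with a copy distribution given by the encoder attention over source tokens. A minimal NumPy sketch of that mixture (function name and shapes are illustrative, not taken from the paper's code):

```python
import numpy as np

def pointer_generator_dist(p_vocab, attention, src_ids, p_gen):
    """Final word distribution of a pointer-generator network (See et al., 2017):
    P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention mass on source
    positions where w occurs. Illustrative sketch, not the paper's implementation.

    p_vocab   -- (V,) generator softmax over the vocabulary, sums to 1
    attention -- (L,) attention weights over source positions, sums to 1
    src_ids   -- (L,) vocabulary id of each source token
    p_gen     -- scalar in [0, 1], the generation probability
    """
    final = p_gen * p_vocab                 # generator contribution
    # Copy contribution: scatter-add attention mass onto each source token's id
    # (np.add.at accumulates correctly when an id repeats in src_ids).
    np.add.at(final, src_ids, (1.0 - p_gen) * attention)
    return final
```

Setting p_gen near 1 favors generating from the vocabulary; near 0 it favors copying source tokens, which helps reproduce rare names and numbers from the transcript verbatim.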
Citations

Multi-View Sequence-to-Sequence Models with Conversational Structure for Abstractive Dialogue Summarization
TLDR: This work proposes a multi-view sequence-to-sequence model by first extracting conversational structures of unstructured daily chats from different views to represent conversations, then utilizing a multi-view decoder to incorporate different views to generate dialogue summaries.
Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization
  • Junpeng Liu, Yanyan Zou, +4 authors Xiaojie Wang
  • Computer Science
  • ArXiv
  • 2021
TLDR: This work proposes two topic-aware contrastive learning objectives, namely coherence detection and sub-summary generation objectives, which are expected to implicitly model the topic change and handle information scattering challenges for the dialogue summarization task.
Improving Abstractive Dialogue Summarization with Graph Structures and Topic Words
TLDR: A Topic-word Guided Dialogue Graph Attention (TGDGA) network is proposed to model the dialogue as an interaction graph according to topic word information; a masked graph self-attention mechanism integrates cross-sentence information flows and focuses on related utterances, leading to better dialogue understanding.
Dialogue Discourse-Aware Graph Convolutional Networks for Abstractive Meeting Summarization
TLDR: A Dialogue Discourse-Aware Graph Convolutional Network (DDA-GCN) for meeting summarization is developed by utilizing dialogue discourse, a dialogue-specific structure that provides pre-defined semantic relationships between utterances.
ConvoSumm: Conversation Summarization Benchmark and Improved Abstractive Summarization with Argument Mining
TLDR: Annotation protocols motivated by an issues–viewpoints–assertions framework are designed to crowdsource four new datasets on diverse online conversation forms (news comments, discussion forums, community question answering forums, and email threads); state-of-the-art models are benchmarked on these datasets and characteristics of the data analyzed.
Coreference-Aware Dialogue Summarization
TLDR: Experimental results show that the proposed approaches achieve state-of-the-art performance, implying it is useful to utilize coreference information in dialogue summarization; evaluation of factual correctness suggests such coreference-aware models are better at tracing the information flow among interlocutors and associating accurate statuses/actions with the corresponding interlocutors and person mentions.
Improving Detection and Categorization of Task-relevant Utterances through Integration of Discourse Structure and Ontological Knowledge
TLDR: This paper proposes the novel modeling approach MedFilter, which leverages these insights to improve performance at identifying and categorizing task-relevant utterances and, in so doing, positively impacts performance on a downstream information extraction task.
End-to-End Abstractive Summarization for Meetings
TLDR: A novel end-to-end abstractive summarization network that adapts to the meeting scenario is proposed, with a role vector to depict differences among speakers and a hierarchical structure to accommodate long meeting transcripts.
Dynamic Sliding Window for Meeting Summarization
  • Zhengyuan Liu, Nancy F. Chen
  • Computer Science
  • ArXiv
  • 2021
TLDR: This work first analyzes the linguistic characteristics of meeting transcripts on a representative corpus, finds that the sentences comprising the summary correlate with the meeting agenda, and accordingly proposes a dynamic sliding window strategy for meeting summarization.
MedFilter: Extracting Task-relevant Utterances from Medical Dialogue through Integration of Discourse Structure and Ontological Knowledge
TLDR: This paper proposes the novel modeling approach MedFilter, which leverages these insights to improve performance at identifying and categorizing task-relevant utterances and, in so doing, positively impacts performance on a downstream information extraction task.

References

Showing 1–10 of 39 references
A Hierarchical Neural Summarization Framework for Spoken Documents
TLDR: A neural summarization framework with the flexibility to incorporate extra acoustic/prosodic and lexical features is proposed, in which the ROUGE evaluation metric is embedded into the training objective function and optimized with reinforcement learning.
Abstractive Dialogue Summarization with Sentence-Gated Modeling Optimized by Dialogue Acts
TLDR: The proposed model significantly improves the abstractive summarization performance compared to the state-of-the-art baselines on the AMI meeting corpus, demonstrating the usefulness of the interactive signal provided by dialogue acts.
Improving lecture speech summarization using rhetorical information
TLDR: It is shown that, despite a 29.7% character error rate in speech recognition, extractive summarization performs relatively well, underlining that spontaneity in lecture speech does not obscure its central meaning.
Abstractive Sentence Summarization with Attentive Recurrent Neural Networks
TLDR: A conditional recurrent neural network (RNN) that generates a summary of an input sentence is proposed, significantly outperforming the recently proposed state-of-the-art method on the Gigaword corpus while performing competitively on the DUC-2004 shared task.
Bootstrapping a Neural Conversational Agent with Dialogue Self-Play, Crowdsourcing and On-Line Reinforcement Learning
TLDR: This paper discusses the advantages of this approach for industry applications of conversational agents, wherein an agent can be rapidly bootstrapped for deployment in front of users and further optimized via interactive learning from actual users of the system.
Get To The Point: Summarization with Pointer-Generator Networks
TLDR: A novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways: a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information while retaining the ability to produce novel words through the generator, and a coverage mechanism that discourages repetition.
Improving supervised learning for meeting summarization using sampling and regression
TLDR: This paper reframes the extractive summarization task as regression instead of binary classification, evaluates the approach on the ICSI meeting corpus using both human transcripts and speech recognition output, and shows performance improvements from different sampling methods and the regression model.
Bottom-Up Abstractive Summarization
TLDR: This work explores the use of data-efficient content selectors to over-determine phrases in a source document that should be part of the summary, and shows that this approach improves the ability to compress text while still generating fluent summaries.
Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
TLDR: This work proposes several novel models that address critical problems in summarization not adequately handled by the basic architecture, such as modeling keywords, capturing the hierarchy of sentence-to-word structure, and emitting words that are rare or unseen at training time.
A Deep Reinforced Model for Abstractive Summarization
TLDR: A neural network model with a novel intra-attention that attends over the input and the continuously generated output separately, and a new training method that combines standard supervised word prediction and reinforcement learning (RL), produces higher quality summaries.