Commonsense Knowledge Aware Conversation Generation with Graph Attention

@inproceedings{Zhou2018CommonsenseKA,
  title={Commonsense Knowledge Aware Conversation Generation with Graph Attention},
  author={Hao Zhou and Tom Young and Minlie Huang and Haizhou Zhao and Jingfang Xu and Xiaoyan Zhu},
  booktitle={IJCAI},
  year={2018}
}
Commonsense knowledge is vital to many natural language processing tasks. […]

Key Method: Given a user post, the model retrieves relevant knowledge graphs from a knowledge base and then encodes the graphs with a static graph attention mechanism, which augments the semantic information of the post and thus supports better understanding of it.
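A minimal numpy sketch of how such a static graph attention could look: each retrieved triple (head, relation, tail) is scored with the relation acting as a query over a nonlinear mix of its two entities, and the softmax-weighted triples are pooled into one graph vector that augments the post representation. The weight names (W_h, W_r, W_t) and the pooled [head; tail] form loosely follow the paper's description, but the dimensions and initialization below are illustrative assumptions.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def static_graph_attention(heads, rels, tails, W_h, W_r, W_t):
    # heads, rels, tails: (n_triples, d) entity/relation embeddings.
    # Score each triple: the relation queries a nonlinear
    # combination of its head and tail entities.
    scores = np.einsum(
        "nd,nd->n",
        rels @ W_r.T,
        np.tanh(heads @ W_h.T + tails @ W_t.T),
    )
    alpha = softmax(scores)                               # (n_triples,)
    triple_vecs = np.concatenate([heads, tails], axis=1)  # (n_triples, 2d)
    return alpha @ triple_vecs                            # graph vector, (2d,)

# Toy usage: a retrieved graph with 4 triples and 8-dim embeddings.
rng = np.random.default_rng(0)
d = 8
heads, rels, tails = (rng.normal(size=(4, d)) for _ in range(3))
W_h, W_r, W_t = (0.1 * rng.normal(size=(d, d)) for _ in range(3))
print(static_graph_attention(heads, rels, tails, W_h, W_r, W_t).shape)  # (16,)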

Citations

Knowledge Aware Conversation Generation with Explainable Reasoning over Augmented Graphs
TLDR
This work proposes a knowledge-aware chatting machine with three components: an augmented knowledge graph with both triples and texts, a knowledge selector, and a knowledge-aware response generator. It also improves a state-of-the-art reasoning algorithm with machine reading comprehension technology for knowledge selection on the graph.
Knowledge Aware Conversation Generation with Reasoning on Augmented Graph
TLDR
This work proposes a knowledge-aware chatting machine with three components: an augmented knowledge graph containing both factoid and non-factoid knowledge, a knowledge selector, and a response generator. It demonstrates that, supported by such unified knowledge and the knowledge selection method, the system can generate more appropriate and informative responses than baselines.
Language Generation with Multi-hop Reasoning on Commonsense Knowledge Graph
TLDR
This paper proposes Generation with Multi-Hop Reasoning Flow (GRF), which enables pre-trained models to perform dynamic multi-hop reasoning on multi-relational paths extracted from an external commonsense knowledge graph, and empirically shows that the model outperforms existing baselines on three text generation tasks that require reasoning over commonsense knowledge.
Variational Attention for Commonsense Knowledge Aware Conversation Generation
TLDR
A novel commonsense knowledge aware conversation generation model is presented, which adopts variational attention to incorporate commonsense knowledge and generate more appropriate conversations.
Knowledge-aware Dialogue Generation with Hybrid Attention (Student Abstract)
TLDR
A knowledge-aware dialogue generation model that uses hybrid attention to better generate rational entities: graph attention is applied in encoding, and a dynamic graph attention mechanism is used to select knowledge and generate the response.
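For contrast with the static mechanism sketched above, a hedged sketch of a dynamic (hierarchical) graph attention of the kind these models use during decoding: the decoder state first attends over the retrieved graphs, then over the triples inside each graph, and the two levels are combined into one knowledge vector per step. The two-level softmax structure follows CCM-style descriptions; the bilinear scoring and dimensions are illustrative assumptions.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dynamic_graph_attention(dec_state, graph_vecs, triple_vecs, W_g, W_t):
    # dec_state: (h,) decoder hidden state at the current step.
    # graph_vecs: (G, dg) one vector per retrieved graph.
    # triple_vecs: list of G arrays, each (n_i, dt): triples per graph.
    alpha_g = softmax(graph_vecs @ (W_g @ dec_state))   # attention over graphs
    ctx = np.zeros(triple_vecs[0].shape[1])
    for a_g, triples in zip(alpha_g, triple_vecs):
        alpha_t = softmax(triples @ (W_t @ dec_state))  # attention within a graph
        ctx += a_g * (alpha_t @ triples)
    return ctx  # knowledge context vector for this decoding step

# Toy usage: 3 graphs with 4 triples each.
rng = np.random.default_rng(1)
h, dg, dt, G = 6, 5, 5, 3
tv = [rng.normal(size=(4, dt)) for _ in range(G)]
print(dynamic_graph_attention(rng.normal(size=h), rng.normal(size=(G, dg)),
                              tv, rng.normal(size=(dg, h)),
                              rng.normal(size=(dt, h))).shape)  # (5,)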
Diverse and Informative Dialogue Generation with Context-Specific Commonsense Knowledge Awareness
TLDR
A novel commonsense knowledge-aware dialogue generation model, ConKADI, with a Felicitous Fact mechanism to help the model focus on the knowledge facts that are highly relevant to the context; furthermore, two techniques, Context-Knowledge Fusion and Flexible Mode Fusion, are proposed to facilitate the integration of the knowledge in ConKADI.
DyKgChat: Benchmarking Dialogue Generation Grounding on Dynamic Knowledge Graphs
TLDR
A new task of applying dynamic knowledge graphs in neural conversation models is proposed, together with a novel TV series conversation corpus (DyKgChat) for the task, and the proposed approach is shown to outperform previous knowledge-grounded conversation models.
KG-CRuSE: Recurrent Walks over Knowledge Graph for Explainable Conversation Reasoning using Semantic Embeddings
TLDR
KG-CRuSE is proposed, a simple yet effective LSTM-based decoder that utilises the semantic information in the dialogue history and the knowledge graph elements to generate such paths for effective conversation explanation.
Multiple Knowledge Syncretic Transformer for Natural Dialogue Generation
TLDR
A novel universal transformer-based architecture for dialogue systems, the Multiple Knowledge Syncretic Transformer (MKST), which fuses multiple kinds of knowledge in open-domain conversation and achieves significant improvements over state-of-the-art baselines on knowledge-driven dialogue generation tasks.
DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation
TLDR
This paper proposes DialoKG, a novel task-oriented dialogue system that effectively incorporates knowledge into a language model, introducing a structure-aware knowledge embedding technique and a knowledge-graph-weighted attention masking strategy to help the system select relevant information during dialogue generation.

References

Showing 1-10 of 39 references
Incorporating loose-structured knowledge into conversation modeling via recall-gate LSTM
TLDR
A deep neural network is proposed to incorporate loose-structured knowledge as background knowledge for conversation modeling, through a recall mechanism with a specially designed recall-gate, so as to enrich the ability of LSTM to capture the implicit semantic clues in conversations.
Reasoning with Heterogeneous Knowledge for Commonsense Machine Comprehension
TLDR
A multi-knowledge reasoning model is proposed, which selects inference rules for a specific reasoning context using an attention mechanism and reasons by summarizing all valid inference rules.
Topic Aware Neural Response Generation
TLDR
A topic aware sequence-to-sequence (TA-Seq2Seq) model that utilizes topics to simulate the prior knowledge that guides humans to form informative and interesting responses in conversation, and leverages the topic information in generation through a joint attention mechanism and a biased generation probability.
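One simple way to realize such a biased generation probability (an illustrative assumption, not TA-Seq2Seq's exact formulation) is to add extra score mass to topic words before the final softmax:

import numpy as np

def biased_generation_probs(logits, topic_word_ids, bias_logits):
    # logits: (V,) decoder scores over the vocabulary.
    # topic_word_ids: vocabulary indices of the topic words.
    # bias_logits: extra scores nudging the decoder toward topical tokens.
    biased = logits.copy()
    biased[topic_word_ids] += bias_logits
    e = np.exp(biased - biased.max())
    return e / e.sum()

# Toy usage: uniform scores over 5 words, words 2 and 4 are topic words.
probs = biased_generation_probs(np.zeros(5), np.array([2, 4]), np.array([1.0, 1.0]))
print(probs.round(3))  # the topic words receive the largest probabilities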
Exploiting knowledge base to generate responses for natural language dialog listening agents
TLDR
A natural language dialog listening agent is developed that uses a knowledge base (KB) to generate rich and relevant responses and encourage users to continue talking.
A Knowledge Enhanced Generative Conversational Service Agent
TLDR
A knowledge-enhanced sequence-to-sequence framework is designed to model multi-turn dialogs conditioned on external knowledge, so as to generate natural and informative responses for customer-service-oriented dialog.
Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation
TLDR
This paper proposes seq2BF, a “sequence to backward and forward sequences” model, which generates a reply containing the given keyword, and significantly outperforms traditional sequence-to-sequence models in terms of human evaluation and the entropy measure.
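A sketch of the backward-and-forward decoding idea, with backward_step and forward_step as hypothetical stand-ins for the paper's two trained decoders:

def seq2bf_decode(query, keyword, backward_step, forward_step, max_len=20):
    # The reply is grown backward from the given keyword, then completed
    # forward, so the keyword is guaranteed to appear in the output.
    back = []
    while len(back) < max_len:
        tok = backward_step(query, [keyword] + back)
        if tok is None:  # backward decoder emitted end-of-sequence
            break
        back.append(tok)
    reply = list(reversed(back)) + [keyword]
    while len(reply) < 2 * max_len:
        tok = forward_step(query, reply)
        if tok is None:
            break
        reply.append(tok)
    return reply

# Toy stand-ins: the backward decoder stops at once; the forward decoder
# appends a fixed tail. A real model would score tokens with an RNN.
fwd_tokens = iter(["to", "meet", "you", None])
print(seq2bf_decode("hi", "nice",
                    lambda q, ctx: None,
                    lambda q, ctx: next(fwd_tokens)))
# ['nice', 'to', 'meet', 'you']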
Conditional Generative Adversarial Networks for Commonsense Machine Comprehension
TLDR
Conditional GANs (CGANs) are proposed in which the generator is conditioned on the context; their advantage in discriminating sentences is shown to achieve state-of-the-art results on a commonsense story reading comprehension task, compared with previous feature engineering and deep learning methods.
Incorporating Copying Mechanism in Sequence-to-Sequence Learning
TLDR
This paper incorporates copying into neural network-based Seq2Seq learning and proposes a new model called CopyNet with an encoder-decoder structure, which nicely integrates the regular way of word generation in the decoder with a new copying mechanism that can choose sub-sequences in the input sequence and put them at proper places in the output sequence.
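A simplified pointer-style mixture in the spirit of CopyNet (CopyNet itself normalizes generate- and copy-scores jointly; the explicit gate below is a simplifying assumption):

import numpy as np

def copy_mixture(gen_probs, copy_scores, src_token_ids, p_copy_gate):
    # gen_probs: (V,) generation distribution over the vocabulary.
    # copy_scores: (S,) unnormalized attention scores over source tokens.
    # src_token_ids: (S,) vocabulary id of each source token.
    # p_copy_gate: scalar in [0, 1], probability mass given to copying.
    e = np.exp(copy_scores - copy_scores.max())
    copy_probs = e / e.sum()
    mixed = (1.0 - p_copy_gate) * gen_probs
    # Scatter-add the copy mass onto the vocabulary ids of source tokens.
    np.add.at(mixed, src_token_ids, p_copy_gate * copy_probs)
    return mixed

# Toy usage: 6-word vocabulary, source tokens with vocabulary ids 3 and 5.
out = copy_mixture(np.full(6, 1 / 6), np.array([0.5, 2.0]),
                   np.array([3, 5]), p_copy_gate=0.4)
print(out.sum())  # 1.0: still a valid distribution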
A Diversity-Promoting Objective Function for Neural Conversation Models
TLDR
This work proposes using Maximum Mutual Information (MMI) as the objective function in neural models, and demonstrates that the proposed MMI models produce more diverse, interesting, and appropriate responses, yielding substantive gains in BLEU scores on two conversational datasets and in human evaluations.
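For reference, the anti-LM form of the MMI objective from that paper scores a candidate reply T for a source S as

\hat{T} = \arg\max_{T} \{ \log p(T \mid S) - \lambda \log p(T) \},

where the -\lambda \log p(T) term penalizes generic, high-frequency replies.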
Sequence to Sequence Learning with Neural Networks
TLDR
This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences improved the LSTM's performance markedly, because doing so introduced many short term dependencies between the source and the target sentence which made the optimization problem easier.
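The source-reversal trick itself is a one-liner applied only to the source side before encoding; the target side is left untouched:

def reverse_source(src_tokens):
    # "how are you" -> ["you", "are", "how"]
    return list(reversed(src_tokens))

print(reverse_source("how are you".split()))  # ['you', 'are', 'how']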