You Impress Me: Dialogue Generation via Mutual Persona Perception

Qian Liu, Yihong Chen, B. Chen, Jian-Guang Lou, Zixuan Chen, Bin Zhou, Dongmei Zhang
Despite continuing efforts to improve the engagingness and consistency of chit-chat dialogue systems, the majority of current work simply focuses on mimicking human-like responses, leaving the modeling of understanding between interlocutors understudied. Research in cognitive science, instead, suggests that understanding is an essential signal for a high-quality chit-chat conversation. Motivated by this, we propose P² Bot, a transmitter-receiver based framework with the aim of…


Partner Personas Generation for Dialogue Response Generation

Incorporating persona information allows diverse and engaging responses in dialogue response generation. Unfortunately, prior works have primarily focused on self personas and have overlooked the…

A Personalized Dialogue Generator with Implicit User Persona Detection

This work proposes a novel personalized dialogue generator by detecting an implicit user persona and attempting to model the user’s potential persona and its representation from dialogue history, with no external knowledge.

Towards Building a Personalized Dialogue Generator via Implicit User Persona Detection

This work proposes a novel personalized dialogue generator that detects an implicit user persona, producing responses more attuned to the user's persona, and outperforms baselines in evaluations.

Structural Characterization for Dialogue Disentanglement

This work specifically takes structural factors into account and designs a novel model for dialogue disentanglement that achieves a new state of the art on the Ubuntu IRC benchmark dataset and contributes to dialogue-related comprehension.

Is Your Chatbot Perplexing?: Confident Personalized Conversational Agent for Consistent Chit-Chat Dialogue

A novel framework is built that processes complex data consisting of personalities and utterances and fine-tunes a large-scale self-attention-based language model; within this framework, a consistent personalized conversational agent (CPC-Agent) is proposed to achieve accuracy and consistency.

Generate, Delete and Rewrite: A Three-Stage Framework for Improving Persona Consistency of Dialogue Generation

This work introduces a three-stage framework that employs a generate-delete-rewrite mechanism to delete inconsistent words from a generated response prototype and further rewrite it to a personality-consistent one.

Will I Sound like Me? Improving Persona Consistency in Dialogues through Pragmatic Self-Consciousness

Inspired by social cognition and pragmatics, existing dialogue agents are endowed with public self-consciousness on the fly through an imaginary listener, enforcing that they refrain from uttering contradictions and improving the consistency of existing dialogue models.

Group-wise Contrastive Learning for Neural Dialogue Generation

This work introduces contrastive learning into dialogue generation, where the model explicitly perceives the difference between the well-chosen positive and negative utterances, and augments contrastive dialogue learning with group-wise dual sampling.

Partner Matters! An Empirical Study on Fusing Personas for Personalized Response Selection in Retrieval-Based Chatbots

This paper makes an attempt to thoroughly explore the impact of utilizing personas that describe either self or partner speakers on the task of response selection in retrieval-based chatbots.

BoB: BERT Over BERT for Training Persona-based Dialogue Models from Limited Personalized Data

This work shows how persona-based dialogue generation can be disentangled into two sub-tasks with a novel BERT-over-BERT (BoB) model, and demonstrates that the proposed model outperforms strong baselines in response quality and persona consistency.

Exploiting Persona Information for Diverse Generation of Conversational Responses

Both automatic and human evaluations show that the proposed memory-augmented architecture, which exploits persona information from context and incorporates a conditional variational autoencoder, generates diverse and sustainable conversations.

Personalizing Dialogue Agents via Meta-Learning

This paper proposes to extend Model-Agnostic Meta-Learning (MAML) to personalized dialogue learning without using any persona descriptions, and demonstrates that its model outperforms non-meta-learning baselines using automatic evaluation metrics, and in terms of human-evaluated fluency and consistency.

Personalizing Dialogue Agents: I have a dog, do you have pets too?

This work collects data and trains models to condition on their given profile information, and on information about the person they are talking to, resulting in improved dialogues as measured by next-utterance prediction.

Training Millions of Personalized Dialogue Agents

A new dataset providing 5 million personas and 700 million persona-based dialogues is introduced and it is shown that, at this scale, training using personas still improves the performance of end-to-end systems.

Generating Persona Consistent Dialogues by Exploiting Natural Language Inference

Experimental results on both human and automatic metrics, including the model-based consistency evaluation, demonstrate that the proposed approach outperforms strong generative baselines, especially in the persona-consistency of generated responses.

Deep Reinforcement Learning for Dialogue Generation

This work simulates dialogues between two virtual agents, using policy gradient methods to reward sequences that display three useful conversational properties: informativity (non-repetitive turns), coherence, and ease of answering.

The Second Conversational Intelligence Challenge (ConvAI2)

To improve performance on multi-turn conversations with humans, future systems must go beyond single word metrics like perplexity to measure the performance across sequences of utterances (conversations)—in terms of repetition, consistency and balance of dialogue acts.

TransferTransfo: A Transfer Learning Approach for Neural Network Based Conversational Agents

A new approach to generative data-driven dialogue systems (e.g. chatbots) called TransferTransfo is introduced, combining a transfer-learning-based training scheme with a high-capacity Transformer model; it shows strong improvements over the current state-of-the-art end-to-end conversational models.

An Adversarial Learning Framework For A Persona-Based Multi-Turn Dialogue Model

In this paper, we extend the persona-based sequence-to-sequence (Seq2Seq) neural network conversation model to a multi-turn dialogue scenario by modifying the state-of-the-art hredGAN architecture to…

Aiming to Know You Better Perhaps Makes Me a More Engaging Dialogue Partner

This work defines a quantitative metric for a plausible motivation of a chit-chat dialogue agent, namely discovering information about its interlocutor, and proposes an algorithm for the agent to maximize it.