Modeling Naive Psychology of Characters in Simple Commonsense Stories

@inproceedings{Rashkin2018ModelingNP,
  title={Modeling Naive Psychology of Characters in Simple Commonsense Stories},
  author={Hannah Rashkin and Antoine Bosselut and Maarten Sap and Kevin Knight and Yejin Choi},
  booktitle={ACL},
  year={2018}
}
Understanding a narrative requires reading between the lines and reasoning about the unspoken but obvious implications about events and people's mental states — a capability that is trivial for humans but remarkably hard for machines. To facilitate research addressing this challenge, we introduce a new annotation framework to explain the naive psychology of story characters as fully-specified chains of mental states with respect to motivations and emotional reactions. Our work presents a new large…
Modeling Human Mental States with an Entity-based Narrative Graph
This paper proposes an Entity-based Narrative Graph (ENG) to model the internal states of characters in a story, explicitly modeling entities, their interactions, and the context in which they appear, and learning rich representations for them.
Narrative Theory for Computational Narrative Understanding
Over the past decade, the field of natural language processing has developed a wide array of computational methods for reasoning about narrative, including summarization, commonsense inference, and…
Modeling Human Motives and Emotions from Personal Narratives Using External Knowledge And Entity Tracking
This work proposes a Transformer-based architecture to model characters' motives and emotions from personal narratives, and shows that the learned mental state embeddings can be applied in downstream tasks such as empathetic response generation.
Cosmos QA: Machine Reading Comprehension with Contextual Commonsense Reasoning
This paper introduces Cosmos QA, a large-scale dataset of 35,600 problems that require commonsense-based reading comprehension, formulated as multiple-choice questions, and proposes a new architecture that improves over competitive baselines.
Story Generation Using Knowledge Graph under Psychological States
A Knowledge-Aware Generation framework under Controllable CondItions (K-GuCCI) assigns a line of changing psychological states to story characters, so that the story develops following that setting, and incorporates a knowledge graph into the model to improve the story's coherence.
A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
A knowledge-enhanced pretraining model that utilizes commonsense knowledge from external knowledge bases to generate stories that are more reasonable than those of state-of-the-art baselines, particularly in terms of logic and global coherence.
Unsupervised Deep Structured Semantic Models for Commonsense Reasoning
Two neural network models based on the Deep Structured Semantic Models (DSSM) framework are proposed to tackle two classic commonsense reasoning tasks: the Winograd Schema Challenge (WSC) and Pronoun Disambiguation Problems (PDP).
Story Generation with Commonsense Knowledge Graphs and Axioms
  • 2021
Humans can understand stories, and the rich interactions between agents, locations, and events, seamlessly. However, state-of-the-art reasoning models struggle with understanding, completing, or…
How Commonsense Knowledge Helps with Natural Language Tasks: A Survey of Recent Resources and Methodologies
  • Yubo Xie, P. Pu
  • Computer Science
  • ArXiv
  • 2021
An overview of commonsense reasoning in natural language processing, which requires a deeper understanding of contexts and usually involves inference over implicit external knowledge.
CHARET: Character-centered Approach to Emotion Tracking in Stories
This research explores how to leverage current state-of-the-art tools to make coherent inferences about the emotional state of a character in a story as events unfold, and proposes a character role-labelling approach to emotion tracking that accounts for the semantics of emotions.

References

Showing 1-10 of 39 references
A Corpus and Cloze Evaluation for Deeper Understanding of Commonsense Stories
A new framework for evaluating story understanding and script learning: the 'Story Cloze Test', which requires a system to choose the correct ending to a four-sentence story, and a new corpus of 50k five-sentence commonsense stories, ROCStories, to enable this evaluation.
LSDSem 2017 Shared Task: The Story Cloze Test
The LSDSem'17 shared task is the Story Cloze Test, a new evaluation for story understanding and script learning. This test provides a system with a four-sentence story and two possible endings, and…
Modeling Reportable Events as Turning Points in Narrative
A change-based model of narrative is presented that tracks changes in formality, affect, and other characteristics over the course of a story; this model is used in distant-supervision and self-training experiments that achieve significant improvements over the baselines at the task of identifying most reportable events (MREs).
PersonaBank: A Corpus of Personal Narratives and Their Story Intention Graphs
We present a new corpus, PersonaBank, consisting of 108 personal stories from weblogs that have been annotated with their Story Intention Graphs, a deep representation of the fabula of a story. We…
Neural Net Models of Open-domain Discourse Coherence
This work describes domain-independent neural models of discourse coherence that are capable of measuring multiple aspects of coherence in existing sentences and can maintain coherence while generating new sentences, marking an initial step in generating coherent texts given discourse contexts.
Controlling Linguistic Style Aspects in Neural Language Generation
The method is based on a conditioned RNN language model, where the desired content as well as the stylistic parameters serve as conditioning contexts, and is successful in generating coherent sentences corresponding to the required linguistic style and content.
Automatically Producing Plot Unit Representations for Narrative Text
This research explores whether current NLP technology can be used to automatically produce plot unit representations for narrative text by creating a system called AESOP, which exploits a variety of existing resources to identify affect states and applies "projection rules" to map the affect states onto the characters in a story.
Globally Coherent Text Generation with Neural Checklist Models
The neural checklist model is presented, a recurrent neural network that models global coherence by storing and updating an agenda of text strings which should be mentioned somewhere in the output, and demonstrates high coherence with greatly improved semantic coverage of the agenda.
Simulating Action Dynamics with Neural Process Networks
This work introduces Neural Process Networks to understand procedural text through (neural) simulation of action dynamics, and complements existing memory architectures with dynamic entity tracking by explicitly modeling actions as state transformers.
Crowdsourcing a Word–Emotion Association Lexicon
It is shown how the combined strength and wisdom of the crowds can be used to generate a large, high-quality, word–emotion and word–polarity association lexicon quickly and inexpensively.