Corpus ID: 213182690

Temporal Embeddings and Transformer Models for Narrative Text Understanding

Authors: K. Vani, Simone Mellace, Alessandro Antonucci
We present two deep learning approaches to narrative text understanding for character relationship modelling. The temporal evolution of these relations is described by dynamic word embeddings, which are designed to learn semantic changes over time. An empirical analysis of the corresponding character trajectories shows that such approaches are effective in depicting this dynamic evolution. A supervised learning approach based on the state-of-the-art transformer model BERT is used instead to detect… 
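The character-trajectory idea in the abstract can be illustrated with a minimal sketch (not the paper's code): given temporally aligned embeddings, the evolving relation between two characters is read off as the cosine similarity of their vectors in each time slice. The character names and toy vectors below are hypothetical, chosen only to show the mechanics.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def relation_trajectory(slices, a, b):
    """Similarity of two characters' embeddings in each time slice."""
    return [cosine(emb[a], emb[b]) for emb in slices]

# Toy temporally aligned embeddings, one dict per chapter
# (hypothetical characters and 2-d vectors):
slices = [
    {"alice": np.array([1.0, 0.0]), "bob": np.array([1.0, 0.2])},  # close early on
    {"alice": np.array([1.0, 0.0]), "bob": np.array([0.0, 1.0])},  # drifted apart
]
traj = relation_trajectory(slices, "alice", "bob")
```

A falling trajectory (here, from near 1.0 down to 0.0) would indicate the two characters' contexts diverging over the plot; in practice the slices would come from embeddings trained per chapter and aligned to a shared space.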


Relation Clustering in Narrative Knowledge Graphs

Preliminary tests show that such clustering might successfully detect similar relations and provide valuable preprocessing for semi-supervised approaches.

Words with Consistent Diachronic Usage Patterns are Learned Earlier: A Computational Analysis Using Temporally Aligned Word Embeddings

A unique and reliable relation between measures of language change and age of acquisition (AoA) is shown while controlling for frequency, contextual diversity, concreteness, length, dominant part of speech, orthographic neighborhood density, and diachronic frequency variation.

Computational Understanding of Narratives: A Survey

The goal is to document and discuss methods to efficiently construct, extract, and detect evolving online narratives, as well as extensive discussion on open research challenges and goals in the definition, identification, construction, generation, and representation of online narratives.

Report on the third international workshop on narrative extraction from texts (Text2Story 2020)

The Third International Workshop on Narrative Extraction from Texts (Text2Story'20) was held on the 14th of April 2020, in conjunction with the 42nd European Conference on Information Retrieval (ECIR 2020).

Stories from Blogs: Computational Extraction and Visualization of Narratives

A narrative visualization tool is demonstrated that gives an analyst the ability to identify prominent themes and associated fringe narratives; the tool is available for public use through the Blogtrackers application.

Extracting Impact Model Narratives from Social Services' Text

This is the first NER task specifically targeted at social service entities, and it is shown how this approach can be used for the sequencing of services and impacted clients with information extracted from unstructured text.

Temporal Word Embeddings for Narrative Understanding

This work proposes temporal word embeddings as a suitable tool to study the evolution of characters and their sentiments across the plot of a narrative text, and proposes an alternative initialization procedure which seems to be especially suited for the case of narrative text.

NOVEL2GRAPH: Visual Summaries of Narrative Text Enhanced by Machine Learning

A machine learning approach to creating visual summaries of narrative text is presented, allowing for a richer representation of texts of this kind.

Contextualized Word Embeddings Enhanced Event Temporal Relation Extraction for Story Understanding

This work establishes strong baselines for event temporal relation extraction on two under-explored story narrative datasets: Richer Event Description (RED) and Causal and Temporal Relation Scheme (CaTeRS), and demonstrates that neural network-based models can outperform some strong traditional linguistic feature-based models.

Modeling Evolving Relationships Between Characters in Literary Novels

This work proposes a semi-supervised framework to learn relationship sequences from fully as well as partially labeled data, and presents a Markovian model capable of accumulating historical beliefs about the relationship and status changes.

Training Temporal Word Embeddings with a Compass

A new heuristic is proposed to train temporal word embeddings based on the Word2vec model: atemporal vectors are used as a reference, i.e., as a compass, when training the representations specific to a given time interval.
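The compass heuristic can be sketched as follows (an illustrative toy implementation, not the authors' code): the context (output) matrix from an atemporal training run is frozen and shared across time slices, while each slice trains only its input vectors against it, so all slices land in one shared coordinate system. Negative-sampling skip-gram is assumed here; vocabulary indices and hyperparameters are toy values.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_slice(pairs, compass, vocab_size, dim, epochs=20, lr=0.1, neg=3, seed=0):
    """Skip-gram with negative sampling where the context (output) matrix
    is the frozen atemporal 'compass'; only the slice-specific input
    vectors W are updated, so every slice shares one coordinate system."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(vocab_size, dim))  # slice-specific, trained
    C = compass                                        # atemporal, frozen
    for _ in range(epochs):
        for center, context in pairs:
            # positive pair: pull W[center] toward C[context]
            g = sigmoid(W[center] @ C[context]) - 1.0
            W[center] -= lr * g * C[context]
            # negative samples: push W[center] away from random contexts
            for n in rng.integers(0, vocab_size, size=neg):
                if n == context:
                    continue
                g = sigmoid(W[center] @ C[n])
                W[center] -= lr * g * C[n]
    return W

# Toy usage: a frozen compass for a 5-word vocabulary, and one time slice
# whose corpus repeatedly pairs word 0 with word 1.
rng = np.random.default_rng(1)
compass = rng.normal(size=(5, 4))
W_slice = train_slice([(0, 1)] * 10, compass, vocab_size=5, dim=4)
```

Because the compass is never updated, vectors from different slices are directly comparable without a separate post-hoc alignment step, which is the point of the heuristic.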

Story Ending Prediction by Transferable BERT

This study investigates a transferable BERT (TransBERT) training framework, which can transfer not only general language knowledge from large-scale unlabeled data but also specific kinds of knowledge from various semantically related supervised tasks, for a target task.

Deep Contextualized Word Representations

A new type of deep contextualized word representation is introduced that models both complex characteristics of word use and how these uses vary across linguistic contexts, allowing downstream models to mix different types of semi-supervision signals.

Dynamic Word Embeddings

Experimental results on three different corpora demonstrate that the dynamic model infers word embedding trajectories that are more interpretable and lead to higher predictive likelihoods than competing methods that are based on static models trained separately on time slices.

The Actor-Topic Model for Extracting Social Networks in Literary Narrative

We present a generative model for conversational dialogues, namely the actor-topic model (ACTM), that extends the author-topic model (Rosen-Zvi et al., 2004) to identify the actors of a given conversation in… 

Diachronic word embeddings and semantic shifts: a survey

This paper surveys the current state of academic research related to diachronic word embeddings and the detection of semantic shifts, proposes several axes along which these methods can be compared, and outlines the main challenges facing this emerging subfield of NLP.