Corpus ID: 135465922

Contextualized Word Embeddings Enhanced Event Temporal Relation Extraction for Story Understanding

@article{Han2019ContextualizedWE,
  title={Contextualized Word Embeddings Enhanced Event Temporal Relation Extraction for Story Understanding},
  author={Rujun Han and Mengyue Liang and Bashar Alhafni and Nanyun Peng},
  journal={ArXiv},
  year={2019},
  volume={abs/1904.11942}
}
Learning causal and temporal relationships between events is an important step towards deeper story and commonsense understanding. Though there are abundant datasets annotated with event relations for story comprehension, many have no empirical results associated with them. In this work, we establish strong baselines for event temporal relation extraction on two under-explored story narrative datasets: Richer Event Description (RED) and Causal and Temporal Relation Scheme (CaTeRS). To the best… 

Citations

Temporal Embeddings and Transformer Models for Narrative Text Understanding

TLDR
Overall, deep learning models appear to be suitable for narrative text understanding, while also providing a challenging and unexploited benchmark for general natural language understanding.

Deep Structured Neural Network for Event Temporal Relation Extraction

TLDR
Experimental results on three high-quality event temporal relation datasets demonstrate that, when combined with pre-trained contextualized embeddings, the proposed model significantly outperforms state-of-the-art methods on all three datasets.
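
As a rough illustration of this family of models (not the paper's exact architecture), the Python sketch below scores a temporal relation for an event pair by running a BiLSTM over pre-computed contextualized token embeddings and classifying the concatenated states of the two event tokens; the label set, layer sizes, and class name are illustrative assumptions.

import torch
import torch.nn as nn

LABELS = ["BEFORE", "AFTER", "OVERLAP", "VAGUE"]  # assumed label set

class PairwiseTemporalScorer(nn.Module):
    def __init__(self, emb_dim=768, hidden=200, num_labels=len(LABELS)):
        super().__init__()
        # a BiLSTM re-contextualizes the sentence on top of frozen embeddings
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # classify from the concatenated BiLSTM states of the two event tokens
        self.ffn = nn.Sequential(
            nn.Linear(4 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_labels),
        )

    def forward(self, token_embs, e1_idx, e2_idx):
        # token_embs: (batch, seq_len, emb_dim), e.g. from BERT or ELMo;
        # e1_idx / e2_idx: (batch,) positions of the two event triggers
        states, _ = self.lstm(token_embs)
        rows = torch.arange(states.size(0))
        pair = torch.cat([states[rows, e1_idx], states[rows, e2_idx]], dim=-1)
        return self.ffn(pair)  # (batch, num_labels) relation logits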

Temporal Relation Extraction with Joint Semantic and Syntactic Attention

TLDR
The JSSA (Joint Semantic and Syntactic Attention) model is proposed, a method that combines coarse-grained information from the semantic level with fine-grained information from the syntactic level, using neighbor triples of events on syntactic dependency trees together with the event triples themselves to construct syntactic attention.

ECONET: Effective Continual Pretraining of Language Models for Event Temporal Reasoning

TLDR
A continual pre-training approach that equips PTLMs with targeted knowledge about event temporal relations, designing self-supervised learning objectives to recover masked-out event and temporal indicators and to discriminate sentences from their corrupted counterparts.
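
The masking objective can be sketched independently of any model: hide temporal indicator tokens so that a learner must recover them. The indicator list, mask token, and function name below are illustrative assumptions, not ECONET's actual implementation.

import random

TEMPORAL_INDICATORS = {"before", "after", "during", "while", "until", "since"}

def mask_temporal_indicators(tokens, mask_token="[MASK]", p=1.0):
    """Replace temporal indicator tokens with a mask symbol; return the
    corrupted tokens plus recovery targets (position -> original token)
    for a self-supervised objective."""
    corrupted, targets = [], {}
    for i, tok in enumerate(tokens):
        if tok.lower() in TEMPORAL_INDICATORS and random.random() < p:
            corrupted.append(mask_token)
            targets[i] = tok
        else:
            corrupted.append(tok)
    return corrupted, targets

# e.g. mask_temporal_indicators("He left before the storm hit".split())
# -> (['He', 'left', '[MASK]', 'the', 'storm', 'hit'], {2: 'before'})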

DEER: A Data Efficient Language Model for Event Temporal Reasoning

TLDR
This work proposes DEER, a language model trained to focus on event temporal relations; it uses a generator-discriminator structure to reinforce the LM's capability for event temporal reasoning and performs better than the original LMs under low-resource settings.

TIMERS: Document-level Temporal Relation Extraction

TLDR
The proposed model leverages rhetorical discourse features and temporal arguments from semantic role labels, in addition to traditional local syntactic features, combining them through a Gated Relational-GCN to achieve superior results on document-level temporal relation classification.
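
A generic gated relational-GCN message-passing step in the spirit described might look like the sketch below; the gating formulation, parameter shapes, and edge encoding are all assumptions for illustration, and TIMERS' exact layer will differ.

import torch
import torch.nn as nn

class GatedRelGCNLayer(nn.Module):
    """One message-passing step over a multi-relational document graph."""
    def __init__(self, dim, num_rels):
        super().__init__()
        self.rel_weight = nn.Parameter(0.02 * torch.randn(num_rels, dim, dim))
        self.gate = nn.Linear(2 * dim, 1)   # scalar gate per edge
        self.self_loop = nn.Linear(dim, dim)

    def forward(self, h, edges):
        # h: (num_nodes, dim) states for events/time expressions;
        # edges: iterable of (src, dst, rel_id) covering e.g. syntactic,
        # discourse, and temporal-argument links
        agg = torch.zeros_like(h)
        for src, dst, rel in edges:
            msg = h[src] @ self.rel_weight[rel]
            g = torch.sigmoid(self.gate(torch.cat([h[dst], msg], dim=-1)))
            agg = agg.index_add(0, torch.tensor([dst]), (g * msg).unsqueeze(0))
        return torch.relu(self.self_loop(h) + agg)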

Syntax-aware Multi-task Graph Convolutional Networks for Biomedical Relation Extraction

TLDR
A novel graph convolutional networks model is proposed that incorporates dependency parsing and contextualized embedding to effectively capture comprehensive contextual information in biomedical relation extraction.

DocTime: A Document-level Temporal Dependency Graph Parser

We introduce DocTime, a novel temporal dependency graph (TDG) parser that takes a text document as input and produces a temporal dependency graph. It outperforms previous BERT-based solutions by a…

Analytical, Symbolic and First-Order Reasoning within Neural Architectures

TLDR
This paper examines whether PLMs learn aspects of symbolic and first-order logic relations as a side effect of learning word prediction; it introduces Logic and Knowledge Natural Language Inference (LAKNLI), a new NLI task, and probes two different PLMs: one fine-tuned on NLI tasks and the other without NLI fine-tuning.

Entity Relation Extraction Based on Entity Indicators

TLDR
Task-related entity indicators are designed to enable a deep neural network to concentrate on task-relevant information, achieving state-of-the-art performance on the ACE Chinese corpus, the ACE English corpus, and a Chinese literature text corpus.
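
The indicator idea can be approximated with marker tokens wrapped around the candidate entities. A minimal sketch follows, where the marker strings and span convention are assumptions rather than the paper's exact scheme.

def insert_entity_indicators(tokens, span1, span2,
                             markers=(("[E1]", "[/E1]"), ("[E2]", "[/E2]"))):
    """Wrap two (start, end) token spans (end exclusive, assumed disjoint)
    in indicator tokens so the encoder can focus on the candidate pair."""
    out = list(tokens)
    # handle the later span first so the earlier span's offsets stay valid
    for (start, end), (open_m, close_m) in sorted(
            zip((span1, span2), markers), key=lambda x: x[0][0], reverse=True):
        out[end:end] = [close_m]
        out[start:start] = [open_m]
    return out

# e.g. insert_entity_indicators("Smith joined Acme in 2010".split(), (0, 1), (2, 3))
# -> ['[E1]', 'Smith', '[/E1]', 'joined', '[E2]', 'Acme', '[/E2]', 'in', '2010']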

References

Showing 1-10 of 35 references

CaTeRS: Causal and Temporal Relation Scheme for Semantic Annotation of Event Structures

TLDR
A novel semantic annotation framework is proposed, called Causal and Temporal Relation Scheme (CaTeRS), which is unique in simultaneously capturing a comprehensive set of temporal and causal relations between events.

A Corpus and Evaluation Framework for Deeper Understanding of Commonsense Stories

TLDR
A new framework for evaluating story understanding and script learning: the 'Story Cloze Test', which requires a system to choose the correct ending to a four-sentence story, and a new corpus of ~50k five-sentence commonsense stories, ROCStories, to enable this evaluation.

Richer Event Description: Integrating event coreference with temporal, causal and bridging annotation

TLDR
The annotation methodology for the Richer Event Descriptions corpus is described, which annotates entities, events, times, their coreference and partial coreference relations, and the temporal, causal and subevent relationships between the events.

An Annotation Framework for Dense Event Ordering

TLDR
This paper proposes a new annotation process with a mechanism that forces annotators to label connected graphs, generating 10 times more relations per document than the TimeBank; the resulting TimeBank-Dense corpus is larger than all current corpora.
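
A toy sketch of the kind of dense pairing such a scheme forces annotators to cover, assuming pairs are drawn from a fixed sentence window (the corpus's actual window and pair types differ in detail):

from itertools import combinations

def dense_event_pairs(doc, window=1):
    """doc: list of sentences, each a list of event ids. Return every pair
    of events in the same or a nearby sentence; forcing a label for each
    such pair is what yields dense, connected relation graphs."""
    pairs = set()
    for i in range(len(doc)):
        span = [e for sent in doc[i:i + window + 1] for e in sent]
        pairs.update(combinations(span, 2))
    return sorted(pairs)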

Classifying Temporal Relations by Bidirectional LSTM over Dependency Paths

TLDR
This work borrows a state-of-the-art method in relation extraction by adopting bidirectional long short-term memory along dependency paths (DP), and makes a “common root” assumption to extend DP representations of cross-sentence links.
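
An illustrative sketch of that approach, not the authors' code: encode the token sequence along the shortest dependency path between the two events with a BiLSTM and classify the relation from its final states. Vocabulary handling, sizes, and the class name are assumptions.

import torch
import torch.nn as nn

class DependencyPathBiLSTM(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=128, num_labels=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_labels)

    def forward(self, path_ids):
        # path_ids: (batch, path_len) ids of the words/dependency labels on
        # the shortest path between the two events; under the "common root"
        # assumption, cross-sentence paths join at an artificial shared root
        x = self.embed(path_ids)
        _, (h, _) = self.lstm(x)                # h: (2, batch, hidden)
        pair = torch.cat([h[0], h[1]], dim=-1)  # final forward/backward states
        return self.out(pair)                   # (batch, num_labels) logits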

Timelines from Text: Identification of Syntactic Temporal Relations

TLDR
A linguistically motivated approach is proposed for extracting the temporal structure needed to build a timeline: all pairs of events in the TimeBank that participate in verb-clause constructions are selected and annotated with the labels before, overlap, and after.

Dense Event Ordering with a Multi-Pass Architecture

TLDR
New experiments are presented on strongly connected event graphs that contain ∼10 times more relations per document than the TimeBank, and a shift is described away from a single learner toward a sieve-based architecture that naturally blends multiple learners into a precision-ranked cascade of sieves.
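
The cascade idea reduces to a few lines: high-precision sieves label first, and later, noisier sieves may only fill in pairs that are still unlabeled. The sketch below is a toy rendering under those assumptions.

def cascade(pairs, sieves):
    """sieves: functions ordered from highest to lowest precision; each
    maps an event pair to a relation label or None (abstain)."""
    labels = {}
    for sieve in sieves:
        for pair in pairs:
            if pair not in labels:  # earlier, more precise sieves win
                label = sieve(pair)
                if label is not None:
                    labels[pair] = label
        # a full system would also apply transitive closure here, so later
        # sieves cannot contradict what earlier sieves already imply
    return labels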

Context-Aware Neural Model for Temporal Information Extraction

TLDR
The GCL model has long-term memory and attention mechanisms to resolve irregular long-distance dependencies that regular RNNs such as LSTMs cannot recognize, and is the first model to use an NTM-like architecture to process global-context information in discourse-scale natural text processing.

Joint Inference for Event Timeline Construction

TLDR
An algorithmic approach is presented that jointly optimizes the temporal structure of a news article based on time intervals, coupling local classifiers that predict associations and temporal relations between pairs of temporal entities with global constraints.
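
A greedy stand-in for the joint inference illustrates the coupling: local classifier scores propose labels, and a global transitivity check vetoes inconsistent assignments. The composition table and function names below are assumptions; the paper itself formulates this as joint (ILP-style) optimization.

from itertools import permutations

# partial, assumed composition table: A r1 B and B r2 C imply A r C
COMPOSE = {("BEFORE", "BEFORE"): "BEFORE", ("AFTER", "AFTER"): "AFTER"}

def violates_transitivity(labels):
    """labels: dict mapping ordered event pairs (a, b) to relation names."""
    events = {e for pair in labels for e in pair}
    for a, b, c in permutations(events, 3):
        implied = COMPOSE.get((labels.get((a, b)), labels.get((b, c))))
        if implied and labels.get((a, c), implied) != implied:
            return True
    return False

def joint_decode(pairs, scores, label_set):
    """Per pair, take the best local score whose assignment keeps the
    graph consistent. scores: dict (pair, label) -> float covering every
    pair/label combination."""
    labels = {}
    for pair in pairs:
        for label in sorted(label_set, key=lambda l: -scores[(pair, l)]):
            labels[pair] = label
            if not violates_transitivity(labels):
                break
            del labels[pair]  # inconsistent: fall back to the next-best label
    return labels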

Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture

TLDR
This paper proposes a set of simple LSTM-based models with a uniform architecture to recover different kinds of temporal relations from text, using the shortest dependency path between entities as input; it conducts intrinsic evaluations and posts state-of-the-art results on TimeBank-Dense.