Learning Statistical Scripts with LSTM Recurrent Neural Networks

@inproceedings{Pichotta2016LearningSS,
  title={Learning Statistical Scripts with LSTM Recurrent Neural Networks},
  author={Karl Pichotta and Raymond J. Mooney},
  booktitle={AAAI},
  year={2016}
}
Scripts encode knowledge of prototypical sequences of events. We describe a Recurrent Neural Network model for statistical script learning using Long Short-Term Memory, an architecture which has been demonstrated to work well on a range of Artificial Intelligence tasks. We evaluate our system on two tasks, inferring held-out events from text and inferring novel events from text, substantially outperforming prior approaches on both tasks. 
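
As a rough illustration of the kind of model the abstract describes, the sketch below is a minimal LSTM script model in PyTorch; it is not the authors' implementation, and the vocabulary size, dimensions, and toy data are illustrative assumptions. Events are flattened into a sequence of component tokens (verb and arguments), and the network is trained to predict each next component from the preceding ones.

    import torch
    import torch.nn as nn

    class LSTMScriptModel(nn.Module):
        """Predicts the next event-component token from the sequence so far."""
        def __init__(self, vocab_size, embed_dim=100, hidden_dim=100):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) ids of verbs/arguments in narrative order
            hidden, _ = self.lstm(self.embed(token_ids))
            return self.out(hidden)  # (batch, seq_len, vocab_size) logits

    # Toy usage: train the model to predict each token from its predecessors.
    vocab_size = 1000                              # assumed component vocabulary
    model = LSTMScriptModel(vocab_size)
    seq = torch.randint(0, vocab_size, (2, 12))    # two toy event sequences
    logits = model(seq[:, :-1])                    # predict tokens 1..11
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, vocab_size), seq[:, 1:].reshape(-1))
    loss.backward()

A trained model of this kind can rank candidate next events by the probability it assigns to their component tokens, which is the basis for the held-out and novel event inference tasks.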

Key Quantitative Results

  • We evaluate our proposed LSTM script system against a number of baselines on the task of predicting held-out verbs with coreference information about their arguments, showing a 22.6% relative improvement compared to the strongest baseline. Second, we evaluate on the more difficult task of predicting held-out verbs with argument nouns, demonstrating a 64.9% relative improvement over the most competitive baseline.
  • The LSTM-both-ent system demonstrates a 50.0% relative improvement (5.7% absolute) over the current best-published system (2D rewritten all-bigram), evaluated using recall at 25 over 4-tuple events; a sketch of this metric appears after this list.
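
The "recall at 25" figure above counts a prediction as correct when the held-out event appears among the model's 25 highest-ranked candidate events. A minimal sketch of that metric follows; candidate generation is abstracted away, and the event tuples are hypothetical placeholders.

    def recall_at_k(ranked_candidates, held_out_event, k=25):
        """ranked_candidates: candidate event tuples, best first."""
        return held_out_event in ranked_candidates[:k]

    # Toy usage with hypothetical (verb, subject, object, prep_object) tuples.
    candidates = ([("acquire", "company", "startup", None),
                   ("announce", "company", "deal", None)]
                  + [("pad", None, None, None)] * 30)
    print(recall_at_k(candidates, ("announce", "company", "deal", None)))  # True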

Citations

Selected publications citing this paper (64 in total):

  • Investigations of the Properties of Narrative Schemas (highly influenced; cites background and methods)
  • FEEL: Featured Event Embedding Learning (highly influenced; cites background and methods)
  • Behind the Scenes of an Evolving Event Cloze Test. LSDSem@EACL, 2017 (highly influenced; cites background)
  • Event Representations With Tensor-Based Compositions (highly influenced; cites background and methods)
  • Event Representations for Automated Story Generation with Deep Neural Nets (highly influenced; cites methods, background, and results)
  • A Survey on Temporal Reasoning for Temporal Information Extraction from Text. Artuur Leeuwenberg and Marie-Francine Moens. J. Artif. Intell. Res., 2019

Citation Statistics

  • 6 highly influenced citations
  • Averaged 17 citations per year from 2017 through 2019