Everything Happens for a Reason: Discovering the Purpose of Actions in Procedural Text

@article{Dalvi2019EverythingHF,
  title={Everything Happens for a Reason: Discovering the Purpose of Actions in Procedural Text},
  author={Bhavana Dalvi and Niket Tandon and Antoine Bosselut and Wen-tau Yih and Peter Clark},
  journal={ArXiv},
  year={2019},
  volume={abs/1909.04745}
}
Our goal is to better comprehend procedural text, e.g., a paragraph about photosynthesis, by not only predicting what happens, but also why some actions need to happen before others. We present a new model (XPAD) that biases effect predictions towards those that (1) explain more of the actions in the paragraph and (2) are more plausible with respect to background knowledge.
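A minimal sketch of this biased re-ranking idea, assuming an underlying state-tracking model that scores candidate effects; the weights, the coverage heuristic, and the plausibility estimate below are illustrative assumptions, not XPAD's actual implementation:

# Illustrative sketch only: XPAD-style biasing of effect predictions.
# Scoring functions and weights are assumptions, not the paper's method.
from dataclasses import dataclass

@dataclass
class Candidate:
    effect: str        # e.g., "water MOVES-TO leaf"
    base_score: float  # score from an underlying state-tracking model

def explanation_coverage(effect, later_actions):
    # Fraction of later actions this effect helps explain (assumed
    # heuristic: a shared entity mention stands in for real
    # precondition matching).
    if not later_actions:
        return 0.0
    entity = effect.split()[0]
    return sum(entity.lower() in a.lower() for a in later_actions) / len(later_actions)

def background_plausibility(effect, corpus_counts):
    # Plausibility w.r.t. background knowledge (assumed: normalized
    # frequency in a large corpus of effect tuples).
    total = sum(corpus_counts.values()) or 1
    return corpus_counts.get(effect, 0) / total

def best_effect(candidates, later_actions, corpus_counts, lam1=0.5, lam2=0.5):
    # Bias base predictions toward effects that (1) explain more of the
    # later actions and (2) are more plausible under background knowledge.
    return max(
        candidates,
        key=lambda c: c.base_score
        + lam1 * explanation_coverage(c.effect, later_actions)
        + lam2 * background_plausibility(c.effect, corpus_counts),
    )

cands = [Candidate("water MOVES-TO leaf", 0.4), Candidate("water DESTROYED", 0.5)]
print(best_effect(cands, ["Water is used in the leaf."], {"water MOVES-TO leaf": 3}).effect)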

Citations

Procedural Reading Comprehension with Attribute-Aware Context Flow
TLDR
An algorithm for procedural reading comprehension is introduced by translating the text into a general formalism that represents processes as a sequence of transitions over entity attributes (e.g., location, temperature).
Reasoning over Entity-Action-Location Graph for Procedural Text Understanding
TLDR
This paper proposes a novel approach (REAL) to procedural text understanding, where a general framework is built to systematically model the entity-entity, entity-action, and entity-location relations using a graph neural network and develops algorithms for graph construction, representation learning, and state and location tracking.
Recent Trends in Natural Language Understanding for Procedural Knowledge
  • Dena F. Mujtaba, N. Mahapatra
  • Computer Science
    2019 International Conference on Computational Science and Computational Intelligence (CSCI)
  • 2019
TLDR
This paper provides an overview of work on procedural knowledge understanding and on information extraction, acquisition, and representation for procedures, to promote discussion and a better understanding of procedural knowledge applications and future challenges.
Relational Gating for "What If" Reasoning
TLDR
A novel relational gating network that learns to filter the key entities and relationships, and learns contextual and cross representations of both the procedure and the question, to answer "What if..." questions.
Modeling Temporal-Modal Entity Graph for Procedural Multimodal Machine Comprehension
TLDR
A heterogeneous graph structure is formulated to capture textual and visual entities and trace their temporal-modal evolution, and a novel Temporal-Modal Entity Graph (TMEG) model is proposed to perform this reasoning.
Time-Stamped Language Model: Teaching Language Models to Understand The Flow of Events
TLDR
A Time-Stamped Language Model (TSLM) is proposed that encodes event information in the LM architecture through a timestamp encoding, enabling pre-trained transformer-based language models trained on other QA benchmarks to be adapted to procedural text understanding.
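A toy illustration of the timestamp-encoding idea summarized above; the marker tokens and tagging scheme here are assumptions for the sketch, not TSLM's exact design:

# Illustrative sketch: mark each step as past/current/future relative to the
# queried step, so a standard pre-trained LM can condition on the flow of events.
def timestamp_tag(step_sentences, current_step):
    tagged = []
    for i, sent in enumerate(step_sentences):
        marker = "[PAST]" if i < current_step else "[NOW]" if i == current_step else "[FUTURE]"
        tagged.append(f"{marker} {sent}")
    return " ".join(tagged)

steps = ["Roots absorb water.", "Water travels to the leaf.", "Sunlight hits the leaf."]
print(timestamp_tag(steps, current_step=1))
# [PAST] Roots absorb water. [NOW] Water travels to the leaf. [FUTURE] Sunlight hits the leaf.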
Procedural Text Understanding via Scene-Wise Evolution
TLDR
A new scene-wise paradigm for procedural text understanding is proposed, which jointly tracks states of all entities in a scene-by-scene manner and introduces a series of dynamically evolving scene graphs to jointly formulate the evolution of entities, states and their associations throughout the narrative.
LEMON: Language-Based Environment Manipulation via Execution-Guided Pre-training
TLDR
This work proposes LEMON, a general framework for language-based environment manipulation tasks: a unified approach that handles various environments with the same generative language model, plus an execution-guided pre-training strategy that injects prior knowledge of environments into the language model using a purely synthetic pre-training corpus.
Factoring Statutory Reasoning as Language Understanding Challenges
TLDR
Models for statutory reasoning are shown to benefit from the additional structure found in Prolog programs, improving on prior baselines, and the decomposition into subtasks facilitates finer-grained model diagnostics and clearer incremental progress.
Towards Modeling Revision Requirements in wikiHow Instructions
TLDR
This work extends an existing resource of textual edits with a complementary set of approx. …

References

Showing 1-10 of 35 references
What Happened? Leveraging VerbNet to Predict the Effects of Actions in Procedural Text
TLDR
This work leverages VerbNet to build a rulebase of the preconditions and effects of actions, and uses it along with commonsense knowledge of persistence to answer questions about change in paragraphs describing processes.
Reasoning about Actions and State Changes by Injecting Commonsense Knowledge
TLDR
This paper shows how the predicted effects of actions in the context of a paragraph can be improved in two ways: by incorporating global, commonsense constraints (e.g., a non-existent entity cannot be destroyed), and by biasing reading with preferences from large-scale corpora.
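The global constraint cited in this summary (a non-existent entity cannot be destroyed) can be stated as a simple check; the state encoding and function below are illustrative assumptions, not the paper's implementation:

# Illustrative sketch of a global commonsense constraint on predicted actions.
def violates_constraint(entity_exists, action):
    # A non-existent entity cannot be destroyed.
    return (not entity_exists) and action == "DESTROY"

assert violates_constraint(entity_exists=False, action="DESTROY")
assert not violates_constraint(entity_exists=True, action="DESTROY")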
Tracking State Changes in Procedural Text: a Challenge Dataset and Models for Process Paragraph Comprehension
TLDR
A new dataset and models for comprehending paragraphs about processes, an important genre of text describing a dynamic world, are presented and two new neural models that exploit alternative mechanisms for state prediction are introduced, in particular using LSTM input encoding and span prediction.
Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks
TLDR
This work argues for the usefulness of a set of proxy tasks that evaluate reading comprehension via question answering, and classify these tasks into skill sets so that researchers can identify (and then rectify) the failings of their systems.
Building Dynamic Knowledge Graphs from Text using Machine Reading Comprehension
TLDR
A neural machine-reading model that constructs dynamic knowledge graphs recurrently for each step of the described procedure and uses them to track the evolving states of participant entities, with evidence that these knowledge graphs help the model impose commonsense constraints on its predictions.
Bidirectional Attention Flow for Machine Comprehension
TLDR
The BIDAF network is introduced, a multi-stage hierarchical process that represents the context at different levels of granularity and uses bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.
Learning Procedures from Text: Codifying How-to Procedures in Deep Neural Networks
TLDR
To identify these relationships, this paper proposes an end-to-end neural network architecture that can selectively learn important procedure-specific relationships and that outperforms existing entity-relationship extraction algorithms.
Creating Causal Embeddings for Question Answering with Minimal Supervision
TLDR
This work argues that a better approach is to look for answers that are related to the question in a relevant way, according to the information need of the question, which may be determined through task-specific embeddings, and implements causality as a use case.
Inducing Neural Models of Script Knowledge
TLDR
This work induces commonsense knowledge about prototypical sequences of events in the form of graphs, based on distributed representations of predicates and their arguments; these representations are then used to predict prototypical event orderings.
Modeling Biological Processes for Reading Comprehension
TLDR
This paper focuses on a new reading comprehension task that requires complex reasoning over a single document, and demonstrates that answering questions via predicted structures substantially improves accuracy over baselines that use shallower representations.