Everything Happens for a Reason: Discovering the Purpose of Actions in Procedural Text

@inproceedings{Dalvi2019EverythingHF,
  title={Everything Happens for a Reason: Discovering the Purpose of Actions in Procedural Text},
  author={Bhavana Dalvi and Niket Tandon and Antoine Bosselut and Wen-tau Yih and Peter Clark},
  booktitle={Conference on Empirical Methods in Natural Language Processing},
  year={2019}
}
Our goal is to better comprehend procedural text, e.g., a paragraph about photosynthesis, by not only predicting what happens, but why some actions need to happen before others. […] We present our new model (XPAD) that biases effect predictions towards those that (1) explain more of the actions in the paragraph and (2) are more plausible with respect to background knowledge.
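As an illustration only (not the paper's actual implementation), the bias the abstract describes can be read as a reranking of candidate effects: a base model score is combined with how many later actions the effect would explain and how plausible it is under background knowledge. The candidates, weights, and function names below are assumptions.

def score_effect(base_score, n_actions_explained, plausibility,
                 alpha=0.5, beta=0.5):
    # Higher is better: reward effects that explain later actions and
    # that background knowledge considers plausible.
    return base_score + alpha * n_actions_explained + beta * plausibility

candidates = [
    # (effect, base model score, later actions explained, KB plausibility)
    ("water MOVES to the leaf", 0.7, 2, 0.9),
    ("water is DESTROYED",      0.8, 0, 0.1),
]

best = max(candidates, key=lambda c: score_effect(c[1], c[2], c[3]))
print(best[0])  # -> "water MOVES to the leaf", despite the lower base score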

Citations

Procedural Reading Comprehension with Attribute-Aware Context Flow

An algorithm for procedural reading comprehension is introduced by translating the text into a general formalism that represents processes as a sequence of transitions over entity attributes (e.g., location, temperature).
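The formalism described here, processes as transitions over entity attributes, is easy to picture as a small data structure. A minimal sketch; the Transition class and the attribute values are chosen purely for illustration, not taken from the paper:

from dataclasses import dataclass

@dataclass
class Transition:
    step: int        # index of the action/sentence in the paragraph
    entity: str      # participant being tracked
    attribute: str   # e.g., "location", "temperature"
    before: str
    after: str

process = [
    Transition(1, "water", "location", "soil", "root"),
    Transition(2, "water", "location", "root", "leaf"),
]
for t in process:
    print(f"step {t.step}: {t.entity}.{t.attribute}: {t.before} -> {t.after}")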

Reasoning over Entity-Action-Location Graph for Procedural Text Understanding

This paper proposes a novel approach (REAL) to procedural text understanding, where a general framework is built to systematically model the entity-entity, entity-action, and entity-location relations using a graph neural network and develops algorithms for graph construction, representation learning, and state and location tracking.
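A hypothetical sketch of the kind of entity-action-location graph such a framework might construct before representation learning; the node labels and relation names are assumptions, not REAL's actual construction algorithm:

from collections import defaultdict

edges = defaultdict(set)

def add_edge(u, v, relation):
    # Undirected edge labelled with its relation type.
    edges[u].add((relation, v))
    edges[v].add((relation, u))

# Encoding of a single sentence: "Water moves from the soil to the root."
add_edge("entity:water", "action:move", "entity-action")
add_edge("entity:water", "location:soil", "entity-location")
add_edge("entity:water", "location:root", "entity-location")

for node, neighbours in sorted(edges.items()):
    print(node, sorted(neighbours))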

Knowledge-Aware Procedural Text Understanding with Multi-Stage Training

A novel KnOwledge-Aware proceduraL text understAnding (KoaLa) model is proposed, which effectively leverages multiple forms of external knowledge in this task of procedural text understanding and achieves state-of-the-art performance in comparison to various baselines.

Show Me More Details: Discovering Hierarchies of Procedures from Semi-structured Web Data

This work develops a simple and efficient method that links steps in an article to other articles with similar goals, recursively constructing an open-domain hierarchical knowledge-base of procedures based on wikiHow, a website containing more than 110k instructional articles.

Learning Action Conditions from Instructional Manuals for Instruction Understanding

This work proposes a weakly supervised approach to automatically construct large-scale training instances from online instructional manuals, and curates a densely human-annotated and validated dataset to study how well the current NLP models can infer action-condition dependencies in the instruction texts.

EvEntS ReaLM: Event Reasoning of Entity States via Language Models

The results indicate that the prompting technique is especially useful for unseen (out-of-domain) attributes or when only limited data is available, and that proper model prompting can dramatically improve performance over reported baseline results across multiple tasks.

Reasoning about Actions over Visual and Linguistic Modalities: A Survey

This paper surveys existing tasks, benchmark datasets, various techniques and models, and their respective performance concerning advancements in RAC in the vision and language domain and outlines potential directions for future research.

Recent Trends in Natural Language Understanding for Procedural Knowledge

Dena F. Mujtaba and N. Mahapatra. 2019 International Conference on Computational Science and Computational Intelligence (CSCI), 2019.
This paper seeks to provide an overview of the work in procedural knowledge understanding, and information extraction, acquisition, and representation with procedures, to promote discussion and provide a better understanding of procedural knowledge applications and future challenges.

Life is a Circus and We are the Clowns: Automatically Finding Analogies between Situations and Processes

An interpretable, scalable algorithm is developed that can extract analogies from a large dataset of procedural texts, achieving 79% precision, and it is demonstrated that the algorithm is robust to paraphrasing the input texts.

Relational Gating for "What If" Reasoning

A novel relational gating network is proposed that learns to filter the key entities and relationships, and learns contextual and cross representations of both the procedure and the question, to find the answer to "What if..." questions.

References

Showing 1-10 of 35 references

What Happened? Leveraging VerbNet to Predict the Effects of Actions in Procedural Text

This work leverages VerbNet to build a rulebase of the preconditions and effects of actions, and uses it along with commonsense knowledge of persistence to answer questions about change in paragraphs describing processes.
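In the same spirit, a rulebase of verb preconditions and effects plus a persistence default can be sketched as follows; the rule format and the "absorb" entry are illustrative assumptions, not the paper's actual VerbNet-derived rules:

RULES = {
    "absorb": {
        "preconditions": {"theme": "at_source"},
        "effects":       {"theme": "at_agent"},
    },
}

def apply_action(state, verb):
    # Persistence: everything not changed by the verb's effects carries over.
    new_state = dict(state)
    rule = RULES.get(verb)
    if rule and all(state.get(k) == v for k, v in rule["preconditions"].items()):
        new_state.update(rule["effects"])
    return new_state

state = {"theme": "at_source", "temperature": "warm"}
print(apply_action(state, "absorb"))
# -> {'theme': 'at_agent', 'temperature': 'warm'} (temperature persists)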

Reasoning about Actions and State Changes by Injecting Commonsense Knowledge

This paper shows how the predicted effects of actions in the context of a paragraph can be improved in two ways: by incorporating global, commonsense constraints (e.g., a non-existent entity cannot be destroyed), and by biasing reading with preferences from large-scale corpora.
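The quoted constraint is concrete enough to sketch as a hard filter on predictions; the encoding below is an assumption for illustration, not the paper's model:

def violates_global_constraints(entity_exists, predicted_action):
    # The constraint quoted above: a non-existent entity cannot be destroyed.
    if predicted_action == "DESTROY" and not entity_exists:
        return True
    # An analogous (assumed) constraint: an existing entity cannot be created.
    if predicted_action == "CREATE" and entity_exists:
        return True
    return False

print(violates_global_constraints(entity_exists=False, predicted_action="DESTROY"))
# -> True: such a prediction would be filtered out or down-weighted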

Tracking State Changes in Procedural Text: a Challenge Dataset and Models for Process Paragraph Comprehension

A new dataset and models for comprehending paragraphs about processes, an important genre of text describing a dynamic world, are presented and two new neural models that exploit alternative mechanisms for state prediction are introduced, in particular using LSTM input encoding and span prediction.

Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks

This work argues for the usefulness of a set of proxy tasks that evaluate reading comprehension via question answering, and classify these tasks into skill sets so that researchers can identify (and then rectify) the failings of their systems.

Building Dynamic Knowledge Graphs from Text using Machine Reading Comprehension

A neural machine-reading model is presented that constructs dynamic knowledge graphs recurrently for each step of the described procedure and uses them to track the evolving states of participant entities; some evidence is presented that the model's knowledge graphs help it impose commonsense constraints on its predictions.
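A toy version of the recurrent graph-building loop that description suggests; the triple format and update rule are assumptions for illustration:

steps = [
    ("water", "location", "soil"),   # step 1: water starts in the soil
    ("water", "location", "root"),   # step 2: water moves to the root
]

graph = {}        # (entity, attribute) -> current value
snapshots = []
for entity, attribute, value in steps:
    graph[(entity, attribute)] = value   # update the graph at each step
    snapshots.append(dict(graph))        # keep the graph's state per step

for i, snap in enumerate(snapshots, start=1):
    print(f"after step {i}: {snap}")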

Bidirectional Attention Flow for Machine Comprehension

The BiDAF network is introduced: a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.

Learning Procedures from Text: Codifying How-to Procedures in Deep Neural Networks

This paper proposes an end-to-end neural network architecture that selectively learns important procedure-specific relationships and outperforms existing entity relationship extraction algorithms.

Creating Causal Embeddings for Question Answering with Minimal Supervision

This work argues that a better approach is to look for answers related to the question in a relevant way, according to the question's information need, which may be captured through task-specific embeddings; causality is implemented as a use case.

Inducing Neural Models of Script Knowledge

This work induces commonsense knowledge about prototypical sequences of events in the form of graphs, based on distributed representations of predicates and their arguments; these representations are then used to predict prototypical event orderings.

Modeling Biological Processes for Reading Comprehension

This paper focuses on a new reading comprehension task that requires complex reasoning over a single document, and demonstrates that answering questions via predicted structures substantially improves accuracy over baselines that use shallower representations.