Modeling Biological Processes for Reading Comprehension

@inproceedings{Berant2014ModelingBP,
  title={Modeling Biological Processes for Reading Comprehension},
  author={Jonathan Berant and Vivek Srikumar and Pei-Chun Chen and Abby Vander Linden and Brittany Harding and Brad Huang and Peter Clark and Christopher D. Manning},
  booktitle={EMNLP},
  year={2014}
}
Machine reading calls for programs that read and understand text, but most current work only attempts to extract facts from redundant web-scale corpora. [...] To answer the questions, we first predict a rich structure representing the process in the paragraph. Then, we map the question to a formal query, which is executed against the predicted structure. We demonstrate that answering questions via predicted structures substantially improves accuracy over baselines that use shallower representations.
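The pipeline the abstract describes (predict a process structure, map the question to a formal query, execute the query against the structure) can be sketched as follows. This is a minimal illustration, not the authors' code: the event names, relation labels, and `query` method are all hypothetical.

```python
# Illustrative sketch: answering a question by executing a formal query
# against a predicted process structure. All names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ProcessStructure:
    """A toy process graph: a set of events plus directed, labeled
    relations between them, standing in for the predicted structure."""
    events: set = field(default_factory=set)
    relations: set = field(default_factory=set)  # tuples (src, label, dst)

    def query(self, label, src):
        """Return all events reachable from `src` via a `label` relation."""
        return {d for (s, l, d) in self.relations if s == src and l == label}

# A predicted structure for a toy paragraph about photosynthesis.
structure = ProcessStructure(
    events={"absorb_light", "split_water", "produce_oxygen"},
    relations={
        ("absorb_light", "enables", "split_water"),
        ("split_water", "causes", "produce_oxygen"),
    },
)

# The question "What does splitting water cause?" mapped to a formal query:
answer = structure.query("causes", "split_water")
print(answer)  # {'produce_oxygen'}
```

The point of the sketch is the division of labor: structure prediction handles the reading, and question answering reduces to graph queries over the result.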
Building Dynamic Knowledge Graphs from Text using Machine Reading Comprehension
A neural machine-reading model that constructs dynamic knowledge graphs recurrently for each step of the described procedure and uses them to track the evolving states of participant entities; the model's knowledge graphs are shown to help it impose commonsense constraints on its predictions.
Multi Document Reading Comprehension
A study of Reading Comprehension and its evolution in Natural Language Processing over the past few decades, along with a recently proposed model for Multi-Document Reading Comprehension, RE3QA, which combines a Retriever, a Reader, and a Re-ranker network to fetch the best possible answer from a given set of passages.
Question Answering as Global Reasoning Over Semantic Abstractions
This work presents the first system that reasons over a wide range of semantic abstractions of the text, which are derived using off-the-shelf, general-purpose, pre-trained natural language modules such as semantic role labelers, coreference resolvers, and dependency parsers.
SQuAD Reading Comprehension
One important task in Natural Language Understanding is Reading Comprehension. Given a piece of text, we want to be able to answer any relevant questions. Using Stanford Question Answering
Machine Comprehension with Discourse Relations
This approach enables the model to benefit from discourse information without relying on explicit annotations of discourse structure during training, and demonstrates that the discourse aware model outperforms state-of-the-art machine comprehension systems.
Reading Comprehension with Graph-based Temporal-Causal Reasoning
This work generates event graphs from text based on dependencies, and ranks answers by aligning event graphs that are constrained by graph-based reasoning to ensure temporal and causal agreement.
SQuAD: 100,000+ Questions for Machine Comprehension of Text
A strong logistic regression model is built, which achieves an F1 score of 51.0%, a significant improvement over a simple baseline (20%).
Learning Knowledge Graphs for Question Answering through Conversational Dialog
This work is the first to acquire knowledge for question-answering from open, natural language dialogs without a fixed ontology or domain model that predetermines what users can say.
Recent Trends in Natural Language Understanding for Procedural Knowledge
This paper provides an overview of work on procedural knowledge understanding, covering information extraction, acquisition, and representation for procedures, to promote discussion and a better understanding of procedural knowledge applications and future challenges.
ListReader: Extracting List-form Answers for Opinion Questions
ListReader is proposed, a neural extractive QA model for list-form answers that adopts a co-extraction setting able to extract either span- or sentence-level answers, allowing better applicability. Experimental results show that the model considerably outperforms various strong baselines.

References

Showing 1-10 of 42 references
Deep Read: A Reading Comprehension System
Describes initial work on Deep Read, an automated reading comprehension system that accepts arbitrary text input (a story) and answers questions about it, with a baseline system that retrieves the sentence containing the answer 30-40% of the time.
Semantic Parsing on Freebase from Question-Answer Pairs
This paper trains a semantic parser that scales up to Freebase and outperforms their state-of-the-art parser on the dataset of Cai and Yates (2013), despite not having annotated logical forms.
COGEX: A Logic Prover for Question Answering
The idea of automated reasoning applied to question answering is introduced and the feasibility of integrating a logic prover into a Question Answering system is shown.
Driving Semantic Parsing from the World’s Response
This paper develops two novel learning algorithms capable of predicting complex structures that rely only on a binary feedback signal from an external world, and reformulates the semantic parsing problem to reduce the model's dependency on syntactic patterns, allowing the parser to scale better with less supervision.
Paraphrase-Driven Learning for Open Question Answering
This work demonstrates that it is possible to learn a semantic lexicon and linear ranking function without manually annotating questions and automatically generalizes a seed lexicon, and includes a scalable, parallelized perceptron parameter estimation scheme.
Learning for Semantic Parsing with Statistical Machine Translation
WASP is shown to perform favorably in terms of both accuracy and coverage compared to existing learning methods requiring a similar amount of supervision, and to be more robust to variations in task complexity and word order.
Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars
A learning algorithm is described that takes as input a training set of sentences labeled with expressions in the lambda calculus and induces a grammar for the problem, along with a log-linear model that represents a distribution over syntactic and semantic analyses conditioned on the input sentence.
Learning Biological Processes with Global Constraints
This paper presents the task of process extraction, in which events within a process and the relations between the events are automatically extracted from text, and shows significant improvement over baselines that disregard process structure.
MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text
MCTest is presented, a freely available set of stories and associated questions intended for research on the machine comprehension of text that requires machines to answer multiple-choice reading comprehension questions about fictional stories, directly tackling the high-level goal of open-domain machine comprehension.
Learning to Automatically Solve Algebra Word Problems
An approach for automatically learning to solve algebra word problems that reasons across sentence boundaries to construct and solve a system of linear equations, while simultaneously recovering an alignment of the variables and numbers to the problem text.