Semantic Parsing to Probabilistic Programs for Situated Question Answering

@inproceedings{Krishnamurthy2016SemanticPT,
  title={Semantic Parsing to Probabilistic Programs for Situated Question Answering},
  author={Jayant Krishnamurthy and Oyvind Tafjord and Aniruddha Kembhavi},
  booktitle={EMNLP},
  year={2016}
}
Situated question answering is the problem of answering questions about an environment such as an image or diagram. This problem requires jointly interpreting a question and an environment using background knowledge to select the correct answer. We present Parsing to Probabilistic Programs (P3), a novel situated question answering model that can use background knowledge and global features of the question/environment interpretation while retaining efficient approximate inference. Our key… 
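The abstract describes parsing a question into a program that is executed against the environment, with global features scored under approximate inference. The toy sketch below illustrates that general shape only; all names (`ENV`, `global_score`, the candidate logical forms, beam search as the approximate-inference step) are illustrative assumptions, not the authors' actual P3 implementation.

```python
# Toy sketch of the parse-then-execute idea: a question maps to candidate
# logical forms, candidates are scored with a global feature of the
# question/interpretation pair, and a beam keeps the best ones before
# the winning program is executed against the environment.

ENV = {"circles": 2, "squares": 3}  # stand-in "diagram" environment

# hypothetical logical forms a semantic parser might produce
CANDIDATE_PROGRAMS = [
    ("count(circles)", lambda env: env["circles"]),
    ("count(squares)", lambda env: env["squares"]),
]

def global_score(question, logical_form):
    # toy global feature: count question words that appear in the logical form
    return sum(w in logical_form for w in question.split())

def answer(question, beam_size=2):
    # approximate inference as beam search over scored interpretations
    beam = sorted(CANDIDATE_PROGRAMS,
                  key=lambda p: global_score(question, p[0]),
                  reverse=True)[:beam_size]
    best_form, best_program = beam[0]
    return best_program(ENV)

print(answer("how many circles are there"))  # executes count(circles) -> 2
```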

Citations

Parsing to Programs: A Framework for Situated QA
TLDR
Parsing to Programs, a framework that combines ideas from parsing and probabilistic programming for situated question answering, is introduced and a system that solves pre-university level Newtonian physics questions is built.
Question Answering as Global Reasoning Over Semantic Abstractions
TLDR
This work presents the first system that reasons over a wide range of semantic abstractions of the text, which are derived using off-the-shelf, general-purpose, pre-trained natural language modules such as semantic role labelers, coreference resolvers, and dependency parsers.
Probabilistic Neural Programs
We present probabilistic neural programs, a framework for program induction that permits flexible specification of both a computational model and inference algorithm while simultaneously enabling the…
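The framework's stated separation between a computational model and its inference algorithm can be sketched as follows; this is a generic probabilistic-programming illustration with invented names (`program`, `choose`, `enumerate_inference`), not the paper's neural implementation.

```python
import itertools

# Sketch of decoupling the model from inference: the program declares
# choice points via a callback, and an inference routine decides how the
# choices are explored (here, exhaustive enumeration).

def program(choose):
    # the computational model: two binary choices, combined into a result
    a = choose([0, 1])
    b = choose([0, 1])
    return a + b

def enumerate_inference(model):
    # exhaustive inference: run the model once per full assignment of choices
    results = []
    for assignment in itertools.product([0, 1], repeat=2):
        it = iter(assignment)
        results.append(model(lambda options: next(it)))
    return results

print(enumerate_inference(program))  # [0, 1, 1, 2]
```

Swapping `enumerate_inference` for a sampler or a beam would change the inference algorithm without touching the model, which is the separation the abstract highlights.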
Neural Event Semantics for Grounded Language Understanding
TLDR
A new conjunctivist framework, neural event semantics (NES), for compositional grounded language understanding that offers stronger generalization capability than standard function-based compositional frameworks, while improving accuracy over state-of-the-art neural methods on real-world language tasks.
A Review on Neural Network Question Answering Systems
TLDR
This paper presents a review to summarize the state of the art in question answering systems implemented using neural networks, identifies the main research topics and considers the most relevant research challenges.
Towards Interpretation as Natural Logic Abduction
TLDR
This paper shows how to incorporate Natural Logic into Interpretation as Abduction, and demonstrates that missing, implicit premises of arguments can be recovered by the proposed framework by a manual example walkthrough.
More Accurate Entity Ranking Using Knowledge Graph and Web Corpus
TLDR
Over and above competitive F1 score, AQQUCN gets the best entity ranking accuracy on two syntax-rich and two syntax-poor public query workloads amounting to over 8,000 queries, with 16–18% absolute improvement in mean average precision (MAP), compared to recent systems.
Span-based Neural Structured Prediction
TLDR
A series of neural structured-prediction algorithms for natural language processing is proposed that models the most basic substructure of language, spans of text, and state-of-the-art models are presented for tasks that require modeling both the internal structure of spans and the structure between spans.
Learning to Solve Geometry Problems from Natural Language Demonstrations in Textbooks
TLDR
This paper introduces the task of question answering using natural language demonstrations where the question answering system is provided with detailed demonstrative solutions to questions in natural language.
...
...

References

SHOWING 1-10 OF 47 REFERENCES
Semantic Parsing via Staged Query Graph Generation: Question Answering with Knowledge Base
TLDR
This work proposes a novel semantic parsing framework for question answering using a knowledge base that leverages the knowledge base in an early stage to prune the search space and thus simplifies the semantic matching problem.
Learning Dependency-Based Compositional Semantics
TLDR
A new semantic formalism, dependency-based compositional semantics (DCS) is developed and a log-linear distribution over DCS logical forms is defined and it is shown that the system obtains comparable accuracies to even state-of-the-art systems that do require annotated logical forms.
Question Answering on Freebase via Relation Extraction and Textual Evidence
TLDR
This work first presents a neural network based relation extractor to retrieve the candidate answers from Freebase, and then infer over Wikipedia to validate these answers, a substantial improvement over the state-of-the-art.
Large-scale Semantic Parsing without Question-Answer Pairs
TLDR
This paper introduces a novel semantic parsing approach to query Freebase in natural language without requiring manual annotations or question-answer pairs and converts sentences to semantic graphs using CCG and subsequently grounds them to Freebase guided by denotations as a form of weak supervision.
Modeling Biological Processes for Reading Comprehension
TLDR
This paper focuses on a new reading comprehension task that requires complex reasoning over a single document, and demonstrates that answering questions via predicted structures substantially improves accuracy over baselines that use shallower representations.
Semantic Parsing on Freebase from Question-Answer Pairs
TLDR
This paper trains a semantic parser that scales up to Freebase and outperforms their state-of-the-art parser on the dataset of Cai and Yates (2013), despite not having annotated logical forms.
Deep Compositional Question Answering with Neural Module Networks
TLDR
The approach decomposes questions into their linguistic substructures, and uses these structures to dynamically instantiate modular networks (with reusable components for recognizing dogs, classifying colors, etc.) which compose collections of jointly-trained neural "modules" into deep networks for question answering.
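The compose-modules-per-question idea summarized above can be sketched without any neural machinery; the module names, the toy scene, and the fixed layout below are illustrative assumptions (real module networks derive the layout from a linguistic parse and train the modules jointly).

```python
# Sketch of module composition: a question's structure selects reusable
# "modules" that are chained into one pipeline over a scene.

SCENE = [{"shape": "dog", "color": "brown"},
         {"shape": "dog", "color": "black"},
         {"shape": "cat", "color": "black"}]

MODULES = {
    "find[dog]":   lambda objs: [o for o in objs if o["shape"] == "dog"],
    "find[black]": lambda objs: [o for o in objs if o["color"] == "black"],
    "count":       lambda objs: len(objs),
}

def compose(layout):
    # chain the named modules left to right over the input scene
    def run(objs):
        for name in layout:
            objs = MODULES[name](objs)
        return objs
    return run

# "How many black dogs are there?" -> find dogs, filter black, count
network = compose(["find[dog]", "find[black]", "count"])
print(network(SCENE))  # -> 1
```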
Learning a Compositional Semantics for Freebase with an Open Predicate Vocabulary
TLDR
An approach to learning a model-theoretic semantics for natural language tied to Freebase using an open predicate vocabulary, enabling it to produce denotations for phrases such as “Republican front-runner from Texas” whose semantics cannot be represented using the Freebase schema is presented.
Weakly Supervised Learning of Semantic Parsers for Mapping Instructions to Actions
TLDR
This paper shows semantic parsing can be used within a grounded CCG semantic parsing approach that learns a joint model of meaning and context for interpreting and executing natural language instructions, using various types of weak supervision.
Scalable Semantic Parsing with Partial Ontologies
TLDR
A new semantic parsing model and semi-supervised learning approach for reasoning with partial ontological support is presented, which allows us to improve precision over strong baselines, while parsing many phrases that would be ignored by existing techniques.
...
...