Semantic Parsing to Probabilistic Programs for Situated Question Answering

Jayant Krishnamurthy, Oyvind Tafjord, Aniruddha Kembhavi
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Situated question answering is the problem of answering questions about an environment such as an image or diagram. This problem requires jointly interpreting a question and an environment using background knowledge to select the correct answer. We present Parsing to Probabilistic Programs (P3), a novel situated question answering model that can use background knowledge and global features of the question/environment interpretation while retaining efficient approximate inference. Our key… 
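The core idea described above — mapping a question to a small program that is executed against an uncertain interpretation of the environment, and aggregating probability over executions — can be illustrated with a minimal sketch. All names here (`environment`, `count_category`, the toy label distributions) are hypothetical illustrations, not the authors' P3 implementation, and exact enumeration stands in for the approximate inference the paper describes:

```python
import itertools

# Toy "environment": objects detected in a diagram, each with a
# distribution over category labels (e.g., detector confidences).
environment = {
    "obj1": {"circle": 0.9, "square": 0.1},
    "obj2": {"circle": 0.3, "square": 0.7},
}

def interpretations(env):
    """Enumerate joint label assignments with their probabilities."""
    names = list(env)
    label_sets = [env[n].items() for n in names]
    for combo in itertools.product(*label_sets):
        labels = {n: lab for n, (lab, _) in zip(names, combo)}
        prob = 1.0
        for _, p in combo:
            prob *= p
        yield labels, prob

def count_category(category):
    """A tiny 'program' a semantic parser might produce for
    'How many <category>s are there?'."""
    def run(labels):
        return sum(1 for lab in labels.values() if lab == category)
    return run

def answer_distribution(program, env):
    """Execute the program under every interpretation of the
    environment and aggregate probability mass per answer
    (exact here; a real system would use approximate search)."""
    dist = {}
    for labels, prob in interpretations(env):
        ans = program(labels)
        dist[ans] = dist.get(ans, 0.0) + prob
    return dist

program = count_category("circle")  # parse of "How many circles are there?"
print(answer_distribution(program, environment))
```

The separation between the parsed program (`count_category`) and the inference procedure (`answer_distribution`) mirrors, in miniature, how parsing to a probabilistic program lets background uncertainty about the environment influence the final answer.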


Parsing to Programs: A Framework for Situated QA

This work introduces Parsing to Programs, a framework that combines ideas from parsing and probabilistic programming for situated question answering, and builds a system on it that solves pre-university Newtonian physics questions.

Question Answering as Global Reasoning Over Semantic Abstractions

This work presents the first system that reasons over a wide range of semantic abstractions of the text, which are derived using off-the-shelf, general-purpose, pre-trained natural language modules such as semantic role labelers, coreference resolvers, and dependency parsers.

Learn to Explain: Multimodal Reasoning via Thought Chains for Science Question Answering

This work designs language models to learn to generate lectures and explanations as a chain of thought (CoT) that mimics the multi-hop reasoning process when answering ScienceQA questions, and explores the upper bound of GPT-3, showing that CoT helps language models learn from less data.

Probabilistic Neural Programs

We present probabilistic neural programs, a framework for program induction that permits flexible specification of both a computational model and inference algorithm while simultaneously enabling the use of deep neural networks.

Neural Event Semantics for Grounded Language Understanding

A new conjunctivist framework, neural event semantics (NES), is proposed for compositional grounded language understanding; it offers stronger generalization capability than standard function-based compositional frameworks while improving accuracy over state-of-the-art neural methods on real-world language tasks.

StructVAE: Tree-structured Latent Variable Models for Semi-supervised Semantic Parsing

StructVAE is introduced, a variational auto-encoding model for semi-supervised semantic parsing that learns both from limited amounts of parallel data and from readily available unlabeled NL utterances, and outperforms strong supervised models.

A Review on Neural Network Question Answering Systems

This paper presents a review to summarize the state of the art in question answering systems implemented using neural networks, identifies the main research topics and considers the most relevant research challenges.

Towards Interpretation as Natural Logic Abduction

This paper shows how to incorporate Natural Logic into Interpretation as Abduction, and demonstrates through a manual example walkthrough that missing, implicit premises of arguments can be recovered by the proposed framework.

More Accurate Entity Ranking Using Knowledge Graph and Web Corpus

Over and above a competitive F1 score, AQQUCN achieves the best entity ranking accuracy on two syntax-rich and two syntax-poor public query workloads amounting to over 8,000 queries, with a 16–18% absolute improvement in mean average precision (MAP) compared to recent systems.

A Multi-World Approach to Question Answering about Real-World Scenes based on Uncertain Input

This work proposes a method for automatically answering questions about images by bringing together recent advances from natural language processing and computer vision: a multi-world approach that represents uncertainty about the perceived world in a Bayesian framework.

Semantic Parsing via Staged Query Graph Generation: Question Answering with Knowledge Base

This work proposes a novel semantic parsing framework for question answering using a knowledge base that leverages the knowledge base in an early stage to prune the search space and thus simplifies the semantic matching problem.

Learning Dependency-Based Compositional Semantics

A new semantic formalism, dependency-based compositional semantics (DCS), is developed, a log-linear distribution over DCS logical forms is defined, and the system is shown to obtain accuracies comparable to state-of-the-art systems that do require annotated logical forms.

Question Answering on Freebase via Relation Extraction and Textual Evidence

This work first presents a neural network based relation extractor to retrieve candidate answers from Freebase, and then infers over Wikipedia to validate these answers, achieving a substantial improvement over the state of the art.

Large-scale Semantic Parsing without Question-Answer Pairs

This paper introduces a novel semantic parsing approach to query Freebase in natural language without requiring manual annotations or question-answer pairs: sentences are converted to semantic graphs using CCG and subsequently grounded to Freebase, guided by denotations as a form of weak supervision.

Modeling Biological Processes for Reading Comprehension

This paper focuses on a new reading comprehension task that requires complex reasoning over a single document, and demonstrates that answering questions via predicted structures substantially improves accuracy over baselines that use shallower representations.

Learning to Compose Neural Networks for Question Answering

A question answering model that applies to both images and structured knowledge bases, using natural language strings to automatically assemble neural networks from a collection of composable modules, and achieving state-of-the-art results on benchmark datasets.

Deep Compositional Question Answering with Neural Module Networks

The approach decomposes questions into their linguistic substructures and uses these structures to dynamically instantiate modular networks (with reusable components for recognizing dogs, classifying colors, etc.), composing collections of jointly trained neural "modules" into deep networks for question answering.

Learning a Compositional Semantics for Freebase with an Open Predicate Vocabulary

An approach is presented to learning a model-theoretic semantics for natural language tied to Freebase with an open predicate vocabulary, enabling it to produce denotations for phrases such as "Republican front-runner from Texas" whose semantics cannot be represented in the Freebase schema.

Weakly Supervised Learning of Semantic Parsers for Mapping Instructions to Actions

This paper presents a grounded CCG semantic parsing approach that learns a joint model of meaning and context for interpreting and executing natural language instructions, using various types of weak supervision.