Learning an Executable Neural Semantic Parser

@article{Cheng2019LearningAE,
  title={Learning an Executable Neural Semantic Parser},
  author={Jianpeng Cheng and Siva Reddy and Vijay A. Saraswat and Mirella Lapata},
  journal={Computational Linguistics},
  year={2019},
  volume={45},
  pages={59--94}
}
This article describes a neural semantic parser that maps natural language utterances onto logical forms that can be executed against a task-specific environment, such as a knowledge base or a database, to produce a response. [...] The generation process is modeled by structured recurrent neural networks, which provide a rich encoding of the sentential context and generation history for making predictions.
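To make the execution step concrete, here is a minimal sketch of evaluating a logical form against an in-memory knowledge base. This is an illustration only, not the paper's formalism: the knowledge base contents, the operators (`and`, `type`, `rel`), and the tuple encoding of logical forms are all invented for the example.

```python
# Toy knowledge base: a unary type predicate and a binary relation.
KB = {
    "city": {"paris", "lyon", "berlin"},
    "in_country": {("paris", "france"), ("lyon", "france"), ("berlin", "germany")},
}

def execute(logical_form):
    """Evaluate a small conjunctive logical form, returning a set of entities."""
    op, *args = logical_form
    if op == "and":          # intersection of the denotations of the conjuncts
        return set.intersection(*(execute(a) for a in args))
    if op == "type":         # ("type", "city") -> all entities of that type
        return set(KB[args[0]])
    if op == "rel":          # ("rel", "in_country", "france") -> subjects of the relation
        pred, obj = args
        return {s for (s, o) in KB[pred] if o == obj}
    raise ValueError(f"unknown operator: {op}")

# A logical form for "Which cities are in France?"
denotation = execute(("and", ("type", "city"), ("rel", "in_country", "france")))
```

Executing the candidate logical form and comparing its denotation against the expected answer is also what makes the weakly-supervised (utterance-denotation) training regime discussed below possible.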
Lifecycle of neural semantic parsing
An improved neural semantic parser is presented, which produces syntactically valid logical forms following a transition system and grammar constraints, and is extended to a weakly-supervised setting within a parser-ranker framework.
Weakly-Supervised Neural Semantic Parsing with a Generative Ranker
A neural parser-ranker system for weakly-supervised semantic parsing that generates candidate tree-structured logical forms from utterances using clues from denotations, and uses a neurally encoded lexicon to inject prior domain knowledge into the model.
Value-Agnostic Conversational Semantic Parsing
This work proposes a model that abstracts over values to focus prediction on type- and function-level context, which provides a compact encoding of dialogue histories and predicted programs, improving generalization and computational efficiency.
Total Recall: a Customized Continual Learning Method for Neural Semantic Parsers
This paper investigates continual learning for semantic parsing, a setting in which a neural semantic parser learns tasks sequentially without accessing full training data from previous tasks.
Benchmarking Meaning Representations in Neural Semantic Parsing
A thorough experimental study on Unimer reveals that neural semantic parsing approaches exhibit notably different performance when trained to generate different meaning representations, and a new unified benchmark on meaning representations is proposed.
Polyglot Semantic Parsing in APIs
This paper develops a novel graph-based decoding framework that achieves state-of-the-art performance on the above datasets, and applies the method to two other benchmark semantic parsing tasks.
Improving Semantic Parsing for Task Oriented Dialog
Three improvements to the semantic parsing model are presented: contextualized embeddings, ensembling, and pairwise re-ranking based on a language model, which together give a new state-of-the-art result on the Task Oriented Parsing (TOP) dataset.
Few-Shot Semantic Parsing for New Predicates
This work proposes to apply a designated meta-learning method to train the model, regularize attention scores with alignment statistics, and apply a smoothing technique in pretraining, which consistently outperforms all baselines in both one- and two-shot settings.
SmBoP: Semi-autoregressive Bottom-up Semantic Parsing
The de facto standard decoding method for semantic parsing in recent years has been to autoregressively decode the abstract syntax tree of the target program using a top-down depth-first traversal.

References

Showing 1–10 of 94 references
Learning Dependency-Based Compositional Semantics
A new semantic formalism, dependency-based compositional semantics (DCS), is developed; a log-linear distribution over DCS logical forms is defined, and the system is shown to obtain accuracies comparable to state-of-the-art systems that do require annotated logical forms.
Weakly Supervised Training of Semantic Parsers
This work presents a method for training a semantic parser using only a knowledge base and an unlabeled text corpus, without any individually annotated sentences, and demonstrates recovery of this richer structure by extracting logical forms from natural language queries against Freebase.
Learning Structured Natural Language Representations for Semantic Parsing
We introduce a neural semantic parser that converts natural language utterances to intermediate representations in the form of predicate-argument structures, which are induced with a transition system.
Learning a Natural Language Interface with Neural Programmer
This paper presents the first weakly supervised, end-to-end neural network model to induce programs on a real-world dataset: it enhances the objective function of Neural Programmer, a neural network with built-in discrete operations, and applies it to WikiTableQuestions, a natural language question-answering dataset.
Driving Semantic Parsing from the World’s Response
This paper develops two novel learning algorithms capable of predicting complex structures that rely only on a binary feedback signal from the context of an external world, and reformulates the semantic parsing problem to reduce the model's dependency on syntactic patterns, thus allowing the parser to scale better with less supervision.
Joint Syntactic and Semantic Parsing with Combinatory Categorial Grammar
This work presents an approach to training a joint syntactic and semantic parser that combines syntactic training information from CCGbank with semantic training information from a knowledge base via distant supervision, and demonstrates that this parser produces logical forms better than both comparable prior work and a pipelined syntax-then-semantics approach.
Neural Semantic Parsing with Type Constraints for Semi-Structured Tables
A new semantic parsing model for answering compositional questions on semi-structured Wikipedia tables achieves state-of-the-art accuracy; type constraints and entity linking are shown to be valuable components to incorporate in neural semantic parsers.
Neural Semantic Parsing over Multiple Knowledge-bases
This paper finds that parsing accuracy can be substantially improved by training a single sequence-to-sequence model over multiple KBs, when an encoding of the domain is provided at decoding time.
Learning to Parse Database Queries Using Inductive Logic Programming
Experimental results with a complete database-query application for U.S. geography show that CHILL is able to learn parsers that outperform a preexisting, hand-crafted counterpart, providing direct evidence of the utility of an empirical approach at the level of a complete natural language application.
Using inductive logic programming to automate the construction of natural language parsers
Results support the claim that ILP techniques as implemented in CHILL represent a viable alternative, with significant potential advantages over neural-network, propositional, and probabilistic approaches to empirical parser construction.