Learning Structured Natural Language Representations for Semantic Parsing

@article{Cheng2017LearningSN,
  title={Learning Structured Natural Language Representations for Semantic Parsing},
  author={Jianpeng Cheng and Siva Reddy and Vijay Saraswat and Mirella Lapata},
  journal={ArXiv},
  year={2017},
  volume={abs/1704.08387}
}
We introduce a neural semantic parser that converts natural language utterances to intermediate representations in the form of predicate-argument structures, which are induced with a transition system and subsequently mapped to target domains. The semantic parser is trained end-to-end using annotated logical forms or their denotations. We obtain competitive results on various datasets. The induced predicate-argument structures shed light on the types of representations useful for semantic parsing.
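As a concrete illustration, here is a minimal sketch of such a transition system, assuming actions in the spirit of the paper's NT (open a predicate), TER (attach a terminal argument), and RED (reduce) transitions. The Node class, the example utterance, and the hard-coded action sequence are invented for exposition; the actual parser predicts each action with a neural network trained end-to-end.

# Minimal sketch of a transition system that builds a predicate-argument
# tree with three actions: NT(p) opens a predicate, TER(a) attaches an
# argument, and RED closes the current predicate. Hypothetical
# simplification; the paper's parser scores actions neurally.

class Node:
    def __init__(self, label):
        self.label = label
        self.children = []

    def __repr__(self):
        if not self.children:
            return self.label
        return f"{self.label}({', '.join(map(repr, self.children))})"

def parse(actions):
    """Replay a sequence of (action, label) pairs into a tree."""
    stack = []
    for action, label in actions:
        if action == "NT":        # open a new predicate node
            stack.append(Node(label))
        elif action == "TER":     # attach a terminal argument
            stack[-1].children.append(Node(label))
        elif action == "RED":     # close the top predicate
            node = stack.pop()
            if not stack:
                return node
            stack[-1].children.append(node)
    return stack[0] if stack else None

# e.g. "which college did Obama go to"
actions = [("NT", "answer"), ("NT", "college"), ("NT", "attended_by"),
           ("TER", "Obama"), ("RED", None), ("RED", None), ("RED", None)]
print(parse(actions))  # answer(college(attended_by(Obama)))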
Citations

Learning an Executable Neural Semantic Parser
TLDR
A neural semantic parser that maps natural language utterances onto logical forms that can be executed against a task-specific environment, such as a knowledge base or a database, to produce a response.
Discourse Representation Parsing for Sentences and Documents
TLDR
A neural model equipped with a supervised hierarchical attention mechanism and a linguistically-motivated copy strategy is presented; it outperforms competitive baselines by a wide margin and provides a general framework for parsing discourse structures of arbitrary length and granularity.
Discourse Representation Structure Parsing
TLDR
An open-domain neural semantic parser which generates formal meaning representations in the style of Discourse Representation Theory (DRT) is presented, along with a structure-aware model which decomposes the decoding process into three stages.
Coarse-to-Fine Decoding for Neural Semantic Parsing
TLDR
This work proposes a structure-aware neural architecture which decomposes the semantic parsing process into two stages, and shows that this approach consistently improves performance, achieving competitive results despite the use of relatively simple decoders.
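The two-stage decomposition can be pictured with a toy sketch. The placeholder tokens, the example sketch string, and the refine helper below are invented for exposition; in the actual model both stages are neural decoders.

# Hypothetical coarse-to-fine decoding sketch. Stage 1 would emit a
# coarse meaning sketch with placeholders (@1, @2, ...); stage 2 fills
# each placeholder conditioned on the sketch. Both stages are neural
# decoders in the real model; here they are hard-coded strings.

def refine(sketch: str, fillers: dict) -> str:
    """Replace placeholder tokens in the sketch with fine-grained details."""
    return " ".join(fillers.get(tok, tok) for tok in sketch.split())

# Stage 1 output for "list flights from ci0":
sketch = "( lambda $0 e ( and ( @1 $0 ) ( @2 $0 ci0 ) ) )"
# Stage 2 output, one detail per placeholder:
fillers = {"@1": "flight", "@2": "from"}

print(refine(sketch, fillers))
# ( lambda $0 e ( and ( flight $0 ) ( from $0 ci0 ) ) )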
Lifecycle of neural semantic parsing
TLDR
A neural semantic parser is improved to produce syntactically valid logical forms following a transition system and grammar constraints, and is extended to a weakly-supervised setting within a parser-ranker framework.
Building a Neural Semantic Parser from a Domain Ontology
TLDR
This work crowdsources training data on six domains, covering both single-turn utterances which exhibit rich compositionality and sequential utterances where a complex task is procedurally performed in steps, and develops neural semantic parsers which perform such compositional tasks.
Dependency-based Hybrid Trees for Semantic Parsing
TLDR
This work proposes a novel dependency-based hybrid tree model for semantic parsing, which converts natural language utterances into machine-interpretable meaning representations; it integrates a neural component into the model and proposes an efficient dynamic-programming algorithm to perform tractable inference.
Weakly-Supervised Neural Semantic Parsing with a Generative Ranker
Weakly-supervised semantic parsers are trained on utterance-denotation pairs, treating logical forms as latent. The task is challenging due to the large search space and spuriousness of logical forms.
TLDR
A neural parser-ranker system for weakly-supervised semantic parsing that generates candidate tree-structured logical forms from utterances using clues of denotations and uses a neurally encoded lexicon to inject prior domain knowledge into the model.
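A rough sketch of the generate-then-rank loop this describes: the parser_ranker function, the toy executor, and the stub scoring function are hypothetical stand-ins, since the paper's ranker is a neural generative model.

# Hypothetical parser-ranker loop for weakly-supervised semantic parsing:
# candidate logical forms are executed against the environment, and among
# those whose denotation matches the answer, the ranker's highest-scoring
# candidate is returned.

def parser_ranker(utterance, candidates, execute, gold_denotation, score):
    """Pick the best candidate logical form consistent with the denotation."""
    consistent = [lf for lf in candidates if execute(lf) == gold_denotation]
    if not consistent:
        return None  # no candidate executes to the gold answer
    return max(consistent, key=lambda lf: score(utterance, lf))

# Toy usage: logical forms are Python expressions over a small list.
# Note that "sum([3, 4])" is spurious (right denotation, wrong meaning);
# the ranker is what filters such candidates out.
candidates = ["max([3, 7, 5])", "min([3, 7, 5])", "sum([3, 4])"]
best = parser_ranker(
    "what is the largest value",
    candidates,
    execute=eval,                              # toy executor
    gold_denotation=7,                         # the annotated answer
    score=lambda utt, lf: float("max" in lf),  # stub ranker score
)
print(best)  # max([3, 7, 5])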
Exploring Neural Models for Parsing Natural Language into First-Order Logic
TLDR
This work models FOL parsing as a sequence-to-sequence mapping task in which a natural language sentence is encoded into an intermediate representation using an LSTM, followed by a decoder which sequentially generates the predicates in the corresponding FOL formula.

References

Showing 1-10 of 66 references.
Language to Logical Form with Neural Attention
TLDR
This paper presents a general method based on an attention-enhanced encoder-decoder model that encodes input utterances into vector representations and generates their logical forms by conditioning the output sequences or trees on the encoding vectors.
Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars
TLDR
A learning algorithm is described that takes as input a training set of sentences labeled with expressions in the lambda calculus and induces a grammar for the problem, along with a log-linear model that represents a distribution over syntactic and semantic analyses conditioned on the input sentence.
Evaluating Induced CCG Parsers on Grounded Semantic Parsing
We compare the effectiveness of four different syntactic CCG parsers for a semantic slot-filling task to explore how much syntactic supervision is required for downstream semantic analysis.
Learning for Semantic Parsing with Statistical Machine Translation
TLDR
It is shown that WASP performs favorably in terms of both accuracy and coverage compared to existing learning methods requiring a similar amount of supervision, and shows better robustness to variations in task complexity and word order.
Weakly Supervised Training of Semantic Parsers
TLDR
This work presents a method for training a semantic parser using only a knowledge base and an unlabeled text corpus, without any individually annotated sentences, and demonstrates recovery of this richer structure by extracting logical forms from natural language queries against Freebase.
Lexical Generalization in CCG Grammar Induction for Semantic Parsing
TLDR
An algorithm for learning factored CCG lexicons, along with a probabilistic parse-selection model, is presented; the lexicons include both lexemes to model word meaning and templates to model systematic variation in word usage.
Large-scale Semantic Parsing without Question-Answer Pairs
TLDR
This paper introduces a novel semantic parsing approach to query Freebase in natural language without requiring manual annotations or question-answer pairs and converts sentences to semantic graphs using CCG and subsequently grounds them to Freebase guided by denotations as a form of weak supervision.
Learning Dependency-Based Compositional Semantics
TLDR
A new semantic formalism, dependency-based compositional semantics (DCS), is developed; a log-linear distribution over DCS logical forms is defined, and the system is shown to obtain accuracies comparable to state-of-the-art systems that do require annotated logical forms.
Semantic Parsing with Semi-Supervised Sequential Autoencoders
TLDR
This work presents a novel semi-supervised approach for sequence transduction and applies it to semantic parsing tasks, focusing on domains with limited access to labelled training data and extending those datasets with synthetically generated logical forms.
Semantic Parsing via Paraphrasing
TLDR
This paper presents two simple paraphrase models, an association model and a vector space model, and trains them jointly from question-answer pairs, improving state-of-the-art accuracies on two recently released question-answering datasets.