Decoupling Structure and Lexicon for Zero-Shot Semantic Parsing

@inproceedings{Herzig2018DecouplingSA,
  title={Decoupling Structure and Lexicon for Zero-Shot Semantic Parsing},
  author={Jonathan Herzig and Jonathan Berant},
  booktitle={EMNLP},
  year={2018}
}
Building a semantic parser quickly in a new domain is a fundamental challenge for conversational interfaces, as current semantic parsers require expensive supervision and lack the ability to generalize to new domains. [...] Key Method: First, we map an utterance to an abstract, domain-independent logical form that represents the structure of the logical form but contains slots instead of KB constants. Then, we replace slots with KB constants via lexical alignment scores and global inference. Our model reaches…
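The two-stage method described above decouples structure from lexicon. A minimal sketch of that idea in Python (all names here, such as parse, structure_model, and align_score, are hypothetical illustrations, not the authors' code):

# Hypothetical sketch of the two-stage parse; not the authors' implementation.

def parse(utterance, kb_constants, structure_model, align_score):
    # Stage 1: predict a domain-independent abstract logical form whose
    # structure is fixed but whose KB constants are typed slots, e.g.
    # "answer(filter(find(ENTITY), PROPERTY, VALUE))".
    abstract_lf, slots = structure_model(utterance)

    # Stage 2: fill each slot with the KB constant whose lexical alignment
    # to the utterance scores highest. (The paper additionally applies
    # global inference over all slots jointly; omitted in this sketch.)
    assignment = {}
    for slot in slots:
        candidates = [c for c in kb_constants if c.type == slot.type]
        assignment[slot] = max(candidates,
                               key=lambda c: align_score(utterance, c))
    return abstract_lf, assignment  # caller substitutes constants into slots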
Zero-Shot Semantic Parsing for Instructions
TLDR
This work introduces a new training algorithm that trains a semantic parser on examples from a set of source domains so that it can effectively parse instructions from an unknown target domain. The approach integrates into the floating parser of Pasupat and Liang (2015) and further augments the parser with features and logical-form candidate filtering to support zero-shot adaptation.
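A hedged sketch of what such a zero-shot training regime can look like, holding out the target domain entirely (the parser interface and example format below are assumptions, not the paper's API):

# Illustrative zero-shot training loop: train on source domains only and
# evaluate on an unseen target domain. All interfaces are hypothetical.

def train_zero_shot(parser, domains, target_name, epochs=10):
    source = [d for d in domains if d.name != target_name]
    target = next(d for d in domains if d.name == target_name)
    for _ in range(epochs):
        for domain in source:
            for utterance, logical_form in domain.examples:
                parser.train_step(utterance, logical_form)
    # The parser never observes target-domain examples before this point.
    correct = sum(parser.predict(u) == lf for u, lf in target.examples)
    return correct / len(target.examples)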
Few-Shot Semantic Parsing for New Predicates
TLDR
This work proposes a designated meta-learning method to train the model, regularizes attention scores with alignment statistics, and applies a smoothing technique in pretraining; the resulting model consistently outperforms all baselines in both one- and two-shot settings.
Practical Semantic Parsing for Spoken Language Understanding
TLDR
A transfer learning framework for executable semantic parsing is built and it is shown that the framework is effective for Question Answering (Q&A) as well as for Spoken Language Understanding (SLU).
Learning to Map Frequent Phrases to Sub-Structures of Meaning Representation for Neural Semantic Parsing
TLDR
This paper proposes that the vocabulary-mismatch problem can be effectively resolved by leveraging appropriate logical tokens, and exploits macro actions, which have the same granularity as words/phrases and allow the model to learn mappings from frequent phrases to the corresponding sub-structures of the meaning representation.
Zero-Shot Cross-lingual Semantic Parsing
TLDR
This work proposes a multi-task encoder-decoder model to transfer parsing knowledge to additional languages using only paired English utterance-logical form data and unlabeled, monolingual utterances in each test language.
Look-up and Adapt: A One-shot Semantic Parser
TLDR
A semantic parser is presented that generalizes to out-of-domain examples by learning a general strategy for parsing an unseen utterance: adapting the logical forms of seen utterances instead of generating a logical form from scratch.
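The retrieve-then-edit idea can be sketched as below, where similarity and adapt stand in for the learned components (all names are illustrative, not the authors' code):

# Illustrative look-up-and-adapt parse: retrieve the most similar seen
# utterance, then edit its logical form rather than generating from scratch.

def one_shot_parse(utterance, memory, similarity, adapt):
    # memory: list of (seen_utterance, logical_form) pairs
    seen_utt, seen_lf = max(memory,
                            key=lambda pair: similarity(utterance, pair[0]))
    # adapt() swaps predicates/entities in the retrieved form for ones
    # grounded in the new utterance; this is the learned part.
    return adapt(seen_lf, source=seen_utt, target=utterance)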
Localizing Q&A Semantic Parsers for Any Language in a Day
TLDR
The proposed Semantic Parser Localizer (SPL), a toolkit that leverages Neural Machine Translation (NMT) systems to localize a semantic parser for a new language, enables any software developer to add a new-language capability to any QA system for a new domain in less than 24 hours.
StructVAE: Tree-structured Latent Variable Models for Semi-supervised Semantic Parsing
TLDR
StructVAE is introduced, a variational auto-encoding model for semi-supervised semantic parsing, which learns both from limited amounts of parallel data, and readily-available unlabeled NL utterances, and outperforms strong supervised models.
Lifecycle of neural semantic parsing
TLDR
A neural semantic parser is improved so that it produces syntactically valid logical forms by following a transition system and grammar constraints, and is extended to a weakly supervised setting within a parser-ranker framework.
Localizing Open-Ontology QA Semantic Parsers in a Day Using Machine Translation
TLDR
This work proposes Semantic Parser Localizer (SPL), a toolkit that leverages Neural Machine Translation (NMT) systems to localize a semantic parser for a new language by augmenting machine-translated datasets with local entities scraped from public websites.

References

Showing 1-10 of 36 references
Neural Semantic Parsing over Multiple Knowledge-bases
TLDR
This paper finds that it can substantially improve parsing accuracy by training a single sequence-to-sequence model over multiple KBs, when providing an encoding of the domain at decoding time.
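One common way to realize "an encoding of the domain" for a single sequence-to-sequence model is a domain token prepended to the input; a minimal sketch, not necessarily the paper's exact scheme:

# Prepend a domain token so one sequence-to-sequence model can serve
# several KBs; a hedged sketch with illustrative token format.

def encode_with_domain(tokens, domain):
    return ["<domain:{}>".format(domain)] + tokens

print(encode_with_domain(["largest", "city", "in", "texas"], "geography"))
# ['<domain:geography>', 'largest', 'city', 'in', 'texas']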
Towards Zero-Shot Frame Semantic Parsing for Domain Scaling
TLDR
This paper proposes a deep learning based approach that can utilize only the slot description in context without the need for any labeled or unlabeled in-domain examples, to quickly bootstrap a new domain.
Cross-domain Semantic Parsing via Paraphrasing
TLDR
By converting logical forms into canonical utterances in natural language, semantic parsing is reduced to paraphrasing, and an attentive sequence-to-sequence paraphrase model is developed that is general and flexible enough to adapt to different domains.
Building a Semantic Parser Overnight
TLDR
A new methodology is introduced that uses a simple grammar to generate logical forms paired with canonical utterances that are meant to cover the desired set of compositional operators, and uses crowdsourcing to paraphrase these canonical utterances into natural utterances.
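A toy version of that "overnight" recipe, in which a tiny grammar emits (logical form, canonical utterance) pairs; the grammar below is illustrative, not the paper's:

# Toy synchronous grammar: each rule yields a logical form together with
# a canonical utterance that verbalizes it. Lexicon entries are made up.

PROPERTIES = {"publication_date": "publication date", "author": "author"}
ENTITIES = {"article1": "article 1"}

def generate_pairs():
    for prop, prop_text in PROPERTIES.items():
        for ent, ent_text in ENTITIES.items():
            yield ("get({}, {})".format(prop, ent),
                   "{} of {}".format(prop_text, ent_text))

# Each canonical utterance, e.g. "publication date of article 1", is then
# crowdsourced into natural paraphrases such as "when was article 1 published?".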
Semantic Parsing via Paraphrasing
TLDR
This paper presents two simple paraphrase models, an association model and a vector space model, and trains them jointly from question-answer pairs, improving state-of-the-art accuracies on two recently released question-answering datasets.
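A hedged sketch of scoring a (question, canonical utterance) pair by combining an association score with vector-space cosine similarity; the embedding function, association model, and weight are assumptions for illustration:

import numpy as np

# Combine an association-model score with vector-space cosine similarity
# to rank paraphrase candidates; components and the weight w are illustrative.

def paraphrase_score(question, canonical, assoc_score, embed, w=1.0):
    q, c = embed(question), embed(canonical)
    cosine = float(q @ c / (np.linalg.norm(q) * np.linalg.norm(c)))
    return assoc_score(question, canonical) + w * cosine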
Zero-shot semantic parser for spoken language understanding
TLDR
This work presents a novel zero-shot learning method, based on word embeddings, that allows a full semantic parser for spoken language understanding to be derived, and shows that this model can instantly reach performance comparable to that obtained by state-of-the-art carefully handcrafted rule-based or trained statistical models for extraction of dialog acts on the Dialog State Tracking test datasets.
Coarse-to-Fine Decoding for Neural Semantic Parsing
TLDR
This work proposes a structure-aware neural architecture which decomposes the semantic parsing process into two stages, and shows that this approach consistently improves performance, achieving competitive results despite the use of relatively simple decoders.
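A hedged sketch of that two-stage, coarse-to-fine decomposition: one decoder emits a meaning sketch with placeholders, a second fills in details conditioned on both the utterance encoding and the sketch (interfaces are hypothetical):

# Illustrative coarse-to-fine decoding; all callables are hypothetical.

def coarse_to_fine(utterance, encoder, sketch_decoder, detail_decoder):
    hidden = encoder(utterance)
    sketch = sketch_decoder(hidden)          # e.g. "answer(filter(@, @))"
    return detail_decoder(hidden, sketch)    # fills each "@" placeholder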
Driving Semantic Parsing from the World’s Response
TLDR
This paper develops two novel learning algorithms that predict complex structures while relying only on a binary feedback signal from the context of an external world, and reformulates the semantic parsing problem to reduce the model's dependency on syntactic patterns, thus allowing the parser to scale better using less supervision.
Transfer Learning for Neural Semantic Parsing
TLDR
This paper proposes using sequence-to-sequence models in a multi-task setup for semantic parsing, with a focus on transfer learning, and shows that the multi-task setup aids transfer from an auxiliary task with large labeled data to the target task with smaller labeled data.
Weakly Supervised Semantic Parsing with Abstract Examples
TLDR
This work proposes that, in closed worlds with clear semantic types, one can substantially alleviate supervision problems by utilizing an abstract representation in which tokens in both the language utterance and the program are lifted to an abstract form; this results in sharing between different examples that alleviates the difficulties of training.
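Lifting tokens to abstract types can be illustrated as below, so that distinct surface forms share one abstract example; the lexicon here is hypothetical:

# Illustrative abstraction of utterance tokens to semantic types, so
# "yellow circle" and "blue square" map to the same abstract example.

LEXICON = {"yellow": "COLOR", "blue": "COLOR",
           "circle": "SHAPE", "square": "SHAPE"}

def abstract(tokens):
    return [LEXICON.get(t, t) for t in tokens]

print(abstract(["there", "is", "a", "yellow", "circle"]))
# ['there', 'is', 'a', 'COLOR', 'SHAPE']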