Cross-domain Semantic Parsing via Paraphrasing

@inproceedings{Su2017CrossdomainSP,
  title={Cross-domain Semantic Parsing via Paraphrasing},
  author={Yu Su and Xifeng Yan},
  booktitle={EMNLP},
  year={2017}
}
Existing studies on semantic parsing mainly focus on the in-domain setting. We formulate cross-domain semantic parsing as a domain adaptation problem: train a semantic parser on some source domains and then adapt it to the target domain. Due to the diversity of logical forms in different domains, this problem presents unique and intriguing challenges. By converting logical forms into canonical utterances in natural language, we reduce semantic parsing to paraphrasing, and develop an attentive…
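The abstract's reduction is easy to see in miniature. Below is a toy Python sketch, with an invented candidate set and plain token overlap standing in for the paper's attentive neural paraphrase scorer; none of the names or data come from the paper.

```python
# Parsing-as-paraphrasing in miniature. Each candidate logical form carries a
# canonical utterance; the input is parsed by picking the candidate whose
# canonical utterance it best paraphrases. Token overlap is a toy stand-in
# for the paper's attentive neural paraphrase scorer.

def similarity(utterance: str, canonical: str) -> float:
    """Toy paraphrase score: Jaccard overlap of token sets."""
    a, b = set(utterance.lower().split()), set(canonical.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

# Invented candidates: (logical form, canonical utterance).
CANDIDATES = [
    ("argmax(article, citations)", "article that has the largest number of citations"),
    ("filter(article, year=2015)", "article whose publication year is 2015"),
]

def parse(utterance: str) -> str:
    """Return the logical form whose canonical utterance best matches the input."""
    return max(CANDIDATES, key=lambda c: similarity(utterance, c[1]))[0]

print(parse("which article has the most citations"))
# -> argmax(article, citations)
```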
Decoupling Structure and Lexicon for Zero-Shot Semantic Parsing
TLDR
This paper introduces a zero-shot approach to semantic parsing that can parse utterances in unseen domains while only being trained on examples in other source domains.
Domain Adaptation for Semantic Parsing
TLDR
This paper proposes a novel semantic parser for domain adaptation, where the target domain has far less annotated data than the source domain; the parser benefits from a two-stage coarse-to-fine framework.
Practical Semantic Parsing for Spoken Language Understanding
TLDR
A transfer learning framework for executable semantic parsing is built and it is shown that the framework is effective for Question Answering (Q&A) as well as for Spoken Language Understanding (SLU).
Weakly Supervised Multi-task Learning for Semantic Parsing
TLDR
A weakly supervised learning method is proposed to enhance the authors' multi-task learning model with paraphrase data, based on the idea that paraphrased questions should have the same logical form and question type information.
A Neural Semantic Parser for Math Problems Incorporating Multi-Sentence Information
TLDR
Experimental results show that the proposed two-encoder architecture and word-level selective mechanism bring significant improvements and achieve better performance than state-of-the-art methods.
Improving Semantic Parsing with Neural Generator-Reranker Architecture
TLDR
A generator-reranker architecture for semantic parsing is proposed, consisting of a pre-processing step for the candidates followed by a novel critic network that reranks the candidates based on the similarity between each candidate and the input sentence.
Semantic Parsing with Dual Learning
TLDR
This work develops a semantic parsing framework with the dual learning algorithm, which enables a semantic parser to make full use of data through a dual-learning game.
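As a rough illustration of the dual-learning game, the sketch below wires a parser and a generator into a combined reward: validity of the parse plus reconstruction of the input. The stub models and the 0.5/0.5 weighting are invented, not the paper's.

```python
# Schematic dual-learning signal (illustrative, not the paper's code): a
# parser (text -> logical form) and a generator (logical form -> text)
# reward each other. The stub models just memorize one pair; real models
# would be neural sequence models updated by policy gradient.

class StubModel:
    def __init__(self, mapping):
        self.mapping = mapping                      # toy "parameters"

    def sample(self, x):
        return self.mapping.get(x, "NULL")

    def logprob(self, y, given):
        return 0.0 if self.mapping.get(given) == y else -10.0

def dual_reward(parser, generator, utterance, is_valid):
    lf = parser.sample(utterance)                   # primal step: parse
    validity = float(is_valid(lf))                  # reward 1: well-formed output
    recon = generator.logprob(utterance, given=lf)  # reward 2: reconstruct input
    return 0.5 * validity + 0.5 * recon             # combined learning signal

parser = StubModel({"list all rivers": "answer(river(all))"})
generator = StubModel({"answer(river(all))": "list all rivers"})
print(dual_reward(parser, generator, "list all rivers",
                  is_valid=lambda lf: lf.startswith("answer(")))   # -> 0.5
```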
Graph Enhanced Cross-Domain Text-to-SQL Generation
TLDR
This paper improves upon a state-of-the-art Spider model, SyntaxSQLNet, by constructing a graph of column names for all databases and using graph neural networks to compute their embeddings, which yield better cross-domain representations and more accurate SQL queries.
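The column-graph idea can be sketched in a few lines: connect column names that share a table (or a foreign key) and let each column's embedding absorb its neighbors'. The toy schema and single mean-aggregation step below are illustrative, not SyntaxSQLNet's actual architecture.

```python
# Toy column-name graph with one round of message passing: each column's new
# embedding is the mean of its own vector and its neighbors'. The schema,
# vectors, and single propagation step are illustrative only.

columns = {
    "student.id":   [1.0, 0.0, 0.0],
    "student.name": [0.0, 1.0, 0.0],
    "grade.score":  [0.0, 0.0, 1.0],
}
edges = {("student.id", "student.name")}   # same-table edge; foreign keys would add more

def neighbors(col):
    return [b for a, b in edges if a == col] + [a for a, b in edges if b == col]

def gnn_round(embs):
    """One propagation step over the column graph."""
    out = {}
    for col, vec in embs.items():
        msgs = [embs[n] for n in neighbors(col)] + [vec]
        out[col] = [sum(dims) / len(msgs) for dims in zip(*msgs)]
    return out

print(gnn_round(columns)["student.id"])    # -> [0.5, 0.5, 0.0]
```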
TaPas: Weakly Supervised Table Parsing via Pre-training
TLDR
TaPas is presented, an approach to question answering over tables without generating logical forms; it outperforms or rivals semantic parsing models, improving state-of-the-art accuracy on SQA and performing on par with the state of the art on WikiSQL and WikiTQ, with a simpler model architecture.
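A toy rendering of the "no logical forms" idea: score table cells directly and read the answer off the best cell, with no intermediate query. The table and scores below are invented; TaPas derives cell scores from a BERT-style encoder over the question and the flattened table.

```python
# Toy rendering of answering over a table without logical forms: score each
# cell and select the best one. Scores are invented, not model outputs.

table = [["city", "population"],
         ["springfield", "170000"],
         ["shelbyville", "65000"]]

# question: "what is the population of springfield?"
cell_scores = {(1, 1): 0.9, (2, 1): 0.2, (1, 0): 0.1, (2, 0): 0.1}

best_row, best_col = max(cell_scores, key=cell_scores.get)
print(table[best_row][best_col])   # -> 170000
```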
Zero-Shot Semantic Parsing for Instructions
TLDR
This work introduces a new training algorithm that trains a semantic parser on examples from a set of source domains so that it can effectively parse instructions from an unknown target domain. The algorithm is integrated into the floating parser of Pasupat and Liang (2015), which is further augmented with features and logical-form candidate filtering to support zero-shot adaptation.

References

Showing 1-10 of 55 references
Paraphrase Generation from Latent-Variable PCFGs for Semantic Parsing
TLDR
A novel grammar model for paraphrase generation is introduced that does not require any sentence-aligned paraphrase corpus and is able to leverage the flexibility and scalability of latent-variable probabilistic context-free grammars to sample paraphrases.
Neural Semantic Parsing over Multiple Knowledge-bases
TLDR
This paper finds that it can substantially improve parsing accuracy by training a single sequence-to-sequence model over multiple KBs, when providing an encoding of the domain at decoding time.
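One common way to condition a single shared model on the domain, assumed here purely for illustration rather than taken from the paper, is an artificial domain token spliced into the source sequence.

```python
# Minimal sketch of conditioning one shared model on the domain: prepend an
# artificial domain token to the source sequence. (The paper encodes the
# domain at decoding time; a source-side token is an assumed simplification.)

def with_domain_token(utterance: str, domain: str) -> str:
    """Prefix the utterance with a marker naming its knowledge base."""
    return f"<{domain}> {utterance}"

print(with_domain_token("how many rivers are in texas", "geo"))
# -> <geo> how many rivers are in texas
```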
Semantic Parsing via Paraphrasing
TLDR
This paper presents two simple paraphrase models, an association model and a vector space model, and trains them jointly from question-answer pairs, improving state-of-the-art accuracies on two recently released question-answering datasets.
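A vector space paraphrase model of the kind this TLDR mentions can be caricatured as cosine similarity between averaged word vectors. The 2-d embeddings below are made up for illustration.

```python
import math

# Toy vector-space paraphrase model: represent each utterance as the average
# of its word vectors and score a candidate by cosine similarity. The tiny
# 2-d "embeddings" are invented.

EMB = {"biggest": [1.0, 0.0], "largest": [0.9, 0.1], "city": [0.0, 1.0]}

def sent_vec(tokens):
    vs = [EMB.get(t, [0.0, 0.0]) for t in tokens]
    return [sum(dim) / len(vs) for dim in zip(*vs)]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

print(round(cosine(sent_vec(["biggest", "city"]),
                   sent_vec(["largest", "city"])), 3))   # -> 0.995
```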
Semantic Parsing on Freebase from Question-Answer Pairs
TLDR
This paper trains a semantic parser that scales up to Freebase and outperforms the state-of-the-art parser on the dataset of Cai and Yates (2013), despite not having annotated logical forms.
Building a Semantic Parser Overnight
TLDR
A new methodology is introduced that uses a simple grammar to generate logical forms paired with canonical utterances that are meant to cover the desired set of compositional operators, and uses crowdsourcing to paraphrase these canonical utterances into natural utterances.
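The "overnight" recipe is concrete enough to sketch: a tiny grammar emits (logical form, canonical utterance) pairs, and crowd workers would then paraphrase the stilted canonical side into natural language. The toy domain below is invented.

```python
import itertools

# A toy synchronous grammar that emits (logical form, canonical utterance)
# pairs; crowdsourcing would paraphrase the right-hand side. Domain invented.

RELATIONS = [("author", "author of"), ("venue", "venue of")]
ENTITIES = [("paper1", "paper one"), ("paper2", "paper two")]

def generate():
    for (rel_lf, rel_txt), (ent_lf, ent_txt) in itertools.product(RELATIONS, ENTITIES):
        yield f"get({rel_lf}, {ent_lf})", f"{rel_txt} {ent_txt}"

for lf, canonical in generate():
    print(lf, "<->", canonical)
# get(author, paper1) <-> author of paper one   (and three more pairs)
```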
Data Recombination for Neural Semantic Parsing
TLDR
Data recombination improves the accuracy of the RNN model on three semantic parsing datasets, leading to new state-of-the-art performance on the standard GeoQuery dataset for models with comparable supervision.
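Data recombination can be illustrated in its simplest form: abstract an entity out of an existing example and substitute other entities of the same type. The seed is a GeoQuery-style pair; the real method induces a synchronous grammar rather than doing string replacement.

```python
# Toy data recombination: abstract the entity out of a seed example, then
# substitute same-type entities to mint new (utterance, logical form) pairs.

seed = ("what states border texas", "answer(state(borders(texas)))")
state_names = ["texas", "ohio", "utah"]

def recombine(pair, old, replacements):
    utt, lf = pair
    for new in replacements:
        if new != old:
            yield utt.replace(old, new), lf.replace(old, new)

for utt, lf in recombine(seed, "texas", state_names):
    print(utt, "|", lf)
# what states border ohio | answer(state(borders(ohio)))   (and utah likewise)
```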
Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection
TLDR
This work introduces a method for paraphrase detection based on recursive autoencoders (RAE): unsupervised RAEs with a novel unfolding objective learn feature vectors for phrases in syntactic trees, which are used to measure word- and phrase-wise similarity between two sentences.
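Dynamic pooling, the part of this method that handles variable-length sentence pairs, reduces an n x m word-pair similarity matrix to a fixed grid. Below is a minimal sketch with invented similarities and 2x2 min pooling; the paper pools to a larger grid.

```python
# Toy dynamic pooling: a variable-size word-pair similarity matrix is reduced
# to a fixed out x out grid via min pooling over rectangular regions, so a
# classifier can consume sentence pairs of any length. Values are invented.

def dynamic_pool(matrix, out=2):
    """Min-pool a variable-size matrix down to a fixed out x out grid."""
    rows, cols = len(matrix), len(matrix[0])

    def spans(n):
        edges = [round(i * n / out) for i in range(out + 1)]
        return [(edges[i], max(edges[i] + 1, edges[i + 1])) for i in range(out)]

    return [[min(matrix[r][c] for r in range(r0, r1) for c in range(c0, c1))
             for (c0, c1) in spans(cols)]
            for (r0, r1) in spans(rows)]

sim = [[0.9, 0.1, 0.2],
       [0.3, 0.8, 0.4],
       [0.2, 0.5, 0.7]]
print(dynamic_pool(sim))   # -> [[0.1, 0.2], [0.2, 0.7]]
```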
Sequence-based Structured Prediction for Semantic Parsing
TLDR
An approach for semantic parsing is presented that uses a recurrent neural network to map a natural language question into a logical form representation of a KB query, showing how grammatical constraints on the derivation sequence can easily be integrated inside the RNN-based sequential predictor.
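Integrating grammatical constraints into a sequential predictor usually amounts to masking illegal next symbols before the argmax. A toy version follows, with an invented three-rule "grammar" in place of a real derivation grammar.

```python
# Illustrative grammar-constrained prediction: at each step, symbols the
# grammar forbids are excluded before picking the best-scoring next symbol.
# The tiny "grammar" and scores are invented; the real model is an RNN.

ALLOWED_AFTER = {
    "<s>": {"answer"},
    "answer": {"(", "</s>"},
    "(": {"city", "river"},
}

def constrained_argmax(prev, scores):
    """Pick the highest-scoring next symbol that the grammar allows."""
    legal = ALLOWED_AFTER.get(prev, set())
    return max((s for s in scores if s in legal), key=scores.get)

scores = {"answer": 0.1, "(": 0.7, "city": 0.9, "</s>": 0.2}
print(constrained_argmax("answer", scores))   # -> "(" ("city" is illegal here)
```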
Language to Logical Form with Neural Attention
TLDR
This paper presents a general method based on an attention-enhanced encoder-decoder model that encodes input utterances into vector representations and generates their logical forms by conditioning the output sequences or trees on the encoding vectors.
Semantic Parsing Freebase: Towards Open-domain Semantic Parsing
TLDR
This paper introduces FreeParser, a system that trains on one domain and one set of predicate and constant symbols, and then can parse sentences for any new domain, including sentences that refer to symbols never seen during training.