Transfer Learning for Neural Semantic Parsing

@inproceedings{Fan2017TransferLF,
  title={Transfer Learning for Neural Semantic Parsing},
  author={Xing Fan and Emilio Monti and Lambert Mathias and Markus Dreyer},
  booktitle={Rep4NLP@ACL},
  year={2017}
}
The goal of semantic parsing is to map natural language to a machine-interpretable meaning representation language (MRL). One of the constraints limiting full exploration of deep learning techniques for semantic parsing is the lack of sufficient annotated training data. In this paper, we propose using sequence-to-sequence models in a multi-task setup for semantic parsing, with a focus on transfer learning. We explore three multi-task architectures for sequence-to-sequence modeling and compare their…
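
The three multi-task setups differ in how much of the network is shared across tasks. As a rough illustration of one of them, a shared encoder with task-specific decoders, here is a minimal PyTorch sketch; this is not the authors' code, the class and parameter names are illustrative, and a single embedding table shared by source and target vocabularies is assumed:

import torch
import torch.nn as nn

class MultiTaskSeq2Seq(nn.Module):
    # Shared-encoder variant: one encoder serves every task, while each
    # task owns a private decoder and output projection.
    def __init__(self, vocab_size, hidden_size, num_tasks):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.encoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)  # shared
        self.decoders = nn.ModuleList(
            [nn.LSTM(hidden_size, hidden_size, batch_first=True) for _ in range(num_tasks)])
        self.outputs = nn.ModuleList(
            [nn.Linear(hidden_size, vocab_size) for _ in range(num_tasks)])

    def forward(self, src, tgt, task_id):
        _, state = self.encoder(self.embed(src))                # encode once
        dec_out, _ = self.decoders[task_id](self.embed(tgt), state)
        return self.outputs[task_id](dec_out)                   # per-task logits

model = MultiTaskSeq2Seq(vocab_size=1000, hidden_size=256, num_tasks=2)
src = torch.randint(0, 1000, (4, 12))   # a batch of source token ids
tgt = torch.randint(0, 1000, (4, 9))    # the matching target prefix
logits = model(src, tgt, task_id=0)     # shape (4, 9, 1000)

During training, batches from the different tasks are interleaved and routed to the matching decoder via task_id, so gradients from every task update the shared encoder.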

Citations

Weakly Supervised Multi-task Learning for Semantic Parsing

A weakly supervised learning method is proposed to enhance the authors' multi-task learning model with paraphrase data, based on the idea that paraphrased questions should share the same logical form and question-type information.

One Semantic Parser to Parse Them All: Sequence to Sequence Multi-Task Learning on Semantic Parsing Datasets

This work investigates Multi-Task Learning (MTL) architectures and finds that an MTL architecture sharing the entire network across datasets yields parsing accuracies competitive with or better than the single-task baselines, while reducing the total number of parameters by 68%.

Practical Semantic Parsing for Spoken Language Understanding

A transfer learning framework for executable semantic parsing is built and it is shown that the framework is effective for Question Answering (Q&A) as well as for Spoken Language Understanding (SLU).

Lifecycle of neural semantic parsing

A neural semantic parser is improved to produce syntactically valid logical forms by following a transition system and grammar constraints, and is extended to a weakly supervised setting within a parser-ranker framework.

A Novel Multi-Task Learning Framework for Semi-Supervised Semantic Parsing

This work proposes a semi-supervised semantic parsing method that exploits unlabeled natural utterances in a novel multi-task learning framework, taking entity sequences as training targets to improve the encoder's representations and reduce entity mistakes in prediction.

Coarse-to-Fine Decoding for Neural Semantic Parsing

This work proposes a structure-aware neural architecture which decomposes the semantic parsing process into two stages, and shows that this approach consistently improves performance, achieving competitive results despite the use of relatively simple decoders.
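
To make the two-stage decomposition concrete, here is a hedged control-flow sketch; coarse_model and fine_model are hypothetical objects with a generate method, standing in for the two decoders rather than reproducing the paper's actual API:

def coarse_to_fine_parse(utterance, coarse_model, fine_model):
    # Stage 1: decode a coarse "sketch" of the meaning that abstracts
    # away low-level details such as arguments and variable names.
    sketch = coarse_model.generate(utterance)
    # Stage 2: decode the full logical form, conditioned on both the
    # utterance and the predicted sketch, filling in the details.
    return fine_model.generate(utterance, sketch)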

Low-Resource Domain Adaptation for Compositional Task-Oriented Semantic Parsing

This work identifies two fundamental factors for low-resource domain adaptation: better representation learning and better training techniques, and proposes a novel method that outperforms a supervised neural model at a 10-fold data reduction.

Statistical learning for semantic parsing: A survey

This paper reviews recent algorithms for semantic parsing, covering both conventional machine learning and deep learning approaches, gives an overview of a semantic parsing system, and summarizes a general statistical-learning recipe for semantic parsing.

Translate & Fill: Improving Zero-Shot Multilingual Semantic Parsing with Synthetic Data

Experimental results on three multilingual semantic parsing datasets show that data augmentation with TaF reaches accuracies competitive with similar systems which rely on traditional alignment techniques.

Multitask Parsing Across Semantic Representations

It is shown that multitask learning significantly improves UCCA parsing in both in-domain and out-of-domain settings, experimenting with a uniform transition-based system and learning architecture for all parsing tasks.
...

References

Showing 1-10 of 33 references

Neural Semantic Parsing over Multiple Knowledge-bases

This paper finds that it can substantially improve parsing accuracy by training a single sequence-to-sequence model over multiple KBs, when providing an encoding of the domain at decoding time.
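
One simple way to provide such a domain signal, shown here as a token prepended to the input (the paper itself compares richer domain encodings, so this is only an illustrative variant):

def add_domain_token(question, domain):
    # A dedicated domain symbol tells the single shared model which
    # knowledge base's logical language to produce.
    return ["<{}>".format(domain)] + question.split()

print(add_domain_token("what rivers run through texas", "geo"))
# ['<geo>', 'what', 'rivers', 'run', 'through', 'texas']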

Data Recombination for Neural Semantic Parsing

Data recombination improves the accuracy of the RNN model on three semantic parsing datasets, leading to new state-of-the-art performance on the standard GeoQuery dataset for models with comparable supervision.
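
The core trick behind data recombination is to abstract entity mentions and re-fill them with entities from other training examples, synthesizing new (utterance, logical form) pairs. Below is a toy Python version with made-up examples; the paper induces a synchronous grammar rather than doing plain string substitution:

def recombine(examples):
    # examples: list of (utterance, logical_form, entity) triples.
    new_pairs = []
    for utt, lf, ent in examples:
        for _, _, other in examples:
            if other != ent:
                # Swap in an entity seen elsewhere in the training data.
                new_pairs.append((utt.replace(ent, other), lf.replace(ent, other)))
    return new_pairs

examples = [
    ("what states border texas", "answer(borders(texas))", "texas"),
    ("what is the capital of ohio", "answer(capital(ohio))", "ohio"),
]
print(recombine(examples))
# [('what states border ohio', 'answer(borders(ohio))'),
#  ('what is the capital of texas', 'answer(capital(texas))')]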

Language to Logical Form with Neural Attention

This paper presents a general method based on an attention-enhanced encoder-decoder model that encodes input utterances into vector representations and generates their logical forms by conditioning the output sequences or trees on the encoding vectors.

Sequence to Sequence Learning with Neural Networks

This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions about the sequence structure, and finds that reversing the order of the words in all source sentences markedly improved the LSTM's performance, because doing so introduced many short-term dependencies between the source and target sentences, which made the optimization problem easier.

Learning Dependency-Based Compositional Semantics

A new semantic formalism, dependency-based compositional semantics (DCS), is developed; a log-linear distribution over DCS logical forms is defined; and it is shown that the system obtains accuracies comparable even to state-of-the-art systems that do require annotated logical forms.

Multi-task Sequence to Sequence Learning

The results show that training on a small amount of parsing and image caption data can improve the translation quality between English and German by up to 1.5 BLEU points over strong single-task baselines on the WMT benchmarks, and reveal interesting properties of the two unsupervised learning objectives, autoencoder and skip-thought, in the MTL context.

Lexical Generalization in CCG Grammar Induction for Semantic Parsing

An algorithm for learning factored CCG lexicons is presented, along with a probabilistic parse-selection model; the lexicons include both lexemes, to model word meaning, and templates, to model systematic variation in word usage.

Multi-Task Learning for Multiple Language Translation

The recently proposed neural machine translation model is extended to a multi-task learning framework that shares the source-language representation and separates the modeling of each target language's translation.

Grounded Unsupervised Semantic Parsing

This work presents the first unsupervised approach for semantic parsing that rivals the accuracy of supervised approaches in translating natural-language questions to database queries, adopting a novel grounded-learning approach that leverages the database for indirect supervision.

Grammar as a Foreign Language

The domain-agnostic attention-enhanced sequence-to-sequence model achieves state-of-the-art results on the most widely used syntactic constituency parsing dataset when trained on a large synthetic corpus annotated using existing parsers.
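
The "foreign language" here is a linearized parse tree: constituency trees are flattened into bracketed token sequences so that a standard sequence-to-sequence model can predict them. A small sketch of one such linearization; the exact bracketing scheme is an assumption, though normalizing leaves to a placeholder token like XX follows the paper:

def linearize(tree):
    # A tree is either a word (leaf) or a (label, children) pair.
    if isinstance(tree, str):
        return ["XX"]                     # leaves become placeholders
    label, children = tree
    tokens = ["(" + label]
    for child in children:
        tokens.extend(linearize(child))
    tokens.append(label + ")")
    return tokens

tree = ("S", [("NP", ["John"]), ("VP", [("V", ["sleeps"])])])
print(" ".join(linearize(tree)))
# (S (NP XX NP) (VP (V XX V) VP) S)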