Publications
Don’t Parse, Generate! A Sequence to Sequence Architecture for Task-Oriented Semantic Parsing
TLDR: Proposes a unified architecture based on sequence-to-sequence models and a Pointer Generator Network to handle both simple and complex queries, achieving state-of-the-art performance on three publicly available datasets.
Transfer Learning for Neural Semantic Parsing
TLDR: Proposes using sequence-to-sequence models in a multi-task setup for semantic parsing, with a focus on transfer learning, and shows that the multi-task setup aids transfer from an auxiliary task with large labeled data to a target task with smaller labeled data.
Multilingual Neural Semantic Parsing for Low-Resourced Languages
TLDR: Introduces a new multilingual semantic parsing dataset in English, Italian, and Japanese based on the Facebook Task Oriented Parsing (TOP) dataset, and shows that joint multilingual training with pretrained encoders substantially outperforms the baselines on the TOP dataset and outperforms the state-of-the-art model on the public NLMaps dataset.
One Semantic Parser to Parse Them All: Sequence to Sequence Multi-Task Learning on Semantic Parsing Datasets
TLDR: Investigates Multi-Task Learning (MTL) architectures and finds that an MTL architecture sharing the entire network across datasets yields parsing accuracies competitive with or better than the single-task baselines, while reducing the total number of parameters by 68%.