Improving Compositional Generalization in Semantic Parsing

@inproceedings{Oren2020ImprovingCG,
  title={Improving Compositional Generalization in Semantic Parsing},
  author={Inbar Oren and Jonathan Herzig and Nitish Gupta and Matt Gardner and Jonathan Berant},
  booktitle={FINDINGS},
  year={2020}
}
Generalization of models to out-of-distribution (OOD) data has captured tremendous attention recently. Specifically, compositional generalization, i.e., whether a model generalizes to new structures built of components observed during training, has sparked substantial interest. In this work, we investigate compositional generalization in semantic parsing, a natural test-bed for compositional generalization, as output programs are constructed from sub-components. We analyze a wide variety of…

Citations
Compositional Generalization and Natural Language Variation: Can a Semantic Parsing Approach Handle Both?
This work demonstrates that strong existing semantic parsing approaches do not yet perform well across a broad set of evaluations, and proposes NQG-T5, a hybrid model that outperforms existing approaches across several compositional generalization challenges, while also being competitive with the state-of-the-art on standard evaluations.
Compositional Generalization via Semantic Tagging
This work decomposes decoding into two phases where an input utterance is first tagged with semantic symbols representing the meanings of its individual words, and then a sequence-to-sequence model is used to predict the final meaning representation conditioning on the utterance and the predicted tag sequence.
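
As a rough illustration of the two-phase idea (tag first, then parse conditioned on the tags), the sketch below predicts a semantic symbol per input token and encodes token-tag pairs for a downstream decoder. The PyTorch modules, layer sizes, and names are assumptions made for illustration, not the paper's actual architecture.

# Illustrative two-phase decoding sketch: phase 1 tags each token with a
# semantic symbol, phase 2 encodes [token ; tag] pairs for a seq2seq decoder.
# All hyperparameters and module choices are assumptions, not the paper's.
import torch
import torch.nn as nn

class Tagger(nn.Module):
    def __init__(self, vocab_size, n_tags, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)
        self.enc = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * dim, n_tags)

    def forward(self, tokens):                  # tokens: (batch, seq)
        hidden, _ = self.enc(self.emb(tokens))
        return self.out(hidden)                 # (batch, seq, n_tags)

class TagConditionedEncoder(nn.Module):
    def __init__(self, vocab_size, n_tags, dim=64):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, dim)
        self.tag_emb = nn.Embedding(n_tags, dim)
        self.enc = nn.LSTM(2 * dim, dim, batch_first=True)

    def forward(self, tokens, tags):
        x = torch.cat([self.tok_emb(tokens), self.tag_emb(tags)], dim=-1)
        memory, _ = self.enc(x)
        return memory                           # fed to any standard decoder

tokens = torch.randint(0, 100, (2, 7))
tags = Tagger(100, 12)(tokens).argmax(-1)       # phase 1: predicted symbols
memory = TagConditionedEncoder(100, 12)(tokens, tags)   # phase 2 encoding
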
Finding needles in a haystack: Sampling Structurally-diverse Training Sets from Synthetic Data for Compositional Generalization
Modern semantic parsers suffer from two principal limitations. First, training requires expensive collection of utterance-program pairs. Second, semantic parsers fail to generalize at test time to …
Compositional Generalization for Neural Semantic Parsing via Span-level Supervised Attention
We describe a span-level supervised attention loss that improves compositional generalization in semantic parsers. Our approach builds on existing losses that encourage attention maps in neural …
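
A minimal sketch of what a span-level attention supervision term could look like, assuming we are given decoder-to-encoder attention weights and a gold mask marking, for each target token, which source tokens fall inside its aligned span. The mask format and the negative log of in-span attention mass are illustrative assumptions, not the paper's exact loss.

# Illustrative supervised-attention loss: encourage each target step to place
# its attention mass inside a gold-aligned source span. Shapes and the exact
# objective are assumptions for this sketch.
import torch

def span_attention_loss(attn, span_mask, eps=1e-8):
    # attn:      (batch, tgt_len, src_len), rows sum to 1 over src_len
    # span_mask: (batch, tgt_len, src_len), 1.0 inside the gold span, else 0.0
    in_span_mass = (attn * span_mask).sum(dim=-1).clamp_min(eps)
    return -torch.log(in_span_mass).mean()

attn = torch.softmax(torch.randn(2, 5, 9), dim=-1)   # toy attention maps
mask = torch.zeros(2, 5, 9)
mask[:, :, 2:5] = 1.0                                 # toy gold spans
loss = span_attention_loss(attn, mask)  # added to the usual seq2seq loss
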
Unlocking Compositional Generalization in Pre-trained Models Using Intermediate Representations
It is highlighted that intermediate representations provide an important and potentially overlooked degree of freedom for improving the compositional generalization abilities of pre-trained seq2seq models.
Multilingual Compositional Wikidata Questions
This work proposes a method for creating a multilingual, parallel dataset of question-query pairs, grounded in Wikidata, introduces such a dataset called Compositional Wikidata Questions (CWQ), and utilizes this data to train and evaluate semantic parsers for Hebrew, Kannada, Chinese and English, to better understand the current strengths and weaknesses of multilingual semantic parsing.
Sequence-to-Sequence Learning with Latent Neural Grammars
Sequence-to-sequence learning with neural networks has become the de facto standard for sequence prediction tasks. This approach typically models the local distribution over the next word with a …
A Survey on Semantic Parsing for Machine Programming
Over the last decade, the fields of natural language processing and understanding have seen major advances. Yet, while some of these techniques can be applied to programming language processing and …
Toward Code Generation: A Survey and Lessons from Semantic Parsing
An overview of the growing body of research in semantic parsing is presented from an evolutionary perspective, with specific analyses of neuro-symbolic methods, architecture, and supervision.
Complex Knowledge Base Question Answering: A Survey
Knowledge base question answering (KBQA) aims to answer a question over a knowledge base (KB). Early studies mainly focused on answering simple questions over KBs and achieved great success. However, …

References

Showing 1-10 of 50 references.
Data Recombination for Neural Semantic Parsing
Data recombination improves the accuracy of the RNN model on three semantic parsing datasets, leading to new state-of-the-art performance on the standard GeoQuery dataset for models with comparable supervision.
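
As a toy illustration of the recombination idea, the snippet below abstracts entity mentions out of utterance-program pairs and swaps entities across examples to synthesize new training pairs. The GeoQuery-style templates and the single-entity-per-example assumption are invented for illustration; the actual approach induces a richer synchronous grammar over the data.

# Toy entity-swapping recombination: abstract entities into a placeholder,
# then recombine templates with all observed entities. Purely illustrative.
import itertools
import random

examples = [
    ("what states border texas", "answer(state(borders(stateid('texas'))))", "texas"),
    ("what is the capital of ohio", "answer(capital(loc_2(stateid('ohio'))))", "ohio"),
]

def abstract(utterance, program, entity):
    return (utterance.replace(entity, "ENTITY"),
            program.replace(f"'{entity}'", "'ENTITY'"))

templates = {abstract(u, p, e) for u, p, e in examples}
entities = {e for _, _, e in examples}

recombined = [
    (u.replace("ENTITY", e), p.replace("'ENTITY'", f"'{e}'"))
    for (u, p), e in itertools.product(templates, entities)
]
random.shuffle(recombined)   # mix synthetic pairs into the training data
print(recombined[0])         # may be a pair never seen in the original data
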
Neural Semantic Parsing with Type Constraints for Semi-Structured Tables
A new semantic parsing model for answering compositional questions on semi-structured Wikipedia tables achieves state-of-the-art accuracy; type constraints and entity linking prove to be valuable components to incorporate in neural semantic parsers.
CLOSURE: Assessing Systematic Generalization of CLEVR Models
Surprisingly, it is found that an explicitly compositional Neural Module Network model also generalizes badly on CLOSURE, even when it has access to the ground-truth programs at test time.
Neural Module Networks for Reasoning over Text
This work extends Neural module networks by introducing modules that reason over a paragraph of text, performing symbolic reasoning over numbers and dates in a probabilistic and differentiable manner, and proposing an unsupervised auxiliary loss to help extract arguments associated with the events in text.
Systematic Generalization: What Is Required and Can It Be Learned?
The findings show that the generalization of modular models is much more systematic and that it is highly sensitive to the module layout, i.e. to how exactly the modules are connected, whereas systematic generalization in language understanding may require explicit regularizers or priors.
Extending a Parser to Distant Domains Using a Few Dozen Partially Annotated Examples
It is shown that recent advances in word representations greatly diminish the need for domain adaptation when the target domain is syntactically similar to the source domain, and a simple way to adapt a parser using only dozens of partial annotations is provided.
Learning to generalize to new compositions in image understanding
It is argued that structured representations and compositional splits are a useful benchmark for image captioning, and compositional models that capture linguistic and visual structure are advocated.
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
A new language representation model, BERT, is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers; it can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
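
A minimal sketch of the "single additional output layer" fine-tuning recipe, written against the Hugging Face transformers API as an assumed, modern stand-in; the checkpoint name, classification head, and toy input below are illustrative.

# Illustrative BERT fine-tuning setup: a pre-trained encoder plus one linear
# classification head over the [CLS] representation. Library and checkpoint
# choices are assumptions for this sketch.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertClassifier(nn.Module):
    def __init__(self, n_labels=2, name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(name)
        self.head = nn.Linear(self.encoder.config.hidden_size, n_labels)

    def forward(self, **encoded):
        cls_vec = self.encoder(**encoded).last_hidden_state[:, 0]  # [CLS]
        return self.head(cls_vec)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertClassifier()
logits = model(**tokenizer(["parsers are fun"], return_tensors="pt"))
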
Rearranging the Familiar: Testing Compositional Generalization in Recurrent Networks
Systematic compositionality is the ability to recombine meaningful units with regular and predictable outcomes, and it's seen as key to the human capacity for generalization in language. Recent work …
Break It Down: A Question Understanding Benchmark
This work introduces a Question Decomposition Meaning Representation (QDMR) for questions, and demonstrates the utility of QDMR by showing that it can be used to improve open-domain question answering on the HotpotQA dataset, and can be deterministically converted to a pseudo-SQL formal language, which can alleviate annotation in semantic parsing applications.