Corpus ID: 173188416

Grammar-based Neural Text-to-SQL Generation

@article{Lin2019GrammarbasedNT,
  title={Grammar-based Neural Text-to-SQL Generation},
  author={Kevin Lin and Ben Bogin and Mark Neumann and Jonathan Berant and Matt Gardner},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.13326}
}
The sequence-to-sequence paradigm employed by neural text-to-SQL models typically performs token-level decoding and does not consider generating SQL hierarchically from a grammar. [...] Key Result: We analyze these techniques on ATIS and Spider, two challenging text-to-SQL datasets, demonstrating that they yield 14–18% relative reductions in error.
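To make the contrast with token-level decoding concrete, here is a minimal, hypothetical sketch of grammar-based decoding: the decoder chooses among the production rules of a small SQL grammar rather than emitting arbitrary tokens, so every derivation is a well-formed query. The toy grammar, the rule-choice indices, and the `decode` helper are illustrative assumptions, not the paper's model (which scores candidate rules with a neural network).

```python
# Illustrative sketch: decoding by expanding grammar rules instead of
# emitting free-form tokens. A neural model would score the candidate
# productions at each step; here rules are picked by index to show that
# every derivation yields a syntactically valid query.

SQL_GRAMMAR = {
    "query": [["SELECT", "cols", "FROM", "table", "where"]],
    "cols":  [["*"], ["col"], ["col", ",", "cols"]],
    "col":   [["city"], ["airline"]],
    "table": [["flights"]],
    "where": [[], ["WHERE", "col", "=", "value"]],
    "value": [["'BOS'"]],
}

def decode(choices):
    """Expand the leftmost nonterminal using the given rule choices."""
    stack, out, i = ["query"], [], 0
    while stack:
        sym = stack.pop(0)
        rules = SQL_GRAMMAR.get(sym)
        if rules is None:                  # terminal symbol: emit it
            out.append(sym)
        else:                              # nonterminal: choose a production
            stack = list(rules[choices[i]]) + stack
            i += 1
    return " ".join(out)

# One derivation: query -> SELECT cols FROM table where, cols -> col, ...
print(decode([0, 1, 0, 0, 1, 0, 0]))
# SELECT city FROM flights WHERE city = 'BOS'
```

Because the decoder can only apply productions, malformed outputs (e.g. a `WHERE` clause with no `FROM`) are unreachable by construction, which is the intuition behind the error reductions reported above.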
RECPARSER: A Recursive Semantic Parsing Framework for Text-to-SQL Task
This paper proposes a novel recursive semantic parsing framework called RECPARSER that generates nested SQL queries layer by layer, decomposing the complicated nested SQL query generation problem into several progressive non-nested SQL query generation problems.
A Pilot Study for Chinese SQL Semantic Parsing
A Spider dataset for Chinese is built, showing that word-based semantic parsers are subject to segmentation errors and that cross-lingual word embeddings are useful for text-to-SQL.
PICARD: Parsing Incrementally for Constrained Auto-Regressive Decoding from Language Models
This work proposes PICARD, a method for constraining auto-regressive decoders of language models through incremental parsing, which transforms fine-tuned T5 models with passable performance into state-of-the-art solutions.
Representing Schema Structure with Graph Neural Networks for Text-to-SQL Parsing
This paper presents an encoder-decoder semantic parser in which the structure of the DB schema is encoded with a graph neural network, and this representation is later used at both encoding and decoding time.
Global Reasoning over Database Structures for Text-to-SQL Parsing
This work uses message passing through a graph neural network to softly select a subset of database constants for the output query, conditioned on the question, and trains a model to rank queries based on the global alignment of database constants to question words.
Seq2Seq vs Sketch Filling Structure for Natural Language to SQL Translation
This paper highlights another way to solve natural language processing tasks, in particular natural language to SQL, using sketch-based decoding, which relies on a sketch with holes that the model incrementally tries to fill.
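Sketch-based decoding, as described in several of these papers, can be pictured as a two-step process: predict a SQL sketch with typed holes, then fill each hole with a predicted value. The sketch template and hard-coded slot values below are purely illustrative assumptions, not any system's actual sketch grammar.

```python
# Minimal illustrative sketch of sketch-based slot filling: a SQL
# template with typed holes is filled in a second step. Here the slot
# "predictions" are hard-coded; a real model would predict them.

SKETCH = "SELECT {agg}({col}) FROM {table} WHERE {cond_col} {op} {val}"

def fill(sketch, slots):
    """Fill every hole in the sketch; raises KeyError if a slot is missing."""
    return sketch.format(**slots)

slots = {"agg": "COUNT", "col": "*", "table": "flights",
         "cond_col": "origin", "op": "=", "val": "'BOS'"}
print(fill(SKETCH, slots))
# SELECT COUNT(*) FROM flights WHERE origin = 'BOS'
```

The attraction of the sketch approach is that the hard structural decisions are made once, up front, leaving only local slot predictions.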
Photon: A Robust Cross-Domain Text-to-SQL System
PHOTON is presented, a robust, modular, cross-domain NLIDB that can flag natural language input for which a SQL mapping cannot be immediately determined, effectively improving the robustness of text-to-SQL systems against untranslatable user input.
Re-examining the Role of Schema Linking in Text-to-SQL
This work provides a schema linking corpus based on the Spider text-to-SQL dataset and builds a simple BERT-based baseline for a data-driven study of schema linking, finding that when schema linking is done well, SLSQL demonstrates good performance on Spider despite its structural simplicity.
RYANSQL: Recursively Applying Sketch-based Slot Fillings for Complex Text-to-SQL in Cross-Domain Databases
A neural network approach called RYANSQL (Recursively Yielding Annotation Network for SQL) is presented to solve complex text-to-SQL tasks for cross-domain databases and further improve generation performance.
Exploring Unexplored Generalization Challenges for Cross-Database Semantic Parsing
This work re-purposes eight semantic parsing datasets that have been well studied in the setting where in-domain training data is available, instead using them as additional evaluation data for XSP systems, uncovering several generalization challenges for cross-database semantic parsing.

References

Showing 1–10 of 34 references
A Syntactic Neural Model for General-Purpose Code Generation
A novel neural architecture powered by a grammar model is proposed to explicitly capture the target syntax as prior knowledge for semantic parsing, achieving state-of-the-art results that well outperform previous code generation and semantic parsing approaches.
TRANX: A Transition-based Neural Abstract Syntax Parser for Semantic Parsing and Code Generation
TRANX is a transition-based neural semantic parser that maps natural language utterances into formal meaning representations (MRs); it is highly generalizable, extensible, and effective, registering strong results compared to existing neural semantic parsers.
SyntaxSQLNet: Syntax Tree Networks for Complex and Cross-Domain Text-to-SQL Task
Experimental results show that SyntaxSQLNet can handle a significantly greater number of complex SQL examples than prior work, outperforming the previous state-of-the-art model by 9.5% in exact matching accuracy.
Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars
A learning algorithm is described that takes as input a training set of sentences labeled with expressions in the lambda calculus and induces a grammar for the problem, along with a log-linear model that represents a distribution over syntactic and semantic analyses conditioned on the input sentence.
Abstract Syntax Networks for Code Generation and Semantic Parsing
Tasks like code generation and semantic parsing require mapping unstructured (or partially structured) inputs to well-formed, executable outputs. We introduce abstract syntax networks, a modeling…
Language to Logical Form with Neural Attention
This paper presents a general method based on an attention-enhanced encoder-decoder model that encodes input utterances into vector representations and generates their logical forms by conditioning the output sequences or trees on the encoding vectors.
Learning Dependency-Based Compositional Semantics
A new semantic formalism, dependency-based compositional semantics (DCS), is developed and a log-linear distribution over DCS logical forms is defined; the system obtains accuracies comparable to state-of-the-art systems that do require annotated logical forms.
Learning to Parse Database Queries Using Inductive Logic Programming
Experimental results with a complete database-query application for U.S. geography show that CHILL is able to learn parsers that outperform a preexisting, hand-crafted counterpart, providing direct evidence of the utility of an empirical approach at the level of a complete natural language application.
Context-dependent Semantic Parsing for Time Expressions
This work uses a Combinatory Categorial Grammar to construct compositional meaning representations, while considering contextual cues, such as the document creation time and the tense of the governing verb, to compute the final time values.
Neural Semantic Parsing with Type Constraints for Semi-Structured Tables
A new semantic parsing model for answering compositional questions on semi-structured Wikipedia tables achieves state-of-the-art accuracy; type constraints and entity linking are valuable components to incorporate in neural semantic parsers.