Corpus ID: 52271710

IncSQL: Training Incremental Text-to-SQL Parsers with Non-Deterministic Oracles

@article{Shi2018IncSQLTI,
  title={IncSQL: Training Incremental Text-to-SQL Parsers with Non-Deterministic Oracles},
  author={Tianze Shi and Kedar Tatwawadi and Kaushik Chakrabarti and Yi Mao and Oleksandr Polozov and Weizhu Chen},
  journal={ArXiv},
  year={2018},
  volume={abs/1809.05054}
}
We present a sequence-to-action parsing approach for the natural language to SQL task that incrementally fills the slots of a SQL query with feasible actions from a pre-defined inventory. When further combined with the execution-guided decoding strategy, our model sets a new state-of-the-art performance at an execution accuracy of 87.1%.
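
For intuition, here is a minimal, hypothetical Python sketch of what incremental slot filling with an action inventory and a non-deterministic oracle could look like for a WikiSQL-style query; the data structures and helper names are illustrative assumptions, not the paper's implementation.

from dataclasses import dataclass, field

@dataclass(frozen=True)
class Condition:
    column: str
    op: str
    value: str

@dataclass
class PartialQuery:
    select_column: str | None = None
    aggregate: str | None = None              # e.g. "MAX" or "NONE"
    conditions: list = field(default_factory=list)

def oracle_next_actions(partial, gold_select, gold_agg, gold_conds):
    """Return the set of gold-compatible next actions.

    WHERE clauses are unordered, so several action sequences produce the same
    SQL query; accepting any not-yet-emitted condition is the source of the
    oracle's non-determinism."""
    if partial.select_column is None:
        return {("SELECT", gold_select)}
    if partial.aggregate is None:
        return {("AGG", gold_agg)}
    remaining = set(gold_conds) - set(partial.conditions)
    return {("WHERE", c) for c in remaining} or {("DONE",)}

def apply_action(partial, action):
    """Apply one action from the pre-defined inventory to the partial query."""
    kind = action[0]
    if kind == "SELECT":
        partial.select_column = action[1]
    elif kind == "AGG":
        partial.aggregate = action[1]
    elif kind == "WHERE":
        partial.conditions.append(action[1])
    return partial

During training, the parser would be scored against its best action within this set rather than being forced to reproduce one arbitrary ordering of the WHERE conditions.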

Citations

Data-Anonymous Encoding for Text-to-SQL Generation

This work formulates the encoding task as a sequential tagging problem and proposes a two-stage anonymization model to learn the semantic relationship between tables and input utterances, which consistently improves the performance of different neural semantic parsers and significantly outperforms deterministic approaches.

Neural Semantic Parsing in Low-Resource Settings with Back-Translation and Meta-Learning

This work aims to learn a neural semantic parser when only prior knowledge about a limited number of simple rules is available, without access to either annotated programs or execution results, and improves the accuracy and stability on examples uncovered by rules.

Encoding Database Schemas with Relation-Aware Self-Attention for Text-to-SQL Parsers

This paper uses relation-aware self-attention within the encoder so that it can reason about how the tables and columns in the provided schema relate to each other and use this information in interpreting the question.
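
As a rough illustration of relation-aware self-attention (a hedged NumPy sketch, not the paper's code; array shapes and names are assumptions), the embedding of the relation between each pair of question/schema items biases both the attention logits and the aggregated values:

import numpy as np

def relation_aware_attention(x, rel_k, rel_v, w_q, w_k, w_v):
    """x: (n, d) encodings of question tokens, columns, and tables.
    rel_k, rel_v: (n, n, d) embeddings of the pairwise relation between items.
    w_q, w_k, w_v: (d, d) projection matrices."""
    n, d = x.shape
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Relation embeddings enter both the attention logits and the values.
    logits = np.einsum("id,ijd->ij", q, k[None, :, :] + rel_k) / np.sqrt(d)
    alpha = np.exp(logits - logits.max(axis=-1, keepdims=True))
    alpha /= alpha.sum(axis=-1, keepdims=True)    # softmax over j
    return np.einsum("ij,ijd->id", alpha, v[None, :, :] + rel_v)

Here rel_k[i, j] and rel_v[i, j] would hold the embedding of a discrete relation type between item i and item j, such as "column belongs to table" or "question token matches column name".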

GraPPa: Grammar-Augmented Pre-Training for Table Semantic Parsing

GraPPa is an effective pre-training approach for table semantic parsing that learns a compositional inductive bias in the joint representations of textual and tabular data; it significantly outperforms RoBERTa-large as the feature representation layer and establishes new state-of-the-art results on the evaluated benchmarks.

RAT-SQL: Relation-Aware Schema Encoding and Linking for Text-to-SQL Parsers

This work presents a unified framework, based on the relation-aware self-attention mechanism, to address schema encoding, schema linking, and feature representation within a text-to-SQL encoder and achieves the new state-of-the-art performance on the Spider leaderboard.

Hybrid Ranking Network for Text-to-SQL

A neat approach to leveraging pre-trained language models in text-to-SQL, called Hybrid Ranking Network (HydraNet), which breaks the problem down into column-wise ranking and decoding and finally assembles the column-wise outputs into a SQL query by straightforward rules.

Seq2Seq vs. Sketch Filling Structure for Natural Language to SQL Translation

This paper highlights another way to approach natural language processing tasks, in particular natural language to SQL, using sketch-based decoding, in which the model incrementally fills the holes of a predefined sketch.
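
A toy example of the sketch-filling idea (purely illustrative; the sketch string and slot names are assumptions, not the paper's grammar):

# The model only predicts values for the holes of a fixed SQL sketch;
# assembly into a full query is deterministic.
SKETCH = "SELECT {agg}({col}) FROM {table} WHERE {cond_col} {op} {val}"

def fill_sketch(slot_predictions: dict) -> str:
    """slot_predictions maps each hole name to the model's predicted value."""
    return SKETCH.format(**slot_predictions)

print(fill_sketch({
    "agg": "MAX", "col": "salary", "table": "employees",
    "cond_col": "department", "op": "=", "val": "'Sales'",
}))
# SELECT MAX(salary) FROM employees WHERE department = 'Sales'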

SQL Generation from Natural Language: A Sequence-to-Sequence Model Powered by the Transformers Architecture and Association Rules

This study presents a Sequence-to-Sequence (Seq2Seq) parsing model for the NL-to-SQL task, powered by the Transformers architecture and exploring two language models (LMs): the Text-To-Text Transfer Transformer (T5) and the multilingual pre-trained Text-to-Text Transfer Transformer (mT5).

Semantic Evaluation for Text-to-SQL with Distilled Test Suite

We propose test suite accuracy to approximate semantic accuracy for Text-to-SQL models. Our method distills a small test suite of databases that achieves high code coverage for the gold query from a large number of randomly generated databases.
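
The core comparison can be pictured with a short, hedged sketch (sqlite3 and the function name are used only for illustration): a prediction counts as correct if it matches the gold query's results on every database in the distilled suite.

import sqlite3

def same_denotation(pred_sql: str, gold_sql: str, db_paths: list) -> bool:
    """Approximate semantic equivalence by comparing execution results on a
    suite of databases chosen for high code coverage of the gold query."""
    for path in db_paths:
        with sqlite3.connect(path) as conn:
            try:
                pred_rows = conn.execute(pred_sql).fetchall()
            except sqlite3.Error:
                return False                  # prediction fails to execute
            gold_rows = conn.execute(gold_sql).fetchall()
        if sorted(map(repr, pred_rows)) != sorted(map(repr, gold_rows)):
            return False
    return True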

Editing-Based SQL Query Generation for Cross-Domain Context-Dependent Questions

The interaction history is utilized by editing the previous predicted query to improve the generation quality of SQL queries and the benefit of editing compared with the state-of-the-art baselines which generate SQL from scratch is evaluated.
...

References

Showing 1-10 of 44 references

Seq2SQL: Generating Structured Queries from Natural Language using Reinforcement Learning

This work proposes Seq2SQL, a deep neural network for translating natural language questions to corresponding SQL queries, and releases WikiSQL, a dataset of 80,654 hand-annotated examples of questions and SQL queries distributed across 24,241 tables from Wikipedia that is an order of magnitude larger than comparable datasets.

Pointing Out SQL Queries From Text

Span-Based Constituency Parsing with a Structure-Label System and Provably Optimal Dynamic Oracles

A new shift-reduce system whose stack contains merely sentence spans, represented by a bare minimum of LSTM features, together with the first provably optimal dynamic oracle for constituency parsing, which runs in amortized O(1) time, compared to O(n^3) oracles for standard dependency parsing.

Execution-Guided Neural Program Decoding

A neural semantic parser that translates natural language questions into executable SQL queries, built on two key ideas: an encoder-decoder model, and the use of the execution semantics of SQL to repair decoded programs that result in a runtime error or return an empty result.
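
A minimal sketch of the filtering/repair idea, assuming a beam of candidate queries and a SQLite database (the function name and fallback behaviour are assumptions for illustration):

import sqlite3

def execution_guided_pick(beam, db_path):
    """beam: (sql, score) pairs sorted by decreasing model score.
    Returns the highest-scoring candidate that executes without error and
    yields a non-empty result, falling back to the top candidate otherwise."""
    with sqlite3.connect(db_path) as conn:
        for sql, _score in beam:
            try:
                rows = conn.execute(sql).fetchall()
            except sqlite3.Error:
                continue                      # runtime error: skip candidate
            if rows:                          # non-empty result: accept
                return sql
    return beam[0][0] if beam else None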

A Dynamic Oracle for Arc-Eager Dependency Parsing

This work uses an improved oracle for the arc-eager transition system to train a deterministic left-to-right dependency parser that is less sensitive to error propagation and outperforms greedy parsers trained using conventional oracles on a range of data sets.

TypeSQL: Knowledge-Based Type-Aware Neural Text-to-SQL Generation

This paper presents TypeSQL, a novel approach that frames the problem as a slot-filling task in a more reasonable way and utilizes type information to better understand rare entities and numbers in the questions.

Learning a Neural Semantic Parser from User Feedback

We present an approach to rapidly and easily build natural language interfaces to databases for new domains, whose performance improves over time based on user feedback, and requires minimal intervention.

Natural Language to Structured Query Generation via Meta-Learning

This work explores a different learning protocol that treats each example as a unique pseudo-task, by reducing the original learning problem to a few-shot meta-learning scenario with the help of a domain-dependent relevance function.

Training Deterministic Parsers with Non-Deterministic Oracles

Experimental evaluation on a wide range of data sets clearly shows that using dynamic oracles to train greedy parsers gives substantial improvements in accuracy, unlike other techniques like beam search.
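
A schematic training loop in this spirit (a sketch relying on assumed helpers oracle_actions, model.scores, model.update, and a state object; not the paper's code):

import random

def train_on_sentence(model, state, gold, explore_p=0.9):
    """Greedy-parser training with a dynamic (non-deterministic) oracle.

    The model is updated toward its best zero-cost action, but with
    probability explore_p the parser follows its own (possibly wrong)
    prediction, so it learns to recover from its own mistakes."""
    while not state.is_terminal():
        scores = model.scores(state)               # dict: action -> score
        correct = oracle_actions(state, gold)      # set of zero-cost actions
        predicted = max(scores, key=scores.get)
        best_correct = max(correct, key=scores.get)
        if predicted not in correct:
            model.update(state, good=best_correct, bad=predicted)
            # Error exploration: sometimes keep the wrong action anyway.
            next_action = predicted if random.random() < explore_p else best_correct
        else:
            next_action = predicted
        state = state.apply(next_action)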

Neural Semantic Parsing with Type Constraints for Semi-Structured Tables

A new semantic parsing model for answering compositional questions on semi-structured Wikipedia tables with state-of-the-art accuracy, showing that type constraints and entity linking are valuable components to incorporate in neural semantic parsers.