CQR-SQL: Conversational Question Reformulation Enhanced Context-Dependent Text-to-SQL Parsers
@article{Xiao2022CQRSQLCQ,
  title={CQR-SQL: Conversational Question Reformulation Enhanced Context-Dependent Text-to-SQL Parsers},
  author={Dongling Xiao and Linzheng Chai and Qian-Wen Zhang and Zhao Yan and Zhoujun Li and Yunbo Cao},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.07686}
}
Context-dependent text-to-SQL is the task of translating multi-turn questions into database-related SQL queries. Existing methods typically focus on making full use of the history context or the previously predicted SQL for parsing the current SQL, while neglecting to explicitly comprehend the schema and conversational dependencies such as co-reference, ellipsis, and user focus change. In this paper, we propose CQR-SQL, which uses auxiliary Conversational Question Reformulation (CQR) learning to explicitly…
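As a concrete illustration of the conversational dependencies the abstract refers to, here is a minimal sketch of a multi-turn interaction, each question's self-contained reformulation, and the corresponding SQL. The schema, questions, and queries are invented for exposition and are not taken from the paper or its datasets.

```python
# Hypothetical illustration of context-dependent text-to-SQL and
# conversational question reformulation (CQR). Schema, questions,
# and SQL below are invented examples, not from the paper.

dialogue = [
    {
        "question": "Show all students in the CS department.",
        "reformulated": "Show all students in the CS department.",  # turn 1 is already self-contained
        "sql": "SELECT name FROM students WHERE department = 'CS';",
    },
    {
        # co-reference: "them" refers to the students retrieved in turn 1
        "question": "Which of them have a GPA above 3.5?",
        "reformulated": "Which students in the CS department have a GPA above 3.5?",
        "sql": "SELECT name FROM students WHERE department = 'CS' AND gpa > 3.5;",
    },
    {
        # ellipsis / user focus change: the full question is omitted
        "question": "How about the Math department?",
        "reformulated": "Which students in the Math department have a GPA above 3.5?",
        "sql": "SELECT name FROM students WHERE department = 'Math' AND gpa > 3.5;",
    },
]

# A context-dependent parser consumes the raw questions plus history;
# CQR-style auxiliary learning additionally supervises the self-contained
# reformulations so the model learns to resolve co-reference and ellipsis.
for turn in dialogue:
    print(turn["question"], "->", turn["sql"])
```

The reformulated questions make the co-reference ("them") and the ellipsis/focus change ("How about the Math department?") explicit, which is the kind of signal CQR-style auxiliary learning provides.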
5 Citations
MIGA: A Unified Multi-task Generation Framework for Conversational Text-to-SQL
- Computer Science, ArXiv
- 2022
A two-stage unified MultI-task Generation frAmework (MIGA) that leverages PLMs' ability to tackle conversational text-to-SQL and proposes four SQL perturbations to alleviate the error propagation problem.
Controllable Data Augmentation for Context-Dependent Text-to-SQL
- Computer Science
- 2023
This paper introduces ConDA, which generates interactive questions and corresponding SQL results, designs an SQL dialogue state to enhance data diversity through state transitions, and presents a filtering method that uses a grounding model to ensure data quality.
A comprehensive evaluation of ChatGPT's zero-shot Text-to-SQL capability
- Computer Science, ArXiv
- 2023
The results demonstrate that ChatGPT has strong text-to-SQL abilities; although a gap remains to the current state-of-the-art (SOTA) models, its performance is still impressive given that the experiment was conducted in a zero-shot scenario.
Relevant Objectives of Developing SQL Adaptive Learning Technology
- Computer Science, 2022 12th International Conference on Dependable Systems, Services and Technologies (DESSERT)
- 2022
The goal of the research is to provide a comprehensive analysis of existing adaptive SQL e-learning systems and to formulate concrete objectives that could guide further research.
Q-TOD: A Query-driven Task-oriented Dialogue System
- Computer Science, EMNLP
- 2022
This paper introduces a novel query-driven task-oriented dialogue system, Q-TOD, which outperforms strong baselines and establishes new state-of-the-art performance on the evaluated datasets.
44 References
Decoupled Dialogue Modeling and Semantic Parsing for Multi-Turn Text-to-SQL
- Computer Science, FINDINGS
- 2021
This paper proposes a novel decoupled multi-turn text-to-SQL framework, in which an utterance rewrite model first explicitly completes the dialogue context, and a single-turn text-to-SQL parser then follows, addressing the data sparsity problem.
RAT-SQL: Relation-Aware Schema Encoding and Linking for Text-to-SQL Parsers
- Computer Science, ACL
- 2020
This work presents a unified framework, based on the relation-aware self-attention mechanism, to address schema encoding, schema linking, and feature representation within a text-to-SQL encoder and achieves the new state-of-the-art performance on the Spider leaderboard.
Learn to Resolve Conversational Dependency: A Consistency Training Framework for Conversational Question Answering
- Computer Science, ACL
- 2021
A novel framework, ExCorD (Explicit guidance on how to resolve Conversational Dependency) is proposed to enhance the abilities of QA models in comprehending conversational context, while addressing the limitations of the existing approaches.
COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
- Computer Science, NeurIPS
- 2021
A self-supervised learning framework, COCO-LM, that pretrains language models by COrrecting and COntrasting corrupted text sequences, which not only outperforms recent state-of-the-art pretrained models in accuracy but also improves pretraining efficiency.
RASAT: Integrating Relational Structures into Pretrained Seq2Seq Model for Text-to-SQL
- Computer Science, EMNLP
- 2022
This work proposes RASAT: a Transformer seq2seq architecture augmented with relation-aware self-attention that can leverage a variety of relational structures while effectively inheriting the pretrained parameters from the T5 model.
HIE-SQL: History Information Enhanced Network for Context-Dependent Text-to-SQL Semantic Parsing
- Computer Science, FINDINGS
- 2022
This work proposes a History Information Enhanced text-to-SQL model (HIE-SQL) to exploit context dependence information from both history utterances and the last predicted SQL query, and proposes a bimodal pre-trained model to bridge the gap between them.
UnifiedSKG: Unifying and Multi-Tasking Structured Knowledge Grounding with Text-to-Text Language Models
- Computer Science, EMNLP
- 2022
The UnifiedSKG framework is proposed, which unifies 21 SKG tasks into a text-to-text format, aiming to promote systematic SKG research, instead of being exclusive to a single task, domain, or dataset.
PICARD: Parsing Incrementally for Constrained Auto-Regressive Decoding from Language Models
- Computer Science, EMNLP
- 2021
On the challenging Spider and CoSQL text-to-SQL translation tasks, it is shown that PICARD transforms fine-tuned T5 models with passable performance into state-of-the-art solutions.
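As a toy illustration of the constrained-decoding idea named in the PICARD title, the following is a minimal sketch with an invented grammar, checker, and candidate set; it is not PICARD's actual implementation or API.

```python
# Toy illustration of incremental constrained decoding for SQL, in the
# spirit of PICARD. The grammar, checker, and candidates below are
# invented for exposition and do not reflect PICARD's real code.

# A tiny finite-state sketch of one allowed SQL shape:
# SELECT <col> FROM <table> [WHERE <col> <op> <value>]
COLUMNS = {"name", "gpa", "department"}
TABLES = {"students"}
OPS = {"=", ">", "<"}

def is_valid_prefix(tokens):
    """Return True if `tokens` could still be extended into a query of the
    toy shape above (a stand-in for a real incremental SQL parser)."""
    expected = ["SELECT", "col", "FROM", "table", "WHERE", "col", "op", "value"]
    if len(tokens) > len(expected):
        return False
    for tok, slot in zip(tokens, expected):
        if slot in ("SELECT", "FROM", "WHERE"):
            if tok != slot:
                return False
        elif slot == "col" and tok not in COLUMNS:
            return False
        elif slot == "table" and tok not in TABLES:
            return False
        elif slot == "op" and tok not in OPS:
            return False
        # slot == "value": any literal token is accepted in this toy grammar
    return True

def constrained_step(prefix, candidates):
    """Keep only candidate next tokens whose extended prefix stays parseable."""
    return [tok for tok in candidates if is_valid_prefix(prefix + [tok])]

# Example: after "SELECT name FROM", only a valid table name survives.
prefix = ["SELECT", "name", "FROM"]
candidates = ["students", "WHERE", "gpa", "SELECT"]
print(constrained_step(prefix, candidates))  # -> ['students']
```

At each decoding step, only next tokens that keep the SQL prefix parseable are retained; this is the mechanism that lets a fine-tuned seq2seq model avoid emitting invalid SQL.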
Dynamic Hybrid Relation Exploration Network for Cross-Domain Context-Dependent Semantic Parsing
- Computer Science, AAAI
- 2021
This paper presents a dynamic graph framework that is capable of effectively modelling contextual utterances, tokens, database schemas, and their complicated interaction as the conversation proceeds, and employs a dynamic memory decay mechanism that incorporates inductive bias to integrate enriched contextual relation representation.
LGESQL: Line Graph Enhanced Text-to-SQL Model with Mixed Local and Non-Local Relations
- Computer Science, ACL
- 2021
This work proposes a Line Graph Enhanced Text-to-SQL (LGESQL) model to mine the underlying relational features without constructing meta-paths, and designs an auxiliary task called graph pruning to improve the discriminative capability of the encoder.