Context Dependent Semantic Parsing over Temporally Structured Data

@article{Chen2019ContextDS,
  title={Context Dependent Semantic Parsing over Temporally Structured Data},
  author={Charles Chen and Razvan C. Bunescu},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.00245}
}
We describe a new semantic parsing setting that allows users to query the system using both natural language questions and actions within a graphical user interface. Multiple time series belonging to an entity of interest are stored in a database and the user interacts with the system to obtain a better understanding of the entity’s state and behavior, entailing sequences of actions and questions whose answers may depend on previous factual or navigational interactions. We design an LSTM-based… 
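The truncated abstract names an LSTM-based model but does not specify it; below is a minimal PyTorch sketch of the kind of context-dependent encoder-decoder it describes: an utterance-level LSTM encodes the current question (or GUI-action sequence), an interaction-level state carries history across turns, and a decoder emits logical-form tokens. All module names, sizes, and the history-update scheme are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumed design, not the paper's code) of a
# context-dependent LSTM encoder-decoder for semantic parsing.
import torch
import torch.nn as nn

class ContextDependentParser(nn.Module):
    def __init__(self, vocab_size, lf_vocab_size, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Utterance-level encoder: reads the current question/action tokens.
        self.utterance_enc = nn.LSTM(hidden, hidden, batch_first=True)
        # Interaction-level encoder: consumes one summary vector per turn,
        # so later turns can depend on earlier factual/navigational ones.
        self.history_enc = nn.LSTMCell(hidden, hidden)
        self.lf_embed = nn.Embedding(lf_vocab_size, hidden)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, lf_vocab_size)

    def forward(self, question_ids, lf_ids, history_state):
        # Encode the current turn and summarize it as the final hidden state.
        _, (h, c) = self.utterance_enc(self.embed(question_ids))
        # Fold the turn summary into the running interaction history.
        hist_h, hist_c = self.history_enc(h[-1], history_state)
        # Condition the decoder on both the turn and the history.
        init = ((h[-1] + hist_h).unsqueeze(0), c)
        dec_out, _ = self.decoder(self.lf_embed(lf_ids), init)
        return self.out(dec_out), (hist_h, hist_c)

# Toy usage: one turn, batch size 1, with an empty initial history.
model = ContextDependentParser(vocab_size=100, lf_vocab_size=50)
hist = (torch.zeros(1, 128), torch.zeros(1, 128))
logits, hist = model(torch.randint(100, (1, 7)), torch.randint(50, (1, 5)), hist)
print(logits.shape)  # torch.Size([1, 5, 50])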
Citations

Context Dependent Semantic Parsing: A Survey
TLDR
This survey investigates progress on methods for context-dependent semantic parsing, together with current datasets and tasks, and points out open problems and challenges in the area.
Chase: A Large-Scale and Pragmatic Chinese Dataset for Cross-Database Context-Dependent Text-to-SQL
TLDR
This work presents CHASE, a large-scale and pragmatic Chinese dataset for XDTS consisting of 5,459 coherent question sequences over 280 databases, in which only 35% of the questions are context-independent and 28% of the SQL queries are easy.

References

Showing 1–10 of 33 references
From Physician Queries to Logical Forms for Efficient Exploration of Patient Data
TLDR
A new question answering paradigm is introduced in which users can interact with the system using natural language questions or direct actions within a graphical user interface (GUI); the proposed models substantially outperform standard sequence generation baselines.
Data Recombination for Neural Semantic Parsing
TLDR
Data recombination improves the accuracy of the RNN model on three semantic parsing datasets, leading to new state-of-the-art performance on the standard GeoQuery dataset for models with comparable supervision.
Learning to Map Context-Dependent Sentences to Executable Formal Queries
We propose a context-dependent model to map utterances within an interaction to executable formal queries. To incorporate interaction history, the model maintains an interaction-level encoder that…
Language to Logical Form with Neural Attention
TLDR
This paper presents a general method based on an attention-enhanced encoder-decoder model that encodes input utterances into vector representations and generates their logical forms by conditioning the output sequences or trees on the encoding vectors.
Simpler Context-Dependent Logical Forms via Model Projections
TLDR
This work considers the task of learning a context-dependent mapping from utterances to denotations and performs successive projections of the full model onto simpler models that operate over equivalence classes of logical forms.
Seq2SQL: Generating Structured Queries from Natural Language using Reinforcement Learning
TLDR
This work proposes Seq2SQL, a deep neural network for translating natural language questions into corresponding SQL queries, and releases WikiSQL, a dataset of 80,654 hand-annotated examples of questions and SQL queries distributed across 24,241 tables from Wikipedia that is an order of magnitude larger than comparable datasets.
Search-based Neural Structured Learning for Sequential Question Answering
TLDR
This work proposes a novel dynamic neural semantic parsing framework trained using a weakly supervised reward-guided search that effectively leverages the sequential context to outperform state-of-the-art QA systems that are designed to answer highly complex questions.
Learning Context-Dependent Mappings from Sentences to Logical Form
TLDR
An algorithm is developed that maintains explicit, lambda-calculus representations of salient discourse entities and uses a context-dependent analysis pipeline to recover logical forms; a hidden-variable variant of the perceptron algorithm learns a linear model used to select the best analysis.
Semantic Parsing for Single-Relation Question Answering
TLDR
A semantic parsing framework based on semantic similarity for open-domain question answering (QA) that achieves higher precision across different recall points compared to the previous approach and can improve F1 by 7 points.
Bootstrapping Semantic Parsers from Conversations
TLDR
This paper introduces a loss function to measure how well potential meanings match the conversation, and induces a weighted CCG grammar that could be used to automatically bootstrap the semantic analysis component in a complete dialog system.