Corpus ID: 233297028

Question Decomposition with Dependency Graphs

@article{Hasson2021QuestionDW,
  title={Question Decomposition with Dependency Graphs},
  author={Matan Hasson and Jonathan Berant},
  journal={ArXiv},
  year={2021},
  volume={abs/2104.08647}
}
QDMR is a meaning representation for complex questions, which decomposes questions into a sequence of atomic steps. While state-of-the-art QDMR parsers use the common sequence-to-sequence (seq2seq) approach, a QDMR structure fundamentally describes labeled relations between spans in the input question, and thus dependency-based approaches seem appropriate for this task. In this work, we present a QDMR parser that is based on dependency graphs (DGs), where nodes in the graph are words and edges… 
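To make the idea concrete, here is a minimal sketch (a hypothetical illustration, not the paper's parser) of a QDMR-style decomposition: a question is broken into atomic steps, and each `#k` reference token induces a labeled edge in a dependency graph over the steps.

```python
# Hypothetical example: QDMR-style decomposition of
# "What is the capital of the largest country?"
# Each step is atomic; "#k" tokens refer back to earlier steps.
steps = {
    1: "return countries",
    2: "return #1 that is largest",   # #1 refers to step 1
    3: "return capital of #2",        # #2 refers to step 2
}

# Build graph edges: every "#k" token in a step creates a
# (step, referenced_step, label) dependency edge.
edges = []
for step_id, text in steps.items():
    for token in text.split():
        if token.startswith("#"):
            edges.append((step_id, int(token[1:].rstrip(",")), "ref"))

print(edges)  # [(2, 1, 'ref'), (3, 2, 'ref')]
```

The resulting edge list is the kind of labeled span-to-span relation structure that motivates a dependency-based parser over a plain seq2seq one.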

A Transition-based Method for Complex Question Understanding

This work treats QDMR as a computational graph and proposes a transition-based method where a decider predicts a sequence of actions to build the graph node-by-node, enabling better representation of the intermediate states and better interpretability.

DGR: Decomposition Graph Reconstruction for Question Understanding

A Decomposition Graph Reconstruction (DGR) model is proposed to induce the information by introducing two additional tasks: simple question type classification and dependency relationship detection.

SPARQLing Database Queries from Intermediate Question Decompositions

This work observes that the execution accuracy of queries constructed by the model on the challenging Spider dataset is comparable with the state-of-the-art text-to-SQL methods trained with annotated SQL queries.

Weakly Supervised Mapping of Natural Language to SQL through Question Decomposition

This work uses the recently proposed question decomposition representation called QDMR, an intermediate between NL and formal query languages, and uses NL-QDMR pairs, along with the question answers, as supervision for automatically synthesizing SQL queries.

Learning To Retrieve Prompts for In-Context Learning

This work proposes an efficient method for retrieving prompts for in-context learning using annotated data and an LM, and trains an efficient dense retriever from this data, which is used to retrieve training examples as prompts at test time.

References


Break It Down: A Question Understanding Benchmark

This work introduces a Question Decomposition Meaning Representation (QDMR) for questions, and demonstrates the utility of QDMR by showing that it can be used to improve open-domain question answering on the HotpotQA dataset, and can be deterministically converted to a pseudo-SQL formal language, which can alleviate annotation in semantic parsing applications.

Simpler but More Accurate Semantic Dependency Parsing

The LSTM-based syntactic parser of Dozat and Manning (2017) is extended to train on and generate graph-structured representations that capture between-word relationships more closely tied to the meaning of a sentence.

Unsupervised Question Decomposition for Question Answering

An algorithm for One-to-N Unsupervised Sequence transduction (ONUS) that learns to map one hard, multi-hop question to many simpler, single-hop sub-questions, which is promising for shedding light on why a QA system makes a prediction.

Compositional Semantic Parsing on Semi-Structured Tables

This paper proposes a logical-form-driven parsing algorithm guided by strong typing constraints, shows that it obtains significant improvements over natural baselines, and makes the dataset publicly available.

RAT-SQL: Relation-Aware Schema Encoding and Linking for Text-to-SQL Parsers

This work presents a unified framework, based on the relation-aware self-attention mechanism, to address schema encoding, schema linking, and feature representation within a text-to-SQL encoder and achieves the new state-of-the-art performance on the Spider leaderboard.

The Web as a Knowledge-Base for Answering Complex Questions

This paper proposes to decompose complex questions into a sequence of simple questions and compute the final answer from the sequence of answers, and empirically demonstrates that question decomposition improves performance from 20.8 to 27.5 precision@1 on this new dataset.

Multi-hop Reading Comprehension through Question Decomposition and Rescoring

A system that decomposes a compositional question into simpler sub-questions that can be answered by off-the-shelf single-hop RC models is proposed and a new global rescoring approach is introduced that considers each decomposition to select the best final answer, greatly improving overall performance.

GQA: A New Dataset for Real-World Visual Reasoning and Compositional Question Answering

We introduce GQA, a new dataset for real-world visual reasoning and compositional question answering, seeking to address key shortcomings of previous VQA datasets. We have developed a strong and

HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering

It is shown that HotpotQA is challenging for the latest QA systems, and the supporting facts enable models to improve performance and make explainable predictions.

HybridQA: A Dataset of Multi-Hop Question Answering over Tabular and Textual Data

HybridQA is presented, a new large-scale question-answering dataset that requires reasoning over heterogeneous information and can serve as a challenging benchmark for studying question answering with heterogeneous information.