Exploiting Explicit Paths for Multi-hop Reading Comprehension

@inproceedings{Kundu2019ExploitingEP,
  title={Exploiting Explicit Paths for Multi-hop Reading Comprehension},
  author={Souvik Kundu and Tushar Khot and Ashish Sabharwal},
  booktitle={NAACL-HLT},
  year={2019}
}
We propose a novel, path-based reasoning approach for the multi-hop reading comprehension task, where a system needs to combine facts from multiple passages to answer a question. Although inspired by multi-hop reasoning over knowledge graphs, our proposed approach operates directly over unstructured text. It generates potential paths through passages and scores them without any direct path supervision. The proposed model, named PathNet, attempts to extract implicit relations from text through…
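The abstract describes generating candidate paths across passages and scoring them to pick an answer. As a minimal illustrative sketch (not the authors' PathNet, which learns implicit relations neurally), the following toy code builds 2-hop entity paths by linking passages through shared entities and scores candidates by path count; entity spans are assumed to be given:

```python
# Hypothetical sketch of cross-passage path extraction, NOT the PathNet model:
# link a question entity to candidate answers via a bridge entity shared
# between two passages, then score candidates by number of supporting paths.
from collections import defaultdict

def two_hop_paths(passages, question_entity, candidates):
    """passages: list of sets of entity strings mentioned in each passage."""
    # Inverted index: entity -> indices of passages mentioning it.
    index = defaultdict(set)
    for i, entities in enumerate(passages):
        for e in entities:
            index[e].add(i)

    paths = defaultdict(list)  # candidate -> list of (passage1, bridge, passage2)
    for p1 in index.get(question_entity, set()):
        for bridge in passages[p1] - {question_entity}:
            for p2 in index[bridge] - {p1}:
                for cand in candidates & passages[p2]:
                    paths[cand].append((p1, bridge, p2))
    return paths

def best_candidate(paths):
    # Toy scorer: the candidate supported by the most paths wins.
    return max(paths, key=lambda c: len(paths[c])) if paths else None
```

For example, with passages mentioning {"Obama", "Hawaii"}, {"Hawaii", "USA"}, and {"Paris", "France"}, the question entity "Obama" reaches the candidate "USA" via the bridge entity "Hawaii". PathNet replaces this hard entity matching and count-based scoring with learned path representations and scores, without direct path supervision.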
Multi-hop Reading Comprehension across Documents with Path-based Graph Convolutional Network
This paper constructs a path-based reasoning graph from supporting documents, with a question-aware gating mechanism that regulates how useful information propagates across documents and adds question information during reasoning.
Dynamic Reasoning Network for Multi-hop Question Answering
A Dynamic Reasoning Network (DRN) is proposed: a novel approach that obtains correct answers by multi-hop reasoning over multiple passages, with a query-reshaping mechanism that revisits the query repeatedly to mimic people's reading habits.
A Sentence-Based Circular Reasoning Model in Multi-Hop Reading Comprehension
This research proposes a Sentence-based Circular Reasoning (SCR) approach, which starts from sentence representations and unites the query to establish a reasoning path based on a loop inference unit, along with a nested mechanism that extends the probability distribution used for weighting.
Multi-hop Reading Comprehension across Multiple Documents by Reasoning over Heterogeneous Graphs
This paper introduces a heterogeneous graph with different types of nodes and edges, named the Heterogeneous Document-Entity (HDE) graph, which captures different granularities of information, including candidates, documents, and entities in specific document contexts.
Learning to Generate Multi-Hop Knowledge Paths for Commonsense Question Answering
2020
Commonsense question answering (QA) requires a model of general background knowledge about how the world operates and how people interact with each other before reasoning. Prior works focus primarily…
Differentiable Reasoning over a Virtual Knowledge Base
A neural module, DrKIT, traverses textual data like a virtual KB, softly following paths of relations between mentions of entities in the corpus; it improves accuracy by 9 points on 3-hop questions in the MetaQA dataset and is very efficient.
We consider the task of answering complex multi-hop questions using a corpus as a virtual knowledge base (KB). In particular, we describe a neural module, DrKIT, that traverses textual data like a…
Learning Reasoning Paths over Semantic Graphs for Video-grounded Dialogues
A novel framework, Reasoning Paths in Dialogue Context (PDC), discovers information flows among dialogue turns through a semantic graph constructed from lexical components in each question and answer, and learns to predict reasoning paths over this semantic graph.
Coarse-grained decomposition and fine-grained interaction for multi-hop question answering
A new model architecture for multi-hop question answering is proposed, applying two completion strategies to decompose complex questions into simple ones without any additional annotations; experimental results show that the method outperforms state-of-the-art baselines.
Learning to Recover Reasoning Chains for Multi-Hop Question Answering via Cooperative Games
A cooperative-game approach is proposed for the new problem of learning to recover reasoning chains from weakly supervised signals: two cooperating models handle how evidence passages are selected and how the selected passages are connected, choosing the most confident chains from a large set of candidates.

References

Showing 1-10 of 40 references
Constructing Datasets for Multi-hop Reading Comprehension Across Documents
A novel task to encourage the development of models for text understanding across multiple documents and to investigate the limits of existing methods, in which a model learns to seek and combine evidence, effectively performing multi-hop (alias multi-step) inference.
Exploring Graph-structured Passage Representation for Multi-hop Reading Comprehension with Graph Neural Networks
A new method for better connecting global evidence is introduced, which forms more complex graphs than DAGs; experiments on two standard datasets show that richer global information leads to better answers.
Question Answering by Reasoning Across Documents with Graph Convolutional Networks
A neural model is introduced that integrates and reasons over information spread within and across multiple documents, achieving state-of-the-art results on a multi-document question answering dataset, WikiHop.
Compositional Learning of Embeddings for Relation Paths in Knowledge Base and Text
This paper proposes the first exact dynamic programming algorithm that enables efficient incorporation of all relation paths of bounded length, while modeling both relation types and intermediate nodes in the compositional path representations.
Question Answering as Global Reasoning Over Semantic Abstractions
This work presents the first system that reasons over a wide range of semantic abstractions of the text, derived using off-the-shelf, general-purpose, pre-trained natural language modules such as semantic role labelers, coreference resolvers, and dependency parsers.
Bidirectional Attention Flow for Machine Comprehension
The BiDAF network is introduced: a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.
Chains of Reasoning over Entities, Relations, and Text using Recurrent Neural Networks
This paper learns to jointly reason about relations, entities, and entity types, using neural attention to incorporate multiple paths in a single RNN that represents logical composition across all relations.
Higher-order Lexical Semantic Models for Non-factoid Answer Reranking
This work introduces a higher-order formalism that allows these lexical semantic models to chain direct evidence into indirect associations between question and answer texts, by casting the task as the traversal of graphs that encode direct term associations.
Traversing Knowledge Graphs in Vector Space
It is demonstrated that compositional training acts as a novel form of structural regularization, reliably improving performance across all base models (reducing errors by up to 43%) and achieving new state-of-the-art results.
Semantic Parsing on Freebase from Question-Answer Pairs
This paper trains a semantic parser that scales up to Freebase and outperforms the state-of-the-art parser on the dataset of Cai and Yates (2013), despite not having annotated logical forms.