Corpus ID: 236428903

Graph-free Multi-hop Reading Comprehension: A Select-to-Guide Strategy

@article{Wu2021GraphfreeMR,
  title={Graph-free Multi-hop Reading Comprehension: A Select-to-Guide Strategy},
  author={Bohong Wu and Zhuosheng Zhang and Hai Zhao},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.11823}
}
Multi-hop reading comprehension (MHRC) requires not only predicting the correct answer span in the given passage, but also providing a chain of supporting evidence for reasoning interpretability. It is natural to model such a process as a graph structure by understanding multi-hop reasoning as jumping over entity nodes, which has made graph modelling dominant on this task. Recently, there have been dissenting voices about whether graph modelling is indispensable due to the inconvenience of…
Representation Decoupling for Open-Domain Passage Retrieval
  • Bohong Wu, Zhuosheng Zhang, Jinyuan Wang, Hai Zhao
  • Computer Science
  • 2021
TLDR
This work proposes to resolve the conflicts arising from the widely used CL strategy in ODPR by decoupling passage representations into contextual sentence-level ones, and designs specific CL strategies to mediate these conflicts.
Paradigm Shift in Natural Language Processing
  • Tianxiang Sun, Xiangyang Liu, Xipeng Qiu, Xuanjing Huang
  • Computer Science
  • ArXiv
  • 2021
TLDR
This paper highlights several paradigms that have the potential to solve different NLP tasks, including the sequence labeling paradigm and the classification paradigm, which have achieved great success on many tasks.

References

Showing 1-10 of 42 references
Multi-hop Reading Comprehension across Multiple Documents by Reasoning over Heterogeneous Graphs
TLDR
This paper introduces a heterogeneous graph with different types of nodes and edges, named the Heterogeneous Document-Entity (HDE) graph, which contains different granularity levels of information, including candidates, documents, and entities in specific document contexts.
Exploring Graph-structured Passage Representation for Multi-hop Reading Comprehension with Graph Neural Networks
TLDR
A new method for better connecting global evidence is introduced, which forms more complex graphs compared to DAGs; experiments on two standard datasets show that richer global information leads to better answers.
Cognitive Graph for Multi-Hop Reading Comprehension at Scale
TLDR
The implementation, based on BERT and a graph neural network, efficiently handles millions of documents for multi-hop reasoning questions on the HotpotQA fullwiki dataset, achieving a winning joint F1 score of 34.9 on the leaderboard.
Dynamically Fused Graph Network for Multi-hop Reasoning
TLDR
Dynamically Fused Graph Network is proposed, a novel method to answer questions requiring multiple pieces of scattered evidence and reasoning over them, inspired by humans' step-by-step reasoning behavior.
Multi-hop Reading Comprehension through Question Decomposition and Rescoring
TLDR
A system is proposed that decomposes a compositional question into simpler sub-questions answerable by off-the-shelf single-hop RC models, along with a new global rescoring approach that considers each decomposition to select the best final answer, greatly improving overall performance.
Dual Co-Matching Network for Multi-choice Reading Comprehension
TLDR
A dual co-matching network (DCMN) is proposed, which models the relationship among passage, question, and answer options bidirectionally and integrates two reading strategies into the model, obtaining state-of-the-art results on five multi-choice reading comprehension datasets.
Select, Answer and Explain: Interpretable Multi-hop Reading Comprehension over Multiple Documents
TLDR
This paper proposes an effective and interpretable Select, Answer and Explain (SAE) system to solve the multi-document RC problem, achieving top competitive performance in the distractor setting compared to other existing systems on the leaderboard.
Constructing Datasets for Multi-hop Reading Comprehension Across Documents
TLDR
A novel task is proposed to encourage the development of models for text understanding across multiple documents and to investigate the limits of existing methods, in which a model learns to seek and combine evidence, effectively performing multi-hop (alias multi-step) inference.
A Simple Yet Strong Pipeline for HotpotQA
TLDR
This paper presents a simple pipeline based on BERT that outperforms large-scale language models on both question answering and support identification on HotpotQA, achieving performance very close to a RoBERTa model.
Question Answering by Reasoning Across Documents with Graph Convolutional Networks
TLDR
A neural model is introduced that integrates and reasons over information spread within and across multiple documents, achieving state-of-the-art results on a multi-document question answering dataset, WikiHop.