DREAM: A Challenge Data Set and Models for Dialogue-Based Reading Comprehension

@article{Sun2019DREAMAC,
  title={DREAM: A Challenge Data Set and Models for Dialogue-Based Reading Comprehension},
  author={Kai Sun and Dian Yu and Jianshu Chen and Dong Yu and Yejin Choi and Claire Cardie},
  journal={Transactions of the Association for Computational Linguistics},
  year={2019},
  volume={7},
  pages={217--231}
}
We present DREAM, the first dialogue-based multiple-choice reading comprehension data set. [...] We apply several popular neural reading comprehension models that primarily exploit surface information within the text and find that they, at best, just barely outperform a rule-based approach. We next investigate the effects of incorporating dialogue structure and different kinds of general world knowledge into both rule-based and (neural and non-neural) machine learning-based reading comprehension models. [...]

Citations

Publications citing this paper (showing 1-10 of 17 citations):

ReClor: A Reading Comprehension Dataset Requiring Logical Reasoning

Dual Multi-head Co-attention for Multi-choice Reading Comprehension

A New Multi-choice Reading Comprehension Dataset for Curriculum Learning

Beyond English-only Reading Comprehension: Experiments in Zero-Shot Multilingual Transfer for Bulgarian

Evidence Sentence Extraction for Machine Reading Comprehension

Finding Generalizable Evidence by Learning to Convince Q&A Models

FriendsQA: Open-Domain Question Answering on TV Show Transcripts

What does BERT Learn from Multiple-Choice Reading Comprehension Datasets?

References

Publications referenced by this paper (showing 1-10 of 50 references):

Improving Language Understanding by Generative Pre-Training

MS MARCO: A Human Generated MAchine Reading COmprehension Dataset
