Simple and Effective Multi-Paragraph Reading Comprehension

Christopher Clark and Matt Gardner
We introduce a method of adapting neural paragraph-level question answering models to the case where entire documents are given as input. Most current question answering models cannot scale to document or multi-document input, and naively applying these models to each paragraph independently often results in them being distracted by irrelevant text. We show that it is possible to significantly improve performance by using a modified training scheme that teaches the model to ignore non-answer…
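The truncated abstract refers to a modified training scheme that teaches the model to ignore paragraphs containing no answer. A minimal sketch of one objective of this kind is to score candidate answer spans in every paragraph sampled from a document and normalize those scores jointly, with a single softmax across paragraphs, so that non-answer paragraphs must compete with answer-bearing ones. The function name and data layout below are illustrative assumptions, not the paper's actual implementation:

```python
import math

def shared_norm_loss(paragraph_scores, answer_mask):
    """Negative log-likelihood of the correct answer spans under a softmax
    taken jointly over all candidate spans from all sampled paragraphs.

    paragraph_scores: one list of span scores per paragraph.
    answer_mask: parallel structure; True marks a correct answer span.
    (Both names and the flat-list layout are assumptions for illustration.)
    """
    # Flatten scores across paragraphs so one softmax spans the whole document,
    # rather than normalizing each paragraph independently.
    flat_scores = [s for para in paragraph_scores for s in para]
    flat_mask = [m for para in answer_mask for m in para]

    # Numerically stable softmax: subtract the max before exponentiating.
    top = max(flat_scores)
    exps = [math.exp(s - top) for s in flat_scores]
    z = sum(exps)

    # Probability mass assigned to any correct span, summed over paragraphs.
    p_correct = sum(e for e, is_ans in zip(exps, flat_mask) if is_ans) / z
    return -math.log(p_correct)
```

Because the normalization is shared, lowering the loss requires the model not only to rank the correct span first within its paragraph, but also to suppress high-scoring spans in irrelevant paragraphs.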
Highly cited: this paper has 60 citations and has been referenced on Twitter 44 times.
