Attention-over-Attention Neural Networks for Reading Comprehension

@inproceedings{Cui2017AttentionoverAttentionNN,
  title={Attention-over-Attention Neural Networks for Reading Comprehension},
  author={Yiming Cui and Zhipeng Chen and Si Wei and Shijin Wang and Ting Liu and Guoping Hu},
  booktitle={ACL},
  year={2017}
}
Cloze-style queries are representative problems in reading comprehension. Over the past few months, much progress has been made in applying neural network approaches to Cloze-style questions. In this paper, we present a novel model called the attention-over-attention reader for the Cloze-style reading comprehension task. Our model places another attention mechanism over the document-level attention and induces "attended attention" for final predictions. Unlike previous works, our…
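The "attended attention" idea from the abstract can be sketched in a few lines: compute a pairwise match matrix between document and query tokens, take a column-wise softmax (document-level attention per query word) and a row-wise softmax (query-level attention), then weight the former by the averaged latter. This is a minimal NumPy sketch assuming dot-product match scores; function and variable names are illustrative, not from the paper's code.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_over_attention(doc, query):
    """doc: (N, d) document token vectors; query: (M, d) query token vectors.
    Returns a length-N distribution over document tokens ("attended attention")."""
    M = doc @ query.T                  # pairwise match scores, shape (N, M)
    alpha = softmax(M, axis=0)         # column-wise: doc-level attention per query word
    beta = softmax(M, axis=1)          # row-wise: query-level attention per doc word
    beta_avg = beta.mean(axis=0)       # averaged query-level attention, shape (M,)
    return alpha @ beta_avg            # weighted sum of the alpha columns, shape (N,)
```

Because each column of `alpha` sums to 1 and `beta_avg` sums to 1, the result is itself a valid distribution over document positions; in the Cloze setting, candidate answers are then scored by summing this distribution over each candidate's occurrences in the document.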


Citations

Publications citing this paper (142 total; the following are each marked as highly influential citations):

  • DIM Reader: Dual Interaction Model for Machine Comprehension

  • BioRead: A New Dataset for Biomedical Reading Comprehension

  • Neural Attention Reader for Video Comprehension

  • Subword-augmented Embedding for Cloze Reading Comprehension

  • Contextual Aware Joint Probability Model Towards Question Answering System


CITATION STATISTICS

  • 25 highly influential citations

  • An average of 43 citations per year from 2017 through 2019
