Attention-over-Attention Neural Networks for Reading Comprehension

@inproceedings{Cui2017AttentionoverAttentionNN,
  title={Attention-over-Attention Neural Networks for Reading Comprehension},
  author={Yiming Cui and Z. Chen and Si Wei and Shijin Wang and T. Liu and Guoping Hu},
  booktitle={ACL},
  year={2017}
}
Cloze-style queries are representative problems in reading comprehension. [...] Unlike previous works, our neural network model requires fewer pre-defined hyper-parameters and uses an elegant architecture for modeling. Experimental results show that the proposed attention-over-attention model significantly outperforms various state-of-the-art systems by a large margin on public datasets, such as the CNN and Children's Book Test datasets.
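The attention-over-attention idea can be illustrated with a minimal NumPy sketch: compute a pairwise matching matrix between document and query hidden states, take a column-wise softmax (document-level attention per query word) and a row-wise softmax averaged over document positions (query-level attention), then weight the former by the latter to get the final "attended attention" over document positions. This is an illustrative sketch, not the paper's implementation; the states here are random stand-ins for encoder outputs, and all variable names are assumptions.

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_over_attention(doc, qry):
    """doc: (D, h) document hidden states; qry: (Q, h) query hidden states.
    Returns a (D,) attention distribution over document positions."""
    M = doc @ qry.T                 # (D, Q) pairwise matching scores
    alpha = softmax(M, axis=0)      # column-wise: document-level attention per query word
    beta = softmax(M, axis=1)       # row-wise: query-level attention per document word
    beta_avg = beta.mean(axis=0)    # (Q,) averaged query-level attention
    return alpha @ beta_avg         # (D,) attended document-level attention

# Toy example with random stand-in hidden states (hypothetical shapes).
rng = np.random.default_rng(0)
doc = rng.normal(size=(7, 4))   # 7 document positions, hidden size 4
qry = rng.normal(size=(3, 4))   # 3 query positions, hidden size 4
s = attention_over_attention(doc, qry)
```

Because each column of `alpha` sums to 1 and `beta_avg` sums to 1, the result `s` is itself a valid probability distribution over document positions, which is what makes the second attention layer "over" the first well-defined.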
Mnemonic Reader for Machine Comprehension
Multi-Head Bidirectional Attention for MRC
Examination-Style Reading Comprehension with Neural augmented Retrieval
Bidirectional Attention Flow for Machine Comprehension
Hierarchical Attention Flow for Multiple-Choice Reading Comprehension
Contextual Recurrent Units for Cloze-style Reading Comprehension
