Corpus ID: 8535316

Bidirectional Attention Flow for Machine Comprehension

@article{Seo2017BidirectionalAF,
  title={Bidirectional Attention Flow for Machine Comprehension},
  author={Minjoon Seo and Aniruddha Kembhavi and Ali Farhadi and Hannaneh Hajishirzi},
  journal={ArXiv},
  year={2017},
  volume={abs/1611.01603}
}
Machine comprehension (MC), answering a query about a given context paragraph, requires modeling complex interactions between the context and the query. [...] In this paper we introduce the Bi-Directional Attention Flow (BIDAF) network, a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization. Our experimental evaluations show that our…
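The abstract's central idea, letting attention flow in both directions (context-to-query and query-to-context) to build a query-aware context representation, can be sketched in NumPy. This is a minimal illustration of the attention-flow step only, under assumed shapes and a made-up weight vector `w`; the actual model surrounds this layer with learned embeddings, LSTMs, and trained parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U, w):
    """Sketch of the BiDAF attention flow layer.

    H: context representations, shape (T, d)
    U: query representations, shape (J, d)
    w: similarity weights, shape (3d,)  -- a stand-in for a trained vector
    Returns the query-aware context representation G, shape (T, 4d).
    """
    T, _ = H.shape
    J, _ = U.shape
    # Similarity matrix: S[t, j] = w . [h_t; u_j; h_t * u_j]
    S = np.empty((T, J))
    for t in range(T):
        for j in range(J):
            S[t, j] = w @ np.concatenate([H[t], U[j], H[t] * U[j]])
    # Context-to-query: each context word attends over all query words.
    a = softmax(S, axis=1)           # (T, J)
    U_tilde = a @ U                  # (T, d) attended query vectors
    # Query-to-context: weight context words by their best query match.
    b = softmax(S.max(axis=1))       # (T,)
    h_hat = b @ H                    # (d,) single attended context vector
    H_hat = np.tile(h_hat, (T, 1))   # (T, d) broadcast to every position
    # No early summarization: concatenate per-position features.
    return np.concatenate([H, U_tilde, H * U_tilde, H * H_hat], axis=1)
```

Note how neither direction collapses the context into one fixed vector before the output: the result keeps one row per context position, which is what "without early summarization" refers to.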

    Figures, Tables, and Topics from this paper.

    [Figure: BiDAF model architecture, with layers labeled Query Attention, GloVe, CNN, Attention Flow Layer, Modeling Layer, and Output Layer.]

    Citations of this paper:
    • Unilateral Multi-Perspective Matching for Machine Comprehension (Sigberto Alarcon Viesca, 2017)
    • Pay More Attention - Neural Architectures for Question-Answering
    • Question Answering with Gated Attention and Multitask Learning-Option 3 (Graded)
    • Hierarchical Attention: What Really Counts in Various NLP Tasks (2018)
    • Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering
    • Question Answering Using Hierarchical Attention on Top of BERT Features

    References

    Publications referenced by this paper (showing 1-10 of 39 references):
    • Gated-Attention Readers for Text Comprehension
    • Attention-over-Attention Neural Networks for Reading Comprehension
    • Text Understanding with the Attention Sum Reader Network
    • Iterative Alternating Neural Attention for Machine Reading
    • Machine Comprehension Using Match-LSTM and Answer Pointer
    • Dynamic Memory Networks for Visual and Textual Question Answering
    • Stacked Attention Networks for Image Question Answering
    • Dynamic Coattention Networks For Question Answering