• Computer Science
  • Published in ICLR 2017

Bidirectional Attention Flow for Machine Comprehension

@article{Seo2016BidirectionalAF,
  title={Bidirectional Attention Flow for Machine Comprehension},
  author={Minjoon Seo and Aniruddha Kembhavi and Ali Farhadi and Hannaneh Hajishirzi},
  journal={ArXiv},
  year={2016},
  volume={abs/1611.01603}
}
Machine comprehension (MC), answering a query about a given context paragraph, requires modeling complex interactions between the context and the query. [...] In this paper we introduce the Bi-Directional Attention Flow (BIDAF) network, a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization. Our experimental evaluations show that our…
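
The attention-flow layer summarized in the abstract can be illustrated with a short NumPy sketch: a similarity score for every (context word, query word) pair drives both context-to-query and query-to-context attention, and the results are concatenated per context position instead of being summarized into a single vector. This is a minimal sketch under assumed shapes, not the authors' implementation; the names (bidaf_attention, H, U, w) and the toy dimensions are illustrative only.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U, w):
    """
    H : (T, d) contextual embeddings of the T context words
    U : (J, d) contextual embeddings of the J query words
    w : (3*d,) trainable weights of the similarity function
        alpha(h, u) = w^T [h; u; h * u]
    Returns G : (T, 4*d) query-aware context representation.
    """
    T, d = H.shape
    J, _ = U.shape

    # Similarity matrix S[t, j] = w^T [h_t; u_j; h_t * u_j]
    h = H[:, None, :]                                   # (T, 1, d)
    u = U[None, :, :]                                   # (1, J, d)
    feats = np.concatenate([np.broadcast_to(h, (T, J, d)),
                            np.broadcast_to(u, (T, J, d)),
                            h * u], axis=-1)            # (T, J, 3d)
    S = feats @ w                                       # (T, J)

    # Context-to-query: which query words are relevant to each context word
    a = softmax(S, axis=1)                              # (T, J)
    U_tilde = a @ U                                     # (T, d)

    # Query-to-context: which context words matter most for some query word
    b = softmax(S.max(axis=1), axis=0)                  # (T,)
    h_tilde = b @ H                                     # (d,)
    H_tilde = np.tile(h_tilde, (T, 1))                  # (T, d)

    # Merge into G; every context position keeps its own vector
    # (no early summarization of the context into a fixed-size vector).
    return np.concatenate([H, U_tilde, H * U_tilde, H * H_tilde], axis=1)

# Toy usage with random inputs
rng = np.random.default_rng(0)
T, J, d = 6, 4, 8
G = bidaf_attention(rng.normal(size=(T, d)),
                    rng.normal(size=(J, d)),
                    rng.normal(size=3 * d))
print(G.shape)   # (6, 32)

In the full model, H and U would come from bidirectional LSTMs over word and character embeddings, and the output G feeds the subsequent modeling and answer-span layers.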

Citations

Publications citing this paper.
SHOWING 1-10 OF 845 CITATIONS

Multi-Passage Machine Reading Comprehension with Cross-Passage Answer Verification

VIEW 7 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

RNN on Machine Reading Comprehension Bi-Directional Attention Flow model

VIEW 7 EXCERPTS
CITES METHODS
HIGHLY INFLUENCED

Answering Event-Related Questions over Long-term News Article Archives

VIEW 12 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Video Alignment Using Bi-Directional Attention Flow in a Multi-Stage Learning Model

VIEW 5 EXCERPTS
CITES METHODS
HIGHLY INFLUENCED

An Empirical Evaluation on Word Embeddings Across Reading Comprehension

VIEW 12 EXCERPTS
CITES BACKGROUND
HIGHLY INFLUENCED

Answering Science Exam Questions Using Query Reformulation with Background Knowledge

VIEW 5 EXCERPTS
CITES RESULTS & METHODS
HIGHLY INFLUENCED

Augmenting Neural Networks with First-order Logic

VIEW 6 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

BertNet: Combining BERT language representation with Attention and CNN for Reading Comprehension

  • Girish Limaye
  • 2019
VIEW 5 EXCERPTS
CITES BACKGROUND & METHODS
HIGHLY INFLUENCED

CITATION STATISTICS

  • 264 Highly Influenced Citations

  • Averaged 274 citations per year from 2017 through 2019

  • 61% increase in citations in 2019 over 2018

References

Publications referenced by this paper.
SHOWING 1-10 OF 30 REFERENCES

Long Short-Term Memory

VIEW 11 EXCERPTS
HIGHLY INFLUENTIAL

Teaching Machines to Read and Comprehend

VIEW 6 EXCERPTS
HIGHLY INFLUENTIAL

ADADELTA: An Adaptive Learning Rate Method

VIEW 2 EXCERPTS
HIGHLY INFLUENTIAL