Dynamic Coattention Networks For Question Answering

@article{Xiong2016DynamicCN,
  title={Dynamic Coattention Networks For Question Answering},
  author={Caiming Xiong and Victor Zhong and Richard Socher},
  journal={ArXiv},
  year={2016},
  volume={abs/1611.01604}
}
Several deep learning models have been proposed for question answering. However, due to their single-pass nature, they have no way to recover from local maxima corresponding to incorrect answers. To address this problem, we introduce the Dynamic Coattention Network (DCN) for question answering. The DCN first fuses co-dependent representations of the question and the document in order to focus on relevant parts of both. Then a dynamic pointer decoder iterates over potential answer spans. This iterative procedure enables the model to recover from initial local maxima corresponding to incorrect answers.
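The coattention fusion sketched in the abstract can be illustrated with a minimal numpy example. This is a simplified sketch, not the paper's full encoder: it omits the sentinel vectors and the bidirectional-LSTM fusion layer the DCN applies afterwards, and the function and variable names (`coattention`, `D`, `Q`) are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def coattention(D, Q):
    """Fuse co-dependent document/question representations (simplified sketch).

    D: (m, h) document word encodings; Q: (n, h) question word encodings.
    Returns an (m, 2h) coattention context for the document.
    """
    L = D @ Q.T               # (m, n) affinity between every doc and question word
    A_Q = softmax(L, axis=0)  # attention over document words, per question word
    A_D = softmax(L, axis=1)  # attention over question words, per document word
    C_Q = A_Q.T @ D           # (n, h) document summaries attended to each question word
    # Attend over both the question encodings and the question-aware summaries.
    C_D = A_D @ np.concatenate([Q, C_Q], axis=1)  # (m, 2h)
    return C_D

doc = np.random.randn(5, 4)       # 5 document words, hidden size 4
question = np.random.randn(3, 4)  # 3 question words
context = coattention(doc, question)
print(context.shape)              # (5, 8)
```

The key point is the two softmaxes over the same affinity matrix: attention flows in both directions, so the final document representation is conditioned on the question and vice versa.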


Key Quantitative Results

  • On the Stanford question answering dataset, a single DCN model improves the previous state of the art from 71.0% F1 to 75.9%, while a DCN ensemble obtains 80.4% F1.

Citations

Publications citing this paper.
SHOWING 1-10 OF 286 CITATIONS

Open-Domain Question Answering using Feature Encoded Dynamic Coattention Networks

  • 2018 International Conference on Advances in Computing, Communications and Informatics (ICACCI)
  • 2018
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Pay More Attention - Neural Architectures for Question-Answering

CITES BACKGROUND & METHODS
HIGHLY INFLUENCED

An End-to-End Neural Architecture for Reading Comprehension

Meredith Burkle, M. Camacho
  • 2017
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

FusionNet: Fusing via Fully-Aware Attention

CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Filter-Context Dynamic Coattention Networks for Question Answering

CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

SQuAD Reading Comprehension with Coattention

CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Machine Reading Comprehension: a Literature Review

Xin Zhang, An Yang, Sujian Li, Yizhong Wang
  • ArXiv
  • 2019
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Neural Machine Reading Comprehension: Methods and Trends

CITES METHODS & BACKGROUND
HIGHLY INFLUENCED


CITATION STATISTICS

  • 67 Highly Influenced Citations

  • Averaged 94 citations per year from 2017 through 2019
