Corpus ID: 4842909

QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension

@article{Yu2018QANetCL,
  title={QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension},
  author={Adams Wei Yu and David Dohan and Minh-Thang Luong and R. Zhao and Kai Chen and Mohammad Norouzi and Quoc V. Le},
  journal={ArXiv},
  year={2018},
  volume={abs/1804.09541}
}
Current end-to-end machine reading and question answering (Q&A) models are primarily based on recurrent neural networks (RNNs) with attention. [...] Key Method: The speed-up gain allows us to train the model with much more data. We hence combine our model with data generated by backtranslation from a neural machine translation model. On the SQuAD dataset, our single model, trained with augmented data, achieves 84.6 F1 score on the test set, which is significantly better than the best published F1 score of 81.8.
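
The data-augmentation step summarized above, paraphrasing training passages by round-trip ("backtranslation") through a neural machine translation model, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' pipeline: it uses publicly available English-French MarianMT checkpoints from the Hugging Face transformers library in place of the NMT system described in the paper, and it omits the re-alignment of answer spans to the paraphrased passage that a full pipeline would need.

```python
# Sketch of backtranslation-based data augmentation (assumption: public
# MarianMT checkpoints stand in for the paper's own English-French NMT model).
from transformers import MarianMTModel, MarianTokenizer


def load_translator(model_name: str):
    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)
    return tokenizer, model


def translate(sentences, tokenizer, model):
    batch = tokenizer(sentences, return_tensors="pt", padding=True, truncation=True)
    outputs = model.generate(**batch)
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)


# English -> French -> English round trip produces paraphrased passages that
# can be added to the training data (answer spans then need to be re-located
# in the paraphrased text before training).
en_fr = load_translator("Helsinki-NLP/opus-mt-en-fr")
fr_en = load_translator("Helsinki-NLP/opus-mt-fr-en")

passages = ["Tesla moved to the city of Prague in 1880 to study philosophy."]
paraphrases = translate(translate(passages, *en_fr), *fr_en)
print(paraphrases)
```
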
545 Citations
Question Answering with Gated Attention and Multitask Learning-Option 3 (Graded)
  • Highly Influenced
Transformer with CSAN for Question Answering: SQuAD 2.0
  • 2019
Beyond RNNs: Positional Self-Attention with Co-Attention for Video Question Answering
  • 66
Positioning Self in SQuAD
  • Highly Influenced
Question Answering with Self-Attention and Residuals
  • Highly Influenced
Gaussian Transformer: A Lightweight Approach for Natural Language Inference
  • 24
  • Highly Influenced
QANet: Convolutions and Attention are All you Need
  • Highly Influenced
R-Trans: RNN Transformer Network for Chinese Machine Reading Comprehension
  • 15
  • Highly Influenced
BERT for Question Answering on SQuAD 2.0
  • 3
  • Highly Influenced

References

Showing 1-10 of 53 references
Attention is All you Need
  • 18,205
  • Highly Influential
Effective Approaches to Attention-based Neural Machine Translation
  • 4,771
DiSAN: Directional Self-Attention Network for RNN/CNN-free Language Understanding
  • 402
Gated Self-Matching Networks for Reading Comprehension and Question Answering
  • 487
Globally Normalized Reader
  • 16
  • Highly Influential
Machine Comprehension Using Match-LSTM and Answer Pointer
  • 437
Learning to Skim Text
  • 86