Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension

@inproceedings{Yang2019EnhancingPL,
  title={Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension},
  author={An Yang and Quan Wang and Jing Liu and Kai Liu and Yajuan Lyu and Hua Wu and Qiaoqiao She and Sujian Li},
  booktitle={ACL},
  year={2019}
}
Machine reading comprehension (MRC) is a crucial and challenging task in NLP. Recently, pre-trained language models (LMs), especially BERT, have achieved remarkable success, presenting new state-of-the-art results in MRC. In this work, we investigate the potential of leveraging external knowledge bases (KBs) to further improve BERT for MRC. We introduce KT-NET, which employs an attention mechanism to adaptively select desired knowledge from KBs, and then fuses selected knowledge with BERT to …
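
The abstract only sketches the model at a high level. As a rough illustration of the described idea, the snippet below shows one way attention-based knowledge selection and fusion with BERT states could be written in PyTorch. The module name, dimensions, and the assumption that candidate KB concept embeddings have already been retrieved per token are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): for each BERT token representation,
# attend over candidate KB concept embeddings retrieved for that token, then
# fuse the attended knowledge vector back into the BERT representation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class KnowledgeAttentionFusion(nn.Module):
    def __init__(self, bert_dim: int = 768, kb_dim: int = 100):
        super().__init__()
        # Project BERT states into the KB embedding space to score candidates.
        self.query_proj = nn.Linear(bert_dim, kb_dim)
        # Fuse the original BERT state with the attended knowledge vector.
        self.fuse = nn.Linear(bert_dim + kb_dim, bert_dim)

    def forward(self, bert_states, kb_candidates, kb_mask):
        # bert_states:   (batch, seq_len, bert_dim)       contextual token states
        # kb_candidates: (batch, seq_len, n_cand, kb_dim)  candidate KB embeddings per token
        # kb_mask:       (batch, seq_len, n_cand)          1 for real candidates, 0 for padding
        query = self.query_proj(bert_states).unsqueeze(2)         # (b, s, 1, kb_dim)
        scores = (query * kb_candidates).sum(-1)                  # (b, s, n_cand)
        scores = scores.masked_fill(kb_mask == 0, float("-inf"))
        attn = F.softmax(scores, dim=-1)                          # adaptive selection
        attn = torch.nan_to_num(attn)                             # tokens with no candidates
        knowledge = (attn.unsqueeze(-1) * kb_candidates).sum(2)   # (b, s, kb_dim)
        fused = self.fuse(torch.cat([bert_states, knowledge], dim=-1))
        return fused                                              # knowledge-enriched states
```

In this reading, the softmax attention plays the role of "adaptively selecting desired knowledge," and the final linear layer is one simple choice of fusion; the actual KT-NET fusion and output layers are described in the full paper.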
