Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension

@inproceedings{Yang2019EnhancingPL,
  title={Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension},
  author={An Yang and Q. Wang and Jing Liu and Kai Liu and Yajuan Lyu and H. Wu and Qiaoqiao She and Sujian Li},
  booktitle={ACL},
  year={2019}
}
Machine reading comprehension (MRC) is a crucial and challenging task in NLP. [...] Key Method: We introduce KT-NET, which employs an attention mechanism to adaptively select desired knowledge from KBs, and then fuses the selected knowledge with BERT to enable context- and knowledge-aware predictions. We believe this combines the merits of both deep LMs and curated KBs towards better MRC. Experimental results indicate that KT-NET offers significant and consistent improvements over BERT, outperforming competitive…
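
The key method above lends itself to a short illustration. The following PyTorch-style sketch shows one way an attention mechanism could select, for each token, among candidate KB concept embeddings and fuse the attended result with the BERT hidden state to produce context- and knowledge-aware representations. This is a minimal sketch under stated assumptions, not the authors' implementation: the module name KnowledgeFusion, the learnable sentinel vector for the "no relevant knowledge" case, and all tensor shapes are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F

class KnowledgeFusion(nn.Module):
    # Hypothetical sketch: attention-based selection of KB concepts, fused with BERT states.
    def __init__(self, bert_dim=768, kb_dim=100):
        super().__init__()
        self.proj = nn.Linear(bert_dim, kb_dim, bias=False)  # score BERT states against KB embedding space
        self.sentinel = nn.Parameter(torch.zeros(kb_dim))    # learnable "no relevant knowledge" option

    def forward(self, hidden, concepts, concept_mask):
        # hidden:       (B, T, bert_dim)   BERT token representations
        # concepts:     (B, T, N, kb_dim)  candidate KB concept embeddings retrieved per token
        # concept_mask: (B, T, N)          1 for real candidates, 0 for padding
        query = self.proj(hidden)                                     # (B, T, kb_dim)
        scores = torch.einsum('btd,btnd->btn', query, concepts)       # relevance of each candidate
        scores = scores.masked_fill(concept_mask == 0, float('-inf'))
        sent = torch.einsum('btd,d->bt', query, self.sentinel).unsqueeze(-1)
        attn = F.softmax(torch.cat([scores, sent], dim=-1), dim=-1)   # (B, T, N+1), adaptive selection
        sentinel = self.sentinel.expand(*concepts.shape[:2], 1, -1)   # (B, T, 1, kb_dim)
        candidates = torch.cat([concepts, sentinel], dim=-2)          # (B, T, N+1, kb_dim)
        knowledge = torch.einsum('btn,btnd->btd', attn, candidates)   # attended knowledge state
        return torch.cat([hidden, knowledge], dim=-1)                 # context- and knowledge-aware output

# Hypothetical usage: 2 passages, 128 tokens, up to 8 candidate concepts per token.
# fusion = KnowledgeFusion()
# out = fusion(torch.randn(2, 128, 768), torch.randn(2, 128, 8, 100), torch.ones(2, 128, 8))
# out.shape == (2, 128, 868)

The sentinel option lets a token effectively opt out when none of its retrieved concepts is relevant, which is one plausible way to make the selection adaptive in the sense the abstract describes.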
    34 Citations


    StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding
    Towards Medical Machine Reading Comprehension with Structural Knowledge and Plain Text
    SegaBERT: Pre-training of Segment-aware BERT for Language Understanding
    Contextualized Representations Using Textual Encyclopedic Knowledge
    A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
    Improving Machine Reading Comprehension with Contextualized Commonsense Knowledge

    References

    Showing 1-10 of 41 references
    BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
    QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension
    Improving Language Understanding by Generative Pre-Training
    Commonsense for Generative Multi-Hop Question Answering Tasks
    Machine Comprehension Using Match-LSTM and Answer Pointer
    A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task
    Dynamic Integration of Background Knowledge in Neural NLU Systems
    Teaching Machines to Read and Comprehend