BERT for Coreference Resolution: Baselines and Analysis

@inproceedings{Joshi2019BERTFC,
  title={BERT for Coreference Resolution: Baselines and Analysis},
  author={Mandar Joshi and Omer Levy and Daniel S. Weld and Luke Zettlemoyer},
  booktitle={EMNLP/IJCNLP},
  year={2019}
}
  • Mandar Joshi, Omer Levy, Daniel S. Weld, Luke Zettlemoyer
  • Published in EMNLP/IJCNLP 2019
  • Computer Science
  • We apply BERT to coreference resolution, achieving strong improvements on the OntoNotes (+3.9 F1) and GAP (+11.5 F1) benchmarks. A qualitative analysis of model predictions indicates that, compared to ELMo and BERT-base, BERT-large is particularly better at distinguishing between related but distinct entities (e.g., President and CEO). However, there is still room for improvement in modeling document-level context, conversations, and mention paraphrasing. Our code and models are publicly available.
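
A minimal sketch of the core idea (not the authors' released code): encode the document with BERT and score candidate mention pairs using span representations built from the encoder's hidden states. The actual system is the coarse-to-fine model of Lee et al. (2018) with BERT replacing the ELMo/LSTM encoder, trained end-to-end on OntoNotes; the model name, the tiny pair scorer, and the token indices below are illustrative assumptions.

import torch
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
encoder = BertModel.from_pretrained("bert-base-cased")

text = "The CEO met the president. She thanked him."
enc = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**enc).last_hidden_state[0]  # (seq_len, 768)

def span_embedding(start, end):
    # Endpoint concatenation [h_start; h_end]; the full model also adds a
    # head-finding attention term and a span-width feature, omitted here.
    return torch.cat([hidden[start], hidden[end]])

# Untrained pairwise scorer, for shape illustration only; in the paper this
# FFNN is trained with a marginal log-likelihood over antecedent clusters.
pair_scorer = torch.nn.Sequential(
    torch.nn.Linear(4 * 768, 150),
    torch.nn.ReLU(),
    torch.nn.Linear(150, 1),
)

# Token indices are assumed for this example; real code would map character
# spans to wordpiece indices via the tokenizer's offset mapping.
mention_a = span_embedding(2, 2)  # e.g., "CEO"
mention_b = span_embedding(7, 7)  # e.g., "She"
score = pair_scorer(torch.cat([mention_a, mention_b]))
print(f"coreference link score: {score.item():.3f}")

Segmenting long documents around BERT's 512-token limit is the main practical wrinkle: the paper compares independent and overlapping segment variants and finds that document-level context is still modeled poorly, consistent with the abstract's caveats.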

    Citations

    Publications citing this paper (a selection from 25 total citations):

    • Revisiting Memory-Efficient Incremental Coreference Resolution (cites background, results & methods; highly influenced)
    • Coreferential Reasoning Learning for Language Representation (cites background & methods; highly influenced)
    • SpanBERT: Improving Pre-training by Representing and Predicting Spans (cites methods)
    • A Dutch coreference resolution system with an evaluation on literary fiction (cites background; highly influenced)
    • Blockwise Self-Attention for Long Document Understanding (cites background & methods)
    • Towards Discourse Parsing-inspired Semantic Storytelling (cites methods)

    CITATION STATISTICS

    • 8 highly influenced citations

    • Averaged 13 citations per year from 2019 through 2020