BERT for Coreference Resolution: Baselines and Analysis

@article{Joshi2019BERTFC,
  title={BERT for Coreference Resolution: Baselines and Analysis},
  author={Mandar Joshi and Omer Levy and Daniel S. Weld and Luke Zettlemoyer},
  journal={ArXiv},
  year={2019},
  volume={abs/1908.09091}
}
  • Mandar Joshi, Omer Levy, Daniel S. Weld, Luke Zettlemoyer
  • Published in EMNLP-IJCNLP 2019
We apply BERT to coreference resolution, achieving strong improvements on the OntoNotes (+3.9 F1) and GAP (+11.5 F1) benchmarks. A qualitative analysis of model predictions indicates that, compared to ELMo and BERT-base, BERT-large is particularly better at distinguishing between related but distinct entities (e.g., President and CEO). However, there is still room for improvement in modeling document-level context, conversations, and mention paraphrasing. Our code and models are publicly…
