Investigating Entity Knowledge in BERT with Simple Neural End-To-End Entity Linking

@article{Broscheit2019InvestigatingEK,
  title={Investigating Entity Knowledge in BERT with Simple Neural End-To-End Entity Linking},
  author={Samuel Broscheit},
  journal={ArXiv},
  year={2019},
  volume={abs/2003.05473}
}
A typical architecture for end-to-end entity linking systems consists of three steps: mention detection, candidate generation and entity disambiguation. In this study we investigate the following questions: (a) Can all those steps be learned jointly with a model for contextualized text-representations, i.e. BERT? (b) How much entity knowledge is already contained in pretrained BERT? (c) Does additional entity knowledge improve BERT’s performance in downstream tasks? To this end we propose an…
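To make the "simple neural end-to-end entity linking" idea concrete, the following is a minimal sketch of one way to collapse mention detection, candidate generation and entity disambiguation into a single per-token classification over an entity vocabulary on top of a pretrained BERT encoder. The checkpoint name, the ENTITY_VOCAB_SIZE constant and the "no entity" class convention are illustrative assumptions, not the paper's exact configuration.

# Illustrative sketch (not the paper's exact code): entity linking cast as
# per-token classification over an entity vocabulary on top of BERT.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

ENTITY_VOCAB_SIZE = 50_000  # assumed size; a real setup would use a much larger Wikipedia entity vocabulary

class PerTokenEntityLinker(nn.Module):
    def __init__(self, entity_vocab_size: int = ENTITY_VOCAB_SIZE):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        hidden_size = self.encoder.config.hidden_size
        # One score per entity; by convention here, index 0 means "no entity" for that token.
        self.classifier = nn.Linear(hidden_size, entity_vocab_size)

    def forward(self, input_ids, attention_mask):
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state                      # (batch, seq_len, hidden_size)
        return self.classifier(hidden_states)    # (batch, seq_len, entity_vocab_size)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = PerTokenEntityLinker()
batch = tokenizer(["Samuel Broscheit works on entity linking."], return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
predicted_entity_ids = logits.argmax(dim=-1)    # one entity id (or "no entity") per token

Training such a head would amount to minimizing a per-token cross-entropy loss against gold entity labels, so that mention boundaries, candidates and the final entity choice all fall out of one classification decision per token.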
Linking Entities to Unseen Knowledge Bases with Arbitrary Schemas
What more can Entity Linking do for Question Answering?
E-BERT: Efficient-Yet-Effective Entity Embeddings for BERT
PEL-BERT: A Joint Model for Protocol Entity Linking
Chinese Short Text Entity Disambiguation Based on the Dual-Channel Hybrid Network

References

Showing 1-10 of 26 references
Joint Learning of the Embedding of Words and Entities for Named Entity Disambiguation
Evaluating Entity Linking with Wikipedia
Improving Candidate Generation for Entity Linking
ERNIE: Enhanced Language Representation with Informative Entities
Learning Distributed Representations of Texts and Entities from Knowledge Base
Robust Disambiguation of Named Entities in Text
A Joint Model for Entity Analysis: Coreference, Typing, and Linking (Greg Durrett and Dan Klein, Transactions of the Association for Computational Linguistics, 2014)
J-NERD: Joint Named Entity Recognition and Disambiguation with Rich Linguistic Features
BERT Rediscovers the Classical NLP Pipeline