Corpus ID: 238408158

EntQA: Entity Linking as Question Answering

Wenzheng Zhang, Wenyue Hua, Karl Stratos
A conventional approach to entity linking is to first find mentions in a given document and then infer their underlying entities in the knowledge base. A well-known limitation of this approach is that it requires finding mentions without knowing their entities, which is unnatural and difficult. We present EntQA, a new model that does not suffer from this limitation; the name stands for Entity linking as Question Answering. EntQA first proposes candidate entities with a fast retrieval module…
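The retrieve-then-read idea in the abstract can be sketched as a two-stage pipeline. This is a minimal illustrative sketch, not the paper's actual model: the encoder below is a deterministic toy stand-in for a trained bi-encoder, the `lexicon` mapping entities to surface forms stands in for a trained QA-style span reader, and all function names are hypothetical.

```python
import zlib
import numpy as np

def embed(text, dim=16):
    # Toy stand-in for a trained encoder: a deterministic pseudo-random
    # unit vector seeded by a stable hash of the text.
    rng = np.random.default_rng(zlib.crc32(text.encode()))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def retrieve_candidates(document, entity_catalog, top_k=2):
    # Stage 1: propose candidate entities for the whole document
    # via dense retrieval (inner-product similarity).
    doc_vec = embed(document)
    scored = [(float(doc_vec @ embed(e)), e) for e in entity_catalog]
    scored.sort(reverse=True)
    return [e for _, e in scored[:top_k]]

def find_mentions(document, entity, lexicon):
    # Stage 2: treat the candidate entity as a "question" and locate
    # its mention spans in the document. Here a toy lexicon supplies
    # surface forms; a real reader would be a trained span predictor.
    spans = []
    for form in lexicon.get(entity, []):
        start = document.find(form)
        if start >= 0:
            spans.append((start, start + len(form), entity))
    return spans

def entqa_style_link(document, entity_catalog, lexicon, top_k=2):
    links = []
    for entity in retrieve_candidates(document, entity_catalog, top_k):
        links.extend(find_mentions(document, entity, lexicon))
    return links
```

The key design point the abstract highlights: entities are proposed *before* mentions are found, so the hard problem of detecting mentions without knowing their entities never arises.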


Exploiting Entity Linking in Queries for Entity Retrieval
A new probabilistic component is introduced, and it is shown how it can be applied on top of any term-based entity retrieval model that can be emulated in the Markov Random Field framework, including language models, sequential dependence models, and their fielded variations.
Joint Learning of Named Entity Recognition and Entity Linking
This paper jointly learns named entity recognition (NER) and entity linking (EL) to leverage their relatedness and obtain a more robust and generalisable system, introducing a model inspired by the Stack-LSTM approach.
Entity Linking via Joint Encoding of Types, Descriptions, and Context
This work presents a neural, modular entity linking system that learns a unified dense representation for each entity using multiple sources of information, such as its description, contexts around its mentions, and its fine-grained types.
Investigating Entity Knowledge in BERT with Simple Neural End-To-End Entity Linking
This study proposes an extreme simplification of the entity linking setup that works surprisingly well: simply cast it as per-token classification over the entire entity vocabulary, and shows on an entity linking benchmark that this model improves the entity representations over plain BERT.
End-to-End Neural Entity Linking
This work proposes the first neural end-to-end EL system that jointly discovers and links entities in a text document and shows that it significantly outperforms popular systems on the Gerbil platform when enough training data is available.
Entity-Relation Extraction as Multi-Turn Question Answering
The proposed multi-turn QA model achieves the best performance on the RESUME dataset, which requires multi-step reasoning to construct entity dependencies, as opposed to the single-step dependency extraction in the triplet extraction of previous datasets.
Learning Dense Representations for Entity Retrieval
We show that it is feasible to perform entity linking by training a dual encoder (two-tower) model that encodes mentions and entities in the same dense vector space…
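The dual-encoder ranking idea above can be illustrated in a few lines. This is a hedged sketch of the general technique, not that paper's implementation: the mention and entity vectors are assumed to already live in the same dense space, and `rank_entities` is a hypothetical helper name.

```python
import numpy as np

def rank_entities(mention_vec, entity_matrix):
    # entity_matrix has one row per entity embedding; mention_vec is the
    # mention embedding from the other tower. Candidates are ranked by
    # inner product, highest score first.
    scores = entity_matrix @ mention_vec
    return np.argsort(-scores)
```

Because scoring is a single matrix-vector product, all entity embeddings can be precomputed once and searched efficiently (e.g. with approximate nearest-neighbour indexes), which is what makes dense retrieval over large entity catalogues practical.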
Design Challenges for Entity Linking
This work analyzes differences between several versions of the EL problem and presents Vinculum, a simple yet effective, modular, unsupervised system for entity linking, elucidating key aspects of the system including mention extraction, candidate generation, entity type prediction, entity coreference, and coherence.
CorefQA: Coreference Resolution as Query-based Span Prediction
CorefQA is presented, an accurate and extensible approach to coreference resolution formulated as a span prediction task, as in question answering, which provides the flexibility of retrieving mentions left out at the mention proposal stage.
Robust Disambiguation of Named Entities in Text
A robust method for collective disambiguation is presented, harnessing context from knowledge bases and a new form of coherence graph; it significantly outperforms prior methods in accuracy, with robust behavior across a variety of inputs.