Knowledge Enhanced Contextual Word Representations

@inproceedings{Peters2019KnowledgeEC,
  title={Knowledge Enhanced Contextual Word Representations},
  author={Matthew E. Peters and Mark Neumann and Robert L. Logan IV and Roy Schwartz and Vidur Joshi and Sameer Singh and Noah A. Smith},
  booktitle={EMNLP/IJCNLP},
  year={2019}
}
Contextual word representations, typically trained on unstructured, unlabeled text, do not contain any explicit grounding to real world entities and are often unable to remember facts about those entities. [...] Key method: for each KB, we first use an integrated entity linker to retrieve relevant entity embeddings, then update contextual word representations via a form of word-to-entity attention.
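
As a rough illustration of the word-to-entity attention step described above, the following is a minimal PyTorch sketch, not the paper's released implementation: contextual word vectors are projected into the entity embedding space, attend over the candidate entity embeddings returned by an entity linker, and the attended entity summary is projected back and added to the word vectors. All module, parameter, and dimension names here (WordToEntityAttention, d_word, d_entity) are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class WordToEntityAttention(nn.Module):
    def __init__(self, d_word: int, d_entity: int):
        super().__init__()
        # Project word vectors into the entity embedding space (queries).
        self.word_proj = nn.Linear(d_word, d_entity)
        # Project the attended entity summary back to the word space.
        self.entity_proj = nn.Linear(d_entity, d_word)

    def forward(self, words: torch.Tensor, entities: torch.Tensor) -> torch.Tensor:
        # words:    (seq_len, d_word)        contextual word representations
        # entities: (num_candidates, d_entity) entity embeddings from the linker
        queries = self.word_proj(words)                        # (seq_len, d_entity)
        scores = queries @ entities.T / entities.shape[-1] ** 0.5
        attn = F.softmax(scores, dim=-1)                       # word-to-entity weights
        entity_summary = attn @ entities                       # (seq_len, d_entity)
        # Residual update: inject entity knowledge into the word vectors.
        return words + self.entity_proj(entity_summary)

# Toy usage: 5 tokens attending over 3 candidate entities.
layer = WordToEntityAttention(d_word=768, d_entity=300)
updated = layer(torch.randn(5, 768), torch.randn(3, 300))
print(updated.shape)  # torch.Size([5, 768])

In the model the abstract describes, the candidate entity embeddings would come from a KB-specific integrated entity linker rather than random tensors; this sketch only shows the attention-based update itself.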

Citations

Publications citing this paper.

K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters

CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation

CITES METHODS
HIGHLY INFLUENCED

How Can We Know What Language Models Know?

CITES METHODS & BACKGROUND

Measuring Social Bias in Knowledge Graph Embeddings

CITES BACKGROUND

References

Publications referenced by this paper.

WordNet: a lexical database for English

HIGHLY INFLUENTIAL

ERNIE: Enhanced Language Representation with Informative Entities

HIGHLY INFLUENTIAL

TuckER: Tensor Factorization for Knowledge Graph Completion

HIGHLY INFLUENTIAL

End-to-End Neural Entity Linking

HIGHLY INFLUENTIAL

Attention is All you Need

HIGHLY INFLUENTIAL

Wikidata: a free collaborative knowledgebase

HIGHLY INFLUENTIAL

Robust Disambiguation of Named Entities in Text

HIGHLY INFLUENTIAL