Inducing Relational Knowledge from BERT

@article{Bouraoui2020InducingRK,
  title={Inducing Relational Knowledge from BERT},
  author={Zied Bouraoui and Jos{\'e} Camacho-Collados and S. Schockaert},
  journal={ArXiv},
  year={2020},
  volume={abs/1911.12753}
}
  • Zied Bouraoui, José Camacho-Collados, Steven Schockaert
  • Published 2020
  • Computer Science
  • ArXiv
  • One of the most remarkable properties of word embeddings is the fact that they capture certain types of semantic and syntactic relationships. Recently, pre-trained language models such as BERT have achieved groundbreaking results across a wide range of Natural Language Processing tasks. However, it is unclear to what extent such models capture relational knowledge beyond what is already captured by standard word embeddings. To explore this question, we propose a methodology for distilling…
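
The abstract is truncated at the source, so the paper's own distillation methodology is not reproduced here. As background for the question it raises, the sketch below illustrates the general cloze-style probing idea behind this line of work (cf. "Language Models as Knowledge Bases?"): a masked language model is queried with a hand-written template and its top predictions are read off as candidate relational facts. This is a minimal illustration, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint; the templates and the "capital-of" relation are invented for the example, and this is not the fine-tuning methodology proposed by the authors.

# Minimal cloze-style probe of a masked language model for relational knowledge.
# Illustrative only: hand-written templates, not the template extraction and
# fine-tuning methodology proposed in the paper.
# Assumes: pip install transformers torch
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Hypothetical templates for a "capital-of" relation.
templates = [
    "[MASK] is the capital of France.",
    "The capital of France is [MASK].",
]

for template in templates:
    print(template)
    # The pipeline returns the top-k fillers for the [MASK] slot with scores.
    for prediction in fill_mask(template, top_k=3):
        print(f"  {prediction['token_str']:>12}  score={prediction['score']:.3f}")

Running the sketch prints the highest-scoring fillers for each template; the paper's central question is whether such relational signal in BERT goes beyond what standard word embeddings already capture.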