Knowledge Graph Refinement based on Triplet BERT-Networks

@article{Nassiri2022KnowledgeGR,
  title={Knowledge Graph Refinement based on Triplet BERT-Networks},
  author={Armita Khajeh Nassiri and Nathalie Pernelle and Fatiha Sa{\"i}s and Gianluca Quercini},
  journal={ArXiv},
  year={2022},
  volume={abs/2211.10460}
}
Knowledge graph embedding techniques are widely used for knowledge graph refinement tasks such as graph completion and triple classification. These techniques aim to embed the entities and relations of a Knowledge Graph (KG) in a low-dimensional continuous feature space. This paper adopts a transformer-based triplet network, creating an embedding space that clusters the information about an entity or relation in the KG. It creates textual sequences from facts and fine-tunes a triplet network…
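The triplet objective mentioned in the abstract can be illustrated with a minimal margin loss over sequence embeddings: an anchor is pulled toward a positive example and pushed away from a negative one. This is a generic sketch of the standard triplet margin loss, not the paper's exact formulation; all names and vectors below are illustrative.

```python
def euclidean(u, v):
    """Euclidean distance between two embedding vectors (plain lists)."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    """Standard triplet margin loss: max(0, d(a, p) - d(a, n) + margin).
    Zero when the negative is already farther than the positive by at
    least `margin`; positive otherwise, driving the embeddings apart."""
    return max(0.0, euclidean(anchor, positive)
                    - euclidean(anchor, negative) + margin)

# A well-separated triplet incurs no loss; a crowded one does.
good = triplet_margin_loss([0.0, 0.0], [0.1, 0.0], [2.0, 0.0])  # 0.0
bad = triplet_margin_loss([0.0, 0.0], [0.1, 0.0], [0.2, 0.0])   # ~0.9
```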


References

Showing 1-10 of 19 references

KG-BERT: BERT for Knowledge Graph Completion

This work treats triples in knowledge graphs as textual sequences and proposes a novel framework named Knowledge Graph Bidirectional Encoder Representations from Transformer (KG-BERT) to model these triples.
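The "triples as textual sequences" idea can be sketched as a simple linearization of a triple into a BERT-style input. The exact token layout KG-BERT uses (e.g. whether entity descriptions replace names) may differ; this assumes standard [CLS]/[SEP] packing for illustration.

```python
def triple_to_sequence(head, relation, tail):
    """Linearize a KG triple (h, r, t) into a single text sequence,
    using BERT-style special tokens as segment boundaries.
    An assumed layout, shown for illustration only."""
    return f"[CLS] {head} [SEP] {relation} [SEP] {tail} [SEP]"

# e.g. the triple (Paris, capitalOf, France):
seq = triple_to_sequence("Paris", "capitalOf", "France")
```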

Representation Learning of Knowledge Graphs with Entity Descriptions

Experimental results on real-world datasets show that the proposed representation learning (RL) method for knowledge graphs outperforms other baselines on the two tasks, especially under the zero-shot setting, indicating that the method is capable of building representations for novel entities according to their descriptions.

DOLORES: Deep Contextualized Knowledge Graph Embeddings

This work introduces a new method DOLORES for learning knowledge graph embeddings that effectively captures contextual cues and dependencies among entities and relations and shows that these representations can very easily be incorporated into existing models to significantly advance the state of the art on several knowledge graph prediction tasks.

Convolutional 2D Knowledge Graph Embeddings

ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
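Mean Reciprocal Rank, the metric cited in the snippet above, is straightforward to compute from the rank each correct entity receives among all candidates; a minimal sketch:

```python
def mean_reciprocal_rank(ranks):
    """MRR: the average of 1/rank of the correct entity over all
    test triples. Higher is better; 1.0 means the correct entity
    is ranked first every time."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# Correct entities ranked 1st, 2nd, and 4th across three queries:
mrr = mean_reciprocal_rank([1, 2, 4])  # (1 + 0.5 + 0.25) / 3
```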

ProjE: Embedding Projection for Knowledge Graph Completion

This work presents a shared-variable neural network model called ProjE that fills in missing information in a knowledge graph by learning joint embeddings of the knowledge graph's entities and edges, together with subtle but important changes to the standard loss function.

Representation Learning of Knowledge Graphs with Hierarchical Types

Experimental results show that the proposed Type-embodied Knowledge Representation Learning models significantly outperform all baselines on both tasks, especially on long-tail distributions, indicating that the models are capable of capturing hierarchical type information, which is significant when constructing representations of knowledge graphs.

Injecting Background Knowledge into Embedding Models for Predictive Tasks on Knowledge Graphs

Methods for injecting available background knowledge (schema axioms) to further improve the quality of the embeddings are proposed and implemented in new releases of the authors' systems.

RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

Experimental results show that the proposed RotatE model is not only scalable but also able to infer and model various relation patterns, significantly outperforming existing state-of-the-art models for link prediction.
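RotatE's "relational rotation in complex space" treats each relation as an element-wise rotation e^{iθ} applied to the head embedding, scoring a triple by the distance between the rotated head and the tail. A minimal sketch, assuming unit-modulus rotations parameterized by phase angles:

```python
import math

def rotate_distance(head, phases, tail):
    """RotatE-style distance ||h ∘ r - t||, where each relation
    component r_i = e^{i·θ_i} has unit modulus. `head` and `tail`
    are lists of complex numbers; `phases` are the rotation angles.
    Lower distance means a more plausible triple."""
    rotated = [h * complex(math.cos(p), math.sin(p))
               for h, p in zip(head, phases)]
    return sum(abs(rh - t) for rh, t in zip(rotated, tail))

# Rotating 1+0j by π/2 lands (almost exactly) on 0+1j:
d = rotate_distance([1 + 0j], [math.pi / 2], [0 + 1j])  # ≈ 0
```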

Modeling Relation Paths for Representation Learning of Knowledge Bases

This model considers relation paths as translations between entities for representation learning and addresses two key challenges: (1) since not all relation paths are reliable, it designs a path-constraint resource allocation algorithm to measure the reliability of relation paths, and (2) it represents relation paths via semantic composition of relation embeddings.
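One simple way to realize "semantic composition of relation embeddings" along a path is element-wise addition, which is one of the composition operations used in translation-based path models; this sketch shows only that additive variant, with illustrative names.

```python
def compose_path_add(relation_embeddings):
    """Additive composition: the embedding of a multi-hop relation
    path is the element-wise sum of the relation embeddings along
    it (so a path r1 -> r2 composes as r1 + r2, translation-style)."""
    dim = len(relation_embeddings[0])
    return [sum(r[i] for r in relation_embeddings) for i in range(dim)]

# A two-hop path composed from two relation vectors:
path = compose_path_add([[1.0, 2.0], [3.0, 4.0]])  # [4.0, 6.0]
```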

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
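The bilinear objective and the "composition of relations as matrix multiplication" observation above can be sketched with plain lists as matrices; names here are illustrative, and a diagonal relation matrix recovers the DistMult-style special case.

```python
def bilinear_score(h, R, t):
    """Bilinear triple score h^T R t. When R is diagonal this
    reduces to the DistMult-style scoring function."""
    return sum(h[i] * R[i][j] * t[j]
               for i in range(len(h)) for j in range(len(t)))

def compose_relations(A, B):
    """Compose two relations represented as matrices: under the
    bilinear model, relation composition behaves like matrix
    multiplication A @ B."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Diagonal relation matrix scoring, and a trivial composition:
s = bilinear_score([1, 0], [[2, 0], [0, 3]], [1, 0])  # 2
C = compose_relations([[1, 0], [0, 1]], [[5, 6], [7, 8]])
```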