Knowledge Graph Embedding Bi-Vector Models for Symmetric Relation

Jinkui Yao and Liang Xu
Knowledge graph embedding (KGE) models have been proposed to improve the performance of knowledge graph reasoning. However, most KGEs exhibit a general phenomenon: as training progresses, the embeddings of symmetric relations tend toward the zero vector whenever the ratio of symmetric triples in the dataset is high enough. This collapse causes subsequent tasks on symmetric relations, e.g. link prediction, to fail. The root cause is that KGEs do not utilize the semantic information of…
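The collapse the abstract describes can be seen directly in a translation model such as TransE. A minimal numpy sketch (variable names are illustrative, embeddings untrained) of why a symmetric relation forces its vector toward zero:

```python
import numpy as np

rng = np.random.default_rng(0)
h = rng.normal(size=4)   # head entity embedding
r = rng.normal(size=4)   # embedding of a symmetric relation
t = h + r                # TransE's ideal: h + r = t

# For a symmetric relation the reverse triple (t, r, h) must also score
# perfectly: t + r = h. Substituting t = h + r gives h + 2r = h, so the
# only embedding satisfying both directions is r = 0.
reverse_residual = np.linalg.norm(t + r - h)   # equals 2 * ||r||
assert np.isclose(reverse_residual, 2 * np.linalg.norm(r))
```

Minimizing the reverse residual therefore shrinks ||r|| itself, which is the drift toward the zero vector observed during training.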
1 Citation
A Survey on Knowledge Graph Embeddings for Link Prediction
A comprehensive survey of KG-embedding models for link prediction in knowledge graphs is provided, together with a theoretical analysis and comparison of existing methods proposed to date for generating KG embeddings.


Knowledge Graph Embedding via Dynamic Mapping Matrix
TransD, a more fine-grained model that improves on TransR/CTransR, considers the diversity of both relations and entities, which allows it to be applied to large-scale graphs.
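TransD's dynamic mapping matrix is M = r_p e_p^T + I, built from projection vectors rather than a full learned matrix. A minimal sketch, assuming entity and relation spaces share one dimension (function and variable names are illustrative):

```python
import numpy as np

def transd_project(e, e_p, r_p):
    """TransD projection: M = r_p @ e_p.T + I, so
    M @ e = r_p * (e_p . e) + e -- no explicit matrix is materialized,
    which is what keeps the model lightweight on large graphs."""
    return r_p * (e_p @ e) + e

def transd_score(h, h_p, t, t_p, r, r_p):
    # Translate in the projected space, as in TransE.
    return -np.linalg.norm(transd_project(h, h_p, r_p) + r
                           - transd_project(t, t_p, r_p))
```

The vector form on the right agrees with the explicit matrix product while storing only O(d) parameters per entity and relation.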
Knowledge Graph Completion with Adaptive Sparse Transfer Matrix
Experimental results show that TranSparse outperforms Trans(E, H, R, and D) significantly, and achieves state-of-the-art performance on triplet classification and link prediction tasks.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
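In its simplest (DistMult) form, the bilinear objective scores a triple as h^T diag(r) t. A short sketch showing that this score is symmetric in head and tail, which ties into the symmetric-relation behaviour the abstract raises:

```python
import numpy as np

def distmult_score(h, r, t):
    """Diagonal bilinear score h^T diag(r) t (the DistMult special case).
    Symmetric in h and t, so every relation is modeled as symmetric."""
    return float(np.sum(h * r * t))
```

Because multiplication commutes elementwise, `distmult_score(h, r, t) == distmult_score(t, r, h)` for any embeddings.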
Translating Embeddings for Modeling Multi-relational Data
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities. The approach proves powerful: extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
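The translation idea reduces to a single distance computation. A minimal sketch with toy, untrained embeddings:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility: negative distance between h + r and t
    (higher is more plausible)."""
    return -np.linalg.norm(h + r - t)

# Illustrative values only.
h = np.array([0.1, 0.2])
r = np.array([0.3, -0.1])
assert transe_score(h, r, h + r) == 0.0            # a perfect translation
assert transe_score(h, r, np.array([9.0, 9.0])) < 0  # a poor tail candidate
```

Link prediction then amounts to ranking all candidate tails t by this score for a given (h, r).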
Holographic Embeddings of Knowledge Graphs
Holographic embeddings are proposed to learn compositional vector space representations of entire knowledge graphs, and are shown to outperform state-of-the-art methods for link prediction on knowledge graphs and relational learning benchmark datasets.
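HolE composes head and tail via circular correlation, which can be computed in O(d log d) with the FFT. A sketch of the operator and score (function names are illustrative):

```python
import numpy as np

def circular_correlation(a, b):
    """HolE's compositional operator, via FFT:
    (a ⋆ b)_k = sum_i a_i * b_{(i + k) mod d}."""
    return np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)).real

def hole_score(h, r, t):
    # Plausibility is the match between r and the correlation of h and t.
    return float(r @ circular_correlation(h, t))
```

The FFT route matters in practice: a naive double loop over dimensions would cost O(d^2) per triple.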
Learning Entity and Relation Embeddings for Knowledge Graph Completion
TransR is proposed to build entity and relation embeddings in separate entity and relation spaces and to perform translations between projected entities; the models are evaluated on three tasks: link prediction, triple classification, and relational fact extraction.
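A minimal sketch of TransR's scoring, where a relation-specific matrix M_r maps entities from entity space into the relation space before the TransE-style translation (names are illustrative):

```python
import numpy as np

def transr_score(h, t, r, M_r):
    """TransR: project entities into the relation space with M_r,
    then translate by r as in TransE."""
    return -np.linalg.norm(M_r @ h + r - M_r @ t)
```

With M_r set to the identity this reduces exactly to TransE, which makes the extra modeling capacity of the projection explicit.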
Knowledge Graph Embedding by Translating on Hyperplanes
This paper proposes TransH, which models a relation as a hyperplane together with a translation operation on it, and can well preserve the mapping properties of relations with almost the same model complexity as TransE.
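TransH first drops each entity's component along the relation's hyperplane normal, then translates within the hyperplane. A minimal sketch (names are illustrative):

```python
import numpy as np

def project_onto_hyperplane(e, w):
    """Remove e's component along the unit normal w of the relation
    hyperplane, leaving only the in-plane part."""
    w = w / np.linalg.norm(w)
    return e - (w @ e) * w

def transh_score(h, t, d_r, w_r):
    # Translate by d_r within the hyperplane with normal w_r.
    return -np.linalg.norm(project_onto_hyperplane(h, w_r) + d_r
                           - project_onto_hyperplane(t, w_r))
```

Because different entities can project to the same in-plane point, one-to-many and many-to-one relations no longer force distinct entities to share an embedding, which is the limitation of TransE this fixes.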
Complex Embeddings for Simple Link Prediction
This work makes use of complex-valued embeddings to solve the link prediction problem through latent factorization, and uses the Hermitian dot product, the complex counterpart of the standard dot product between real vectors.
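The Hermitian dot product is what lets ComplEx cover both symmetric and antisymmetric relations with a single relation vector, unlike the purely symmetric real bilinear score. A minimal sketch:

```python
import numpy as np

def complex_score(h, r, t):
    """ComplEx score Re(<r, h, conj(t)>) via the Hermitian dot product.
    A purely real r yields a symmetric relation; a purely imaginary r,
    an antisymmetric one."""
    return float(np.real(np.sum(r * h * np.conj(t))))
```

Swapping h and t conjugates the product h * conj(t), so the real part of r picks up the symmetric component and the imaginary part the antisymmetric one.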
Reasoning With Neural Tensor Networks for Knowledge Base Completion
An expressive neural tensor network suitable for reasoning over relationships between two entities given a subset of the knowledge base is introduced, and it is shown that performance improves when entities are represented as averages of their constituent word vectors.
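The NTN score combines k bilinear forms with a standard linear layer and a nonlinearity. A minimal sketch, assuming entity dimension d and tensor depth k (shapes noted in the docstring are assumptions of this sketch):

```python
import numpy as np

def ntn_score(h, t, W, V, b, u):
    """Neural Tensor Network score:
    u^T tanh(h^T W[:, :, k] t + V @ [h; t] + b),
    with W of shape (d, d, k), V of shape (k, 2d), b and u of shape (k,)."""
    bilinear = np.einsum('i,ijk,j->k', h, W, t)   # one h^T W_k t per slice
    return float(u @ np.tanh(bilinear + V @ np.concatenate([h, t]) + b))
```

The tensor slices let each relation capture multiple multiplicative interactions between head and tail, at the cost of O(d^2 k) parameters per relation.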
A Three-Way Model for Collective Learning on Multi-Relational Data
This work presents a novel approach to relational learning based on the factorization of a three-way tensor; the model performs collective learning via its latent components, and an efficient algorithm is provided to compute the factorization.
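The factorization in question is RESCAL, whose per-triple score is a full bilinear form. A minimal sketch:

```python
import numpy as np

def rescal_score(h, W_r, t):
    """RESCAL bilinear score h^T W_r t, with one full matrix W_r per
    relation. Collective learning arises because every relation shares
    the same entity factors while only W_r varies."""
    return float(h @ W_r @ t)
```

Note the identity h^T W_r t = t^T W_r^T h: a symmetric W_r scores both directions of a triple equally, while an asymmetric W_r can distinguish them.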