• Corpus ID: 202763266

Quaternion Knowledge Graph Embeddings

@inproceedings{Zhang2019QuaternionKG,
  title={Quaternion Knowledge Graph Embeddings},
  author={Shuai Zhang and Yi Tay and Lina Yao and Qi Liu},
  booktitle={Neural Information Processing Systems},
  year={2019}
}
In this work, we move beyond the traditional complex-valued representations, introducing more expressive hypercomplex representations to model entities and relations for knowledge graph embeddings. More specifically, quaternion embeddings, hypercomplex-valued embeddings with three imaginary components, are utilized to represent entities. Relations are modelled as rotations in the quaternion space. The advantages of the proposed approach are: (1) Latent inter-dependencies (between all components… 
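The core operation behind these quaternion rotations is the Hamilton product. As a minimal NumPy sketch of the scoring idea described in the abstract (our own variable names and array layout, not the authors' code), the relation quaternion is normalized to unit norm so the Hamilton product acts as a rotation of the head embedding, and the rotated head is compared with the tail via the quaternion inner product:

    import numpy as np

    def hamilton_product(a, b):
        # a, b: arrays of shape (4, k) holding the real part and the three
        # imaginary parts (i, j, k) of k quaternion coordinates.
        a0, a1, a2, a3 = a
        b0, b1, b2, b3 = b
        return np.stack([
            a0*b0 - a1*b1 - a2*b2 - a3*b3,
            a0*b1 + a1*b0 + a2*b3 - a3*b2,
            a0*b2 - a1*b3 + a2*b0 + a3*b1,
            a0*b3 + a1*b2 - a2*b1 + a3*b0,
        ])

    def quate_score(h, r, t):
        # Normalize each relation coordinate to a unit quaternion so the
        # Hamilton product is a pure rotation, rotate the head, then take the
        # inner product with the tail over all four components.
        r_unit = r / np.linalg.norm(r, axis=0, keepdims=True)
        return np.sum(hamilton_product(h, r_unit) * t)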

Citations

Knowledge Graph Embeddings in Geometric Algebras

A novel geometric algebra-based KG embedding framework, GeomE, is introduced, which utilizes multivector representations and the geometric product to model entities and relations and outperforms existing state-of-the-art models for link prediction.

Learning Hierarchy-Aware Quaternion Knowledge Graph Embeddings with Representing Relations as 3D Rotations

This work proposes a new model called HRQE, which represents entities as pure quaternions and is the first model that can simultaneously encode multiple relation patterns (symmetry/antisymmetry, inversion, and composition) and learn semantic hierarchies.

Dual Quaternion Embeddings for Link Prediction

This work proposes a novel approach called DualQuatE that maps entities and relations into a dual quaternion space and utilizes interactions of different translations and rotations to distinguish various relations between head and tail entities.

RotateCT: Knowledge Graph Embedding by Rotation and Coordinate Transformation in Complex Space

A new knowledge graph embedding method called RotateCT is proposed, which first transforms the coordinates of each entity and then represents each relation as a rotation from head entity to tail entity in complex space; it can infer non-commutative composition patterns and improves computational efficiency.

Quaternion-Based Knowledge Graph Network for Recommendation

This paper proposes Quaternion-based Knowledge Graph Network (QKGN) for recommendation, which represents users and items with quaternion embeddings in hypercomplex space, so that the latent inter-dependencies between entities and relations could be captured effectively.

Translation-Based Embeddings with Octonion for Knowledge Graph Completion

A Poincaré-extended TransO model (PTransO) is proposed, which transforms octonion coordinate vectors into hyperbolic embeddings by exponential mapping and integrates Euclidean-based calculations into hyperbolic space through operations such as Möbius addition and hyperbolic distance.

QuatDE: Dynamic Quaternion Embedding for Knowledge Graph Completion

This paper proposes a novel model, QuatDE, with a dynamic mapping strategy that explicitly captures the variety of relational patterns and separates the different semantic information of an entity, using transition vectors to adjust the positions of entity embedding vectors in quaternion space via the Hamilton product and thereby enhancing the feature interaction between the elements of a triplet.

QuatRE: Relation-Aware Quaternions for Knowledge Graph Embeddings

This model enhances the correlations between head and tail entities given a relation in quaternion space with the Hamilton product, by associating each relation with two relation-aware rotations that rotate the quaternion embeddings of the head and tail entities, respectively.

A Quaternion-Embedded Capsule Network Model for Knowledge Graph Completion

This paper presents a novel capsule network method for link prediction that takes advantage of quaternions, comprising a relational rotation model called QuaR and a deep capsule neural model called CapS-QuaR to encode the semantics of factual triples.

Rotate3D: Representing Relations as Rotations in Three-Dimensional Space for Knowledge Graph Embedding

A new model called Rotate3D is proposed, which maps entities to the three-dimensional space and defines relations as rotations from head entities to tail entities and can naturally preserve the order of the composition of relations by using the non-commutative composition property of rotations.
...

References

Showing 1-10 of 35 references

RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

Experimental results show that the proposed RotatE model is not only scalable, but also able to infer and model various relation patterns and significantly outperform existing state-of-the-art models for link prediction.
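For comparison with the quaternion approach above, RotatE's rotation can be sketched in a few lines; the margin value and the use of the complex modulus for the distance are illustrative choices, and the names are our own:

    import numpy as np

    def rotate_score(h, r_phase, t, gamma=12.0):
        # h, t: complex-valued entity embeddings; r_phase: real phase vector.
        # Each relation coordinate exp(i * phase) has unit modulus, so
        # elementwise multiplication rotates the head embedding; the score is
        # a margin minus the distance to the tail.
        r = np.exp(1j * r_phase)
        return gamma - np.sum(np.abs(h * r - t))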

Knowledge Graph Embedding by Translating on Hyperplanes

This paper proposes TransH, which models a relation as a hyperplane together with a translation operation on it, and can preserve mapping properties of relations such as one-to-many, many-to-one, and many-to-many with almost the same model complexity as TransE.
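A minimal sketch of the hyperplane projection and translation described here (variable names are ours): entities are projected onto the relation-specific hyperplane with unit normal w_r before the usual translation-style distance is computed.

    import numpy as np

    def transh_score(h, t, w_r, d_r):
        # w_r: unit normal of the relation-specific hyperplane;
        # d_r: translation vector associated with the relation.
        w_r = w_r / np.linalg.norm(w_r)
        h_perp = h - np.dot(w_r, h) * w_r   # project head onto the hyperplane
        t_perp = t - np.dot(w_r, t) * w_r   # project tail onto the hyperplane
        return -np.linalg.norm(h_perp + d_r - t_perp) ** 2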

Complex and Holographic Embeddings of Knowledge Graphs: A Comparison

This short paper provides a comparison of two state-of-the-art knowledge graph embeddings for which their equivalence has recently been established, i.e., ComplEx and HolE.

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
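The bilinear objective referred to here scores a triple as h^T W_r t; restricting W_r to a diagonal matrix (the variant commonly known as DistMult) reduces this to an elementwise product. A brief sketch with our own names:

    import numpy as np

    def bilinear_score(h, W_r, t):
        # General bilinear form: W_r is a full d x d relation matrix.
        return h @ W_r @ t

    def distmult_score(h, r, t):
        # Diagonal special case: W_r = diag(r), i.e. an elementwise product.
        return np.sum(h * r * t)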

Complex Embeddings for Simple Link Prediction

This work makes use of complex-valued embeddings to solve the link prediction problem through latent factorization, using the Hermitian dot product, the complex counterpart of the standard dot product between real vectors.
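The Hermitian dot product mentioned here amounts to conjugating the tail embedding inside a trilinear product and keeping the real part, which lets the model score asymmetric relations; a one-line NumPy sketch (names are ours):

    import numpy as np

    def complex_score(h, r, t):
        # h, r, t: complex-valued vectors; conjugating t makes the product
        # Hermitian, and the real part gives a real-valued score.
        return np.real(np.sum(h * r * np.conj(t)))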

SimplE Embedding for Link Prediction in Knowledge Graphs

It is proved that SimplE is fully expressive, a bound on the size of its embeddings for full expressivity is derived, and it is shown empirically that, despite its simplicity, SimplE outperforms several state-of-the-art tensor factorization techniques.
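As a rough sketch of SimplE's scoring based on its published description (parameter names are ours): each entity keeps a head-role and a tail-role vector, each relation has a forward and an inverse vector, and the score averages two CP-style terms.

    import numpy as np

    def simple_score(h_i, t_i, h_j, t_j, v_r, v_r_inv):
        # (h_i, t_i) and (h_j, t_j): head-role / tail-role vectors of the
        # subject and object entities; v_r, v_r_inv: the relation and its
        # inverse. The two CP terms are averaged.
        return 0.5 * (np.sum(h_i * v_r * t_j) + np.sum(h_j * v_r_inv * t_i))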

Knowledge Graph Embedding with Entity Neighbors and Deep Memory Network

A new kind of additional information, called entity neighbors, is proposed, which contains both semantic and topological features of a given entity, and a deep memory network model is developed to encode information from these neighbors.

Translating Embeddings for Modeling Multi-relational Data

TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities, which proves to be powerful since extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
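The translation interpretation reduces to a one-line score: a plausible triple should satisfy h + r ≈ t. A minimal sketch (the choice of norm is a hyper-parameter):

    import numpy as np

    def transe_score(h, r, t, norm=1):
        # Negated distance between the translated head and the tail.
        return -np.linalg.norm(h + r - t, ord=norm)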

Learning Entity and Relation Embeddings for Knowledge Graph Completion

TransR is proposed to build entity and relation embeddings in separate entity and relation spaces, first projecting entities from the entity space into the corresponding relation space and then building translations between the projected entities.
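A sketch of the projection-then-translation score described here, with M_r standing for the relation-specific projection matrix (names are illustrative):

    import numpy as np

    def transr_score(h, t, M_r, r, norm=2):
        # M_r maps entities from the d-dimensional entity space into the
        # k-dimensional space of relation r; translation happens there.
        h_r = M_r @ h
        t_r = M_r @ t
        return -np.linalg.norm(h_r + r - t_r, ord=norm)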

Convolutional 2D Knowledge Graph Embeddings

ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
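A compact PyTorch-style sketch of the ConvE pipeline described here; the embedding size, reshape dimensions, channel count and kernel size are illustrative stand-ins rather than the paper's exact hyper-parameters, and dropout and batch normalization are omitted for brevity.

    import torch
    import torch.nn as nn

    class ConvEScorer(nn.Module):
        def __init__(self, n_entities, n_relations, dim=200, h=10, w=20):
            super().__init__()
            assert h * w == dim
            self.h, self.w = h, w
            self.ent = nn.Embedding(n_entities, dim)
            self.rel = nn.Embedding(n_relations, dim)
            self.conv = nn.Conv2d(1, 32, kernel_size=3)
            self.fc = nn.Linear(32 * (2 * h - 2) * (w - 2), dim)

        def forward(self, head_idx, rel_idx):
            # Reshape the head and relation embeddings to 2D, stack them,
            # convolve, project back to the embedding dimension, and score
            # against every candidate tail entity.
            e = self.ent(head_idx).view(-1, 1, self.h, self.w)
            r = self.rel(rel_idx).view(-1, 1, self.h, self.w)
            x = torch.cat([e, r], dim=2)          # (batch, 1, 2h, w)
            x = torch.relu(self.conv(x))
            x = self.fc(x.flatten(start_dim=1))
            return x @ self.ent.weight.t()        # logits over all tail entities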