Corpus ID: 202763266

Quaternion Knowledge Graph Embeddings

@inproceedings{Zhang2019QuaternionKG,
  title={Quaternion Knowledge Graph Embeddings},
  author={Shuai Zhang and Yi Tay and Lina Yao and Qi Liu},
  booktitle={Neural Information Processing Systems},
  year={2019}
}
In this work, we move beyond the traditional complex-valued representations, introducing more expressive hypercomplex representations to model entities and relations for knowledge graph embeddings. More specifically, quaternion embeddings, hypercomplex-valued embeddings with three imaginary components, are utilized to represent entities. Relations are modelled as rotations in the quaternion space. The advantages of the proposed approach are: (1) Latent inter-dependencies (between all components… 
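The scoring scheme the abstract describes can be illustrated with a minimal NumPy sketch (the function names and the `(4, k)` array layout are my own; the key ingredients from the paper are normalizing the relation to a unit quaternion, rotating the head via the Hamilton product, and scoring with an inner product against the tail):

```python
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of quaternion embeddings shaped (4, k):
    row 0 is the real part, rows 1-3 the i, j, k parts."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return np.stack([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,  # real
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,  # i
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,  # j
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,  # k
    ])

def quate_score(h, r, t):
    """QuatE-style score: rotate the head by the normalized relation
    quaternion, then take the inner product with the tail embedding."""
    r_norm = r / np.linalg.norm(r, axis=0, keepdims=True)  # unit quaternions
    return float(np.sum(hamilton_product(h, r_norm) * t))
```

With the identity relation (1, 0, 0, 0) the score reduces to the plain inner product of head and tail, which makes the extra expressiveness of non-trivial rotations easy to see.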

Citations

Dual Quaternion Knowledge Graph Embeddings

At the core of DualE lies a specific design of dual-quaternion-based multiplication, which universally models relations as compositions of a series of translation and rotation operations.

Knowledge Graph Embeddings in Geometric Algebras

A novel geometric algebra-based KG embedding framework, GeomE, is introduced, which utilizes multivector representations and the geometric product to model entities and relations and outperforms existing state-of-the-art models for link prediction.

Learning Hierarchy-Aware Quaternion Knowledge Graph Embeddings with Representing Relations as 3D Rotations

This work proposes a new model called HRQE, which represents entities as pure quaternions and is the first model that can simultaneously encode multiple relation patterns (symmetry/antisymmetry, inversion, and composition) and learn semantic hierarchies.

Dual Quaternion Embeddings for Link Prediction

This work proposes a novel approach called DualQuatE that maps entities and relations into a dual quaternion space and utilizes interactions of different translations and rotations to distinguish various relations between head and tail entities.

Quaternion-Based Knowledge Graph Network for Recommendation

This paper proposes Quaternion-based Knowledge Graph Network (QKGN) for recommendation, which represents users and items with quaternion embeddings in hypercomplex space, so that the latent inter-dependencies between entities and relations could be captured effectively.

Translation-Based Embeddings with Octonion for Knowledge Graph Completion

A Poincaré-extended TransO model (PTransO) is proposed, which transforms octonion coordinate vectors into hyperbolic embeddings by exponential mapping and integrates Euclidean-based calculations into hyperbolic space through operations such as Möbius addition and hyperbolic distance.

Dynamic dual quaternion knowledge graph embedding

This paper proposes a novel knowledge graph embedding model called DualDE, which dynamically maps the dual quaternions to the knowledge graph, and uses a dynamic mapping mechanism to construct the entity transition vector and the relation transition vector.

QuatDE: Dynamic Quaternion Embedding for Knowledge Graph Completion

This paper proposes a novel model, QuatDE, with a dynamic mapping strategy that explicitly captures the variety of relational patterns and separates the different semantic information of an entity. Transition vectors adjust the positions of entity embedding vectors in quaternion space via the Hamilton product, enhancing the feature interaction capability between the elements of a triplet.

QuatRE: Relation-Aware Quaternions for Knowledge Graph Embeddings

This model aims to enhance the correlations between head and tail entities given a relation within the quaternion space with the Hamilton product, by further associating each relation with two relation-aware rotations, which are used to rotate the quaternion embeddings of the head and tail entities, respectively.

A Quaternion-Embedded Capsule Network Model for Knowledge Graph Completion

This paper presents a novel capsule network method for link prediction that takes advantage of quaternions, including a relational rotation model called QuaR and a deep capsule neural model called CapS-QuaR to encode the semantics of factual triples.

References

RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

Experimental results show that the proposed RotatE model is not only scalable, but also able to infer and model various relation patterns and significantly outperform existing state-of-the-art models for link prediction.
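RotatE's relational rotation in complex space reduces to a few lines (a sketch with illustrative names; the paper constrains each relation element to unit modulus, so rotations are parameterized purely by phases, and a margin-based loss is used on top of the distance):

```python
import numpy as np

def rotate_distance(h, r_phase, t):
    """RotatE-style distance: rotate the complex head embedding by
    unit-modulus relation factors exp(i*phase) and measure how far
    the result lands from the tail (small for plausible triples)."""
    r = np.exp(1j * r_phase)  # |r_i| = 1 by construction
    return float(np.linalg.norm(h * r - t, ord=1))
```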

TorusE: Knowledge Graph Embedding on a Lie Group

A novel embedding model, TorusE, is proposed that outperforms other state-of-the-art approaches such as TransE, DistMult and ComplEx on a standard link prediction task and is scalable to large-size knowledge graphs and is faster than the original TransE.

Knowledge Graph Embedding by Translating on Hyperplanes

This paper proposes TransH, which models a relation as a hyperplane together with a translation operation on it, and can well preserve the mapping properties of relations (e.g., one-to-many and many-to-one) with almost the same model complexity as TransE.

Complex and Holographic Embeddings of Knowledge Graphs: A Comparison

This short paper provides a comparison of two state-of-the-art knowledge graph embeddings for which their equivalence has recently been established, i.e., ComplEx and HolE.

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.

Holographic Embeddings of Knowledge Graphs

Holographic embeddings (HolE) are proposed to learn compositional vector space representations of entire knowledge graphs, and are shown to outperform state-of-the-art methods for link prediction on knowledge graph and relational learning benchmark datasets.

SimplE Embedding for Link Prediction in Knowledge Graphs

It is proved that SimplE is fully expressive, a bound on the size of its embeddings for full expressivity is derived, and it is shown empirically that, despite its simplicity, SimplE outperforms several state-of-the-art tensor factorization techniques.

Knowledge Graph Embedding with Entity Neighbors and Deep Memory Network

A new kind of additional information, called entity neighbors, is proposed; entity neighbors contain both semantic and topological features about a given entity, and a deep memory network model is developed to encode information from these neighbors.

Translating Embeddings for Modeling Multi-relational Data

TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities; extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
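The translation principle is a one-line distance (a minimal sketch; the function name is my own, and the paper additionally uses a margin-based ranking loss and a choice of L1 or L2 norm):

```python
import numpy as np

def transe_distance(h, r, t):
    """TransE distance ||h + r - t||: near zero when the relation
    vector translates the head embedding onto the tail embedding."""
    return float(np.linalg.norm(h + r - t))
```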

Learning Entity and Relation Embeddings for Knowledge Graph Completion

TransR is proposed to build entity and relation embeddings in separate entity space and relation spaces by first projecting entities from entity space to corresponding relation space and then building translations between projected entities.