ComDensE : Combined Dense Embedding of Relation-aware and Common Features for Knowledge Graph Completion

@article{Kim2022ComDensEC,
  title={ComDensE : Combined Dense Embedding of Relation-aware and Common Features for Knowledge Graph Completion},
  author={Min-Sung Kim and Seungjun Baek},
  journal={ArXiv},
  year={2022},
  volume={abs/2206.14925}
}
Real-world knowledge graphs (KGs) are mostly incomplete. The problem of recovering missing relations, called KG completion, has recently become an active research area. Knowledge graph embedding, a low-dimensional representation of entities and relations, is the crucial technique for KG completion. Convolutional neural network models such as ConvE, SACN, InteractE, and RGCN have achieved recent success. This paper takes a different architectural view and proposes ComDensE which…
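To make the KG-completion setting concrete, here is a minimal sketch of how an embedding model answers a query (h, r, ?): every candidate tail entity is scored and the candidates are ranked. The simple multiplicative scorer below is only an illustrative stand-in, not ComDensE's actual architecture; all names and dimensions are toy values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ent, d = 5, 4
E = rng.normal(size=(n_ent, d))      # entity embedding table (toy values)
r = rng.normal(size=d)               # relation embedding

def rank_tails(E, h_idx, r):
    # Score every candidate tail for the query (h, r, ?) and rank them.
    # A simple multiplicative scorer stands in for a learned model here.
    scores = E @ (E[h_idx] * r)
    return np.argsort(-scores)       # best candidates first

ranking = rank_tails(E, 0, r)
```

KG completion then amounts to predicting the top-ranked entities as the missing tails.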


References

Showing 1–10 of 32 references

Modeling Relational Data with Graph Convolutional Networks

TLDR
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
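As a rough sketch of the encoder idea (normalization constants and basis decomposition omitted; all names and sizes are illustrative), an R-GCN layer aggregates relation-specific messages into each entity's representation:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
X = rng.normal(size=(3, d))          # embeddings for 3 entities (toy values)
edges = [(0, 0, 1), (1, 1, 2)]       # (subject, relation, object) triples
W = rng.normal(size=(2, d, d))       # one weight matrix per relation type
W0 = rng.normal(size=(d, d))         # self-loop weight

def rgcn_layer(X, edges, W, W0):
    # Accumulate relation-specific messages, then add the self connection.
    out = X @ W0.T
    for s, r, o in edges:
        out[o] += W[r] @ X[s]        # per-relation transform of the neighbour
    return np.maximum(out, 0)        # ReLU

H = rgcn_layer(X, edges, W, W0)
```

A factorization decoder such as DistMult then scores triples on the encoded representations H.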

Convolutional 2D Knowledge Graph Embeddings

TLDR
ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
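In rough outline (toy sizes, a single random filter; real ConvE uses many filters plus batch normalization and dropout), ConvE's scoring pipeline reshapes the stacked subject and relation embeddings into a 2D grid, convolves, and projects back to embedding space:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                       # embedding dimension (toy size)
e_s = rng.normal(size=d)    # subject entity embedding
r   = rng.normal(size=d)    # relation embedding

# Reshape the stacked embeddings into a 2D "image", as ConvE does.
img = np.concatenate([e_s, r]).reshape(4, 4)

# One 3x3 filter, valid convolution over the 4x4 grid.
w = rng.normal(size=(3, 3))
feat = np.array([[np.sum(img[i:i+3, j:j+3] * w) for j in range(2)]
                 for i in range(2)])

# Project the flattened feature map back to embedding space, then score
# against a candidate object embedding via a dot product.
W_fc = rng.normal(size=(feat.size, d))
e_o  = rng.normal(size=d)
score = float(np.maximum(feat.flatten(), 0) @ W_fc @ e_o)
```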

End-to-end Structure-Aware Convolutional Networks for Knowledge Base Completion

TLDR
This work proposes a novel end-to-end Structure-Aware Convolutional Network (SACN) that takes the benefit of GCN and ConvE together, and demonstrates the effectiveness of the proposed SACN on standard FB15k-237 and WN18RR datasets.

Knowledge Graph Embedding via Dynamic Mapping Matrix

TLDR
A more fine-grained model named TransD, an improvement over TransR/CTransR, is proposed; it considers the diversity of both relations and entities, which allows it to be applied to large-scale graphs.
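As a minimal sketch (toy dimensions, random values; variable names are illustrative), TransD's dynamic mapping matrix is built per entity–relation pair from two projection vectors, then used inside a TransE-style translation score:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
h, h_p = rng.normal(size=d), rng.normal(size=d)  # entity embedding + projection
r, r_p = rng.normal(size=d), rng.normal(size=d)  # relation embedding + projection
t, t_p = rng.normal(size=d), rng.normal(size=d)

def transd_project(e, e_p, r_p):
    # Dynamic mapping matrix M = r_p e_p^T + I, built per (entity, relation).
    M = np.outer(r_p, e_p) + np.eye(len(e))
    return M @ e

# Translation distance in the projected space (higher = more plausible).
score = -np.linalg.norm(transd_project(h, h_p, r_p) + r
                        - transd_project(t, t_p, r_p))
```

With zero projection vectors the mapping matrix reduces to the identity and the model degenerates to TransE.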

RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

TLDR
Experimental results show that the proposed RotatE model is not only scalable, but also able to infer and model various relation patterns, and significantly outperforms existing state-of-the-art models for link prediction.
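The rotation idea can be sketched in a few lines (toy dimensions, random values): each relation is an element-wise rotation in complex space, i.e. a vector of unit-modulus complex numbers, and a plausible triple satisfies t ≈ h ∘ r:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
h = rng.normal(size=d) + 1j * rng.normal(size=d)
t = rng.normal(size=d) + 1j * rng.normal(size=d)

# Relation = element-wise rotation: unit-modulus complex numbers e^{i*theta}.
theta = rng.uniform(0, 2 * np.pi, size=d)
r = np.exp(1j * theta)

def rotate_score(h, r, t):
    # RotatE distance: -||h o r - t||; higher means a more plausible triple.
    return -np.linalg.norm(h * r - t)

# A triple constructed to satisfy t = h o r scores perfectly (distance 0).
assert np.isclose(rotate_score(h, r, h * r), 0.0)
```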

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
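Both observations are easy to state concretely. The bilinear objective with a diagonal relation matrix is the DistMult score, and composing two diagonal relations is exactly matrix multiplication of their diagonal matrices (a minimal sketch with toy values):

```python
import numpy as np

def distmult_score(h, r, t):
    # Bilinear score with a diagonal relation matrix: h^T diag(r) t.
    return float(np.sum(h * r * t))

# Composition of two diagonal relations is again diagonal, and corresponds
# to matrix multiplication: diag(r1) @ diag(r2) == diag(r1 * r2).
r1, r2 = np.array([1.0, 2.0]), np.array([3.0, 4.0])
assert np.allclose(np.diag(r1) @ np.diag(r2), np.diag(r1 * r2))
```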

Knowledge Graph Embedding by Translating on Hyperplanes

TLDR
This paper proposes TransH, which models a relation as a hyperplane together with a translation operation on it; it preserves the mapping properties of relations (such as one-to-many and many-to-one) with almost the same model complexity as TransE.
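The hyperplane mechanism can be sketched directly (toy dimensions, random values): entities are projected onto the relation-specific hyperplane by removing their component along its unit normal, and the translation is applied in that projected space:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
h, t = rng.normal(size=d), rng.normal(size=d)
d_r = rng.normal(size=d)             # translation vector on the hyperplane
w = rng.normal(size=d)
w = w / np.linalg.norm(w)            # unit normal of the relation hyperplane

def project(v, w):
    # Drop the component of v along the unit normal w (hyperplane projection).
    return v - (v @ w) * w

def transh_score(h, d_r, t, w):
    return -np.linalg.norm(project(h, w) + d_r - project(t, w))

score = transh_score(h, d_r, t, w)
```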

Holographic Embeddings of Knowledge Graphs

TLDR
Holographic embeddings (HolE) are proposed to learn compositional vector space representations of entire knowledge graphs to outperform state-of-the-art methods for link prediction on knowledge graphs and relational learning benchmark datasets.
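HolE's compositional operator is circular correlation, which can be computed efficiently via the FFT. A minimal sketch (toy dimensions, random values):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
h, r, t = (rng.normal(size=d) for _ in range(3))

def circ_corr(a, b):
    # Circular correlation via FFT: (a * b)_k = sum_i a_i * b_{(k+i) mod d}.
    return np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)).real

def hole_score(h, r, t):
    # HolE matches the relation embedding against the circular correlation
    # of the head and tail embeddings.
    return float(r @ circ_corr(h, t))

score = hole_score(h, r, t)
```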

Translating Embeddings for Modeling Multi-relational Data

TLDR
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities; extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
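The translation principle fits in a few lines (toy vectors for illustration): a plausible triple satisfies h + r ≈ t, so the negated distance serves as the score:

```python
import numpy as np

def transe_score(h, r, t):
    # A plausible triple satisfies h + r = t, so its distance score is near 0.
    return -np.linalg.norm(h + r - t)

h = np.array([1.0, 0.0])
r = np.array([0.0, 1.0])
t = h + r                      # a triple that exactly satisfies the translation
```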

Reasoning With Neural Tensor Networks for Knowledge Base Completion

TLDR
An expressive neural tensor network suitable for reasoning over relationships between two entities, given a subset of the knowledge base, is introduced; performance improves further when entities are represented as the average of their constituent word vectors.
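The tensor scoring function can be sketched as follows (toy sizes, random weights; names are illustrative): each relation owns a bilinear tensor, a linear layer over the concatenated entity pair, and a bias, combined through a nonlinearity:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 4, 2                          # embedding dim, tensor slices (toy sizes)
h, t = rng.normal(size=d), rng.normal(size=d)
W = rng.normal(size=(k, d, d))       # relation-specific bilinear tensor
V = rng.normal(size=(k, 2 * d))      # linear layer over [h; t]
b = rng.normal(size=k)
u = rng.normal(size=k)               # output weights

# Bilinear tensor term: one value per slice, h^T W[i] t.
bilinear = np.einsum('i,kij,j->k', h, W, t)
score = float(u @ np.tanh(bilinear + V @ np.concatenate([h, t]) + b))
```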