Convolutional Complex Knowledge Graph Embeddings

@inproceedings{Demir2020ConvolutionalCK,
  title={Convolutional Complex Knowledge Graph Embeddings},
  author={Caglar Demir and Axel-Cyrille Ngonga Ngomo},
  booktitle={Extended Semantic Web Conference},
  year={2020}
}
In this paper, we study the problem of learning continuous vector representations of knowledge graphs for predicting missing links. We present a new approach called ConEx, which infers missing links by leveraging the composition of a 2D convolution with a Hermitian inner product of complex-valued embedding vectors. We evaluate ConEx against state-of-the-art approaches on the WN18RR, FB15K-237, KINSHIP and UMLS benchmark datasets. Our experimental results show that ConEx achieves a performance… 
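
To make the scoring idea above concrete, here is a minimal PyTorch-style sketch of a ConEx-like scorer; the embedding dimension, filter count, activation choices, and the way the convolution output modulates the Hermitian product are illustrative assumptions, not the exact architecture from the paper.

import torch
import torch.nn as nn

class ConExSketch(nn.Module):
    """Sketch: a 2D convolution over head/relation embeddings composed with a
    ComplEx-style Hermitian inner product (hyperparameters are hypothetical)."""
    def __init__(self, num_entities, num_relations, dim=32, channels=8, kernel=3):
        super().__init__()
        # complex embeddings stored as separate real/imaginary parts
        self.ent_re = nn.Embedding(num_entities, dim)
        self.ent_im = nn.Embedding(num_entities, dim)
        self.rel_re = nn.Embedding(num_relations, dim)
        self.rel_im = nn.Embedding(num_relations, dim)
        self.conv = nn.Conv2d(1, channels, kernel, padding=kernel // 2)
        self.fc = nn.Linear(channels * 4 * dim, 2 * dim)

    def forward(self, h, r, t):
        h_re, h_im = self.ent_re(h), self.ent_im(h)
        r_re, r_im = self.rel_re(r), self.rel_im(r)
        t_re, t_im = self.ent_re(t), self.ent_im(t)
        # 2D convolution over the stacked real/imaginary parts of head and relation
        x = torch.stack([h_re, h_im, r_re, r_im], dim=1).unsqueeze(1)  # (B, 1, 4, dim)
        g = torch.relu(self.fc(torch.relu(self.conv(x)).flatten(1)))
        g_re, g_im = g.chunk(2, dim=1)
        # ComplEx-style Hermitian product Re(<h o r, conj(t)>), elementwise modulated
        # by the convolution output (a simplification of the paper's exact form)
        re_hr = h_re * r_re - h_im * r_im
        im_hr = h_re * r_im + h_im * r_re
        return (g_re * re_hr * t_re + g_im * im_hr * t_im).sum(dim=1)

# usage: scores = ConExSketch(num_entities, num_relations)(heads, rels, tails)
# for LongTensor index batches of shape (B,)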

Trustworthy Knowledge Graph Completion Based on Multi-sourced Noisy Data

A new trustworthy knowledge graph completion method that exploits facts extracted from multi-sourced noisy data together with existing facts in the KG, and introduces a graph neural network with a holistic scoring function to judge the plausibility of facts with various value types.

HybridFC: A Hybrid Fact-Checking Approach for Knowledge Graphs

This work proposes a hybrid approach—dubbed HybridFC—that exploits the diversity of existing categories of fact-checking approaches within an ensemble learning setting to achieve a significantly better prediction performance.

Kronecker Decomposition for Knowledge Graph Embeddings

A technique based on Kronecker decomposition is proposed to reduce the number of parameters in a knowledge graph embedding model, while retaining its expressiveness, and empirical evidence suggests that reconstructed embeddings entail robustness against noise in the input knowledge graph.
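
As a rough illustration of the parameter saving (not the paper's exact formulation), a d-dimensional embedding can be reconstructed as the Kronecker product of two smaller factors, so only d1 + d2 numbers are stored per embedding instead of d = d1 * d2; the factor sizes below are hypothetical.

import numpy as np

d1, d2 = 8, 16              # hypothetical factor sizes, d = d1 * d2 = 128
a = np.random.randn(d1)     # first small stored factor
b = np.random.randn(d2)     # second small stored factor
e = np.kron(a, b)           # reconstructed 128-dimensional embedding
print(e.shape[0], d1 + d2)  # 128 dimensions reconstructed from 24 stored parameters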

GATES: Using Graph Attention Networks for Entity Summarization

GATES, a new entity summarization approach, is proposed: it combines topological information and knowledge graph embeddings to encode triples, encodes them by means of a Graph Attention Network, and applies ensemble learning to boost the performance of triple scoring.

Out-of-Vocabulary Entities in Link Prediction

This work identifies a further common limitation of three benchmarks commonly used for evaluating link prediction approaches, namely out-of-vocabulary entities in the test and validation sets, and provides an implementation of an approach for spotting and removing such entities as well as corrected versions of the datasets WN18RR, FB15K-237, and YAGO3-10.
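
The core filtering step is simple; the sketch below (with a hypothetical tab-separated train/test file layout) drops evaluation triples whose head or tail entity never appears in the training split, which is the kind of cleaning the corrected datasets provide.

def load_triples(path):
    # one triple per line: head \t relation \t tail
    with open(path) as f:
        return [tuple(line.strip().split("\t")) for line in f if line.strip()]

train = load_triples("train.txt")
seen = {h for h, _, t in train} | {t for h, _, t in train}

def drop_oov(triples):
    # keep only triples whose entities occur in the training split
    return [(h, r, t) for h, r, t in triples if h in seen and t in seen]

test_clean = drop_oov(load_triples("test.txt"))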

QuatCNNEx: A Knowledge Graph Embedding Model based on Quaternion Convolutional Neural Network

The combination of quaternions and convolutional neural networks is explored to construct QuatCNNEx, a new knowledge graph embedding model that models composition relations and predicts missing triples in the knowledge graph using the interaction features of entities and relations.

Reasoning over Different Types of Knowledge Graphs: Static, Temporal and Multi-Modal

A survey of knowledge graph reasoning (KGR) tracing the field from static to temporal and then to multi-modal KGs; the preliminaries, summaries of KGR models, and typical datasets are introduced and discussed in turn.

A Review of Knowledge Graph Completion

The task of knowledge graph completion, which aims to predict unknown triples based on known triples, is reviewed in detail, with approaches divided into conventional representation learning and graph-neural-network-based representation learning.

I Know What You Do Not Know: Knowledge Graph Embedding via Co-distillation Learning

CoLE, a co-distillation learning method for KG embedding, is proposed; it exploits the complementarity of graph structures and text information and advances the state of the art of KG embedding.

References

Showing 1-10 of 66 references

Convolutional 2D Knowledge Graph Embeddings

ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
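
For reference, the ConvE scoring function is usually written as

    \psi_r(e_s, e_o) = f\big(\mathrm{vec}\big(f([\overline{e_s}; \overline{r_r}] \ast \omega)\big) W\big)\, e_o

where \overline{e_s} and \overline{r_r} denote 2D reshapings of the subject and relation embeddings, \omega the convolution filters, and f a non-linearity; this is the commonly cited form, with details such as dropout and batch normalization omitted here.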

You CAN Teach an Old Dog New Tricks! On Training Knowledge Graph Embeddings

It is found that, when trained appropriately, the relative performance differences between various model architectures often shrink and sometimes even reverse when compared to prior results, and that many of the more advanced architectures and techniques proposed in the literature should be revisited to reassess their individual benefits.

A Survey on Knowledge Graphs: Representation, Acquisition, and Applications

A comprehensive review of knowledge graphs covering: 1) knowledge graph representation learning; 2) knowledge acquisition and completion; 3) temporal knowledge graphs; and 4) knowledge-aware applications, summarizing recent breakthroughs and prospective research directions to facilitate future research.

On Understanding Knowledge Graph Representation

This work builds on a recent theoretical interpretation of word embeddings as a basis for considering an explicit structure for representations of relations between entities, which is used to predict properties of, and justify the relative performance of, leading knowledge graph representation methods.

RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

Experimental results show that the proposed RotatE model is not only scalable but also able to infer and model various relation patterns, and that it significantly outperforms existing state-of-the-art models for link prediction.
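
The relation patterns are captured by treating each relation as an elementwise rotation in the complex plane, with the commonly cited distance-based score

    d_r(\mathbf{h}, \mathbf{t}) = \lVert \mathbf{h} \circ \mathbf{r} - \mathbf{t} \rVert, \qquad \mathbf{h}, \mathbf{r}, \mathbf{t} \in \mathbb{C}^k, \; |r_i| = 1,

where \circ is the elementwise (Hadamard) product; the unit-modulus constraint on r is what makes each relation a rotation.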

TuckER: Tensor Factorization for Knowledge Graph Completion

This work proposes TuckER, a relatively straightforward but powerful linear model based on Tucker decomposition of the binary tensor representation of knowledge graph triples that outperforms previous state-of-the-art models across standard link prediction datasets, acting as a strong baseline for more elaborate models.
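
TuckER's linear scoring function is commonly written as

    \phi(e_s, r, e_o) = \mathcal{W} \times_1 \mathbf{e}_s \times_2 \mathbf{w}_r \times_3 \mathbf{e}_o,

where \mathcal{W} is a shared core tensor and \times_n denotes the tensor product along mode n; entity and relation embeddings play the role of the factor vectors of the Tucker decomposition.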

Go for a Walk and Arrive at the Answer: Reasoning Over Paths in Knowledge Bases using Reinforcement Learning

A new algorithm, MINERVA, is proposed that addresses the much more difficult and practical task of answering questions where the relation is known but only one entity is given, and it significantly outperforms prior methods.

Modeling Relational Data with Graph Convolutional Networks

It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
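
The R-GCN encoder accumulates neighborhood evidence with a relation-specific propagation rule, commonly written as

    h_i^{(l+1)} = \sigma\Big( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}_i^{r}} \tfrac{1}{c_{i,r}} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \Big),

where \mathcal{N}_i^{r} are the neighbors of node i under relation r and c_{i,r} is a normalization constant; a DistMult decoder then scores triples on the resulting node representations.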

Very Deep Convolutional Networks for Large-Scale Image Recognition

This work investigates the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting using an architecture with very small convolution filters, which shows that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.

MDE: Multi Distance Embeddings for Link Prediction in Knowledge Graphs

This work proposes the Multiple Distance Embedding model (MDE), a neural network model that can map non-linear relations between the embedding vectors and the expected output of the score function, and demonstrates that MDE performs competitively with state-of-the-art embedding models on several benchmark datasets.
...