Fine-Grained Evaluation of Rule- and Embedding-Based Systems for Knowledge Graph Completion

@inproceedings{Meilicke2018FineGrainedEO,
  title={Fine-Grained Evaluation of Rule- and Embedding-Based Systems for Knowledge Graph Completion},
  author={Christian Meilicke and Manuel Fink and Yanjie Wang and Daniel Ruffinelli and Rainer Gemulla and Heiner Stuckenschmidt},
  booktitle={International Semantic Web Conference (ISWC)},
  year={2018}
}
In recent years, embedding methods have attracted increasing attention as a means for knowledge graph completion. Rule-based systems have likewise been studied for this task in the past. What has been missing so far is a common evaluation that includes more than one type of method. We close this gap by comparing representatives of both types of systems under a frequently used evaluation protocol. Leveraging the explanatory qualities of rule-based systems, we present a fine-grained evaluation that…
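The "frequently used evaluation protocol" mentioned in the abstract is, in most of the papers listed here, filtered entity ranking: for each test triple, the model scores all candidate entities for the missing slot, other known-true answers are filtered out, and the rank of the correct entity feeds into MRR and Hits@k. A minimal sketch of that protocol, with illustrative data and function names of my own choosing:

```python
# Minimal sketch of the filtered entity-ranking protocol commonly used for
# knowledge graph completion. All names and data here are illustrative.

def filtered_rank(scores, correct, known_true):
    """Rank of `correct` among scored candidates, ignoring other
    entities that are also known-true answers (the 'filtered' setting)."""
    target = scores[correct]
    rank = 1
    for entity, score in scores.items():
        if entity == correct or entity in known_true:
            continue  # filter out other valid answers
        if score > target:
            rank += 1
    return rank

def mrr_and_hits(queries, k=1):
    """Mean reciprocal rank and Hits@k over (scores, correct, known_true) queries."""
    ranks = [filtered_rank(s, c, kt) for s, c, kt in queries]
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(r <= k for r in ranks) / len(ranks)
    return mrr, hits
```

For example, a query whose correct answer is outscored only by another known-true entity still gets rank 1 under this filtered setting, which is exactly the behavior the evaluation papers below scrutinize.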
Realistic Re-evaluation of Knowledge Graph Completion Methods: An Experimental Study
TLDR
This paper is the first systematic study whose main objective is to assess the true effectiveness of embedding models once unrealistic triples are removed; the results show that their poor accuracy renders link prediction a task without a truly effective automated solution.
Knowledge Graph Embedding for Link Prediction: A Comparative Analysis
TLDR
A comprehensive comparison of embedding-based LP methods is provided, extending the dimensions of analysis beyond what is commonly available in the literature, and experimentally compare effectiveness and efficiency of 16 state-of-the-art methods.
TransRESCAL: A Dense Feature Model for Knowledge Graph Completion
TLDR
The proposed model clarifies the role of the positions of head and tail entity embeddings in the score functions of factorization-based models, and captures not only structural features via TransE but also semantic ones via RESCAL.
An Introduction to AnyBURL
TLDR
This paper proposes a bottom-up technique for efficiently learning logical rules from large knowledge graphs, inspired by classic bottom-up rule learning approaches such as Golem and Aleph, which performs as well as, and sometimes better than, most models that have been proposed recently.
On Evaluating Embedding Models for Knowledge Base Completion
TLDR
This paper argues that the entity ranking protocol currently used to evaluate knowledge graph embedding models is unsuitable because only a subset of the model's predictions are evaluated, and proposes an alternative entity-pair ranking protocol that considers all model predictions as a whole and is thus better suited to the task.
Why a Naive Way to Combine Symbolic and Latent Knowledge Base Completion Works Surprisingly Well
TLDR
A simple method to combine the outcomes of rule-based and latent approaches in a post-processing step is constructed and consistently improves the results for each model and dataset used in the authors' experiments.
Application of concepts of neighbours to knowledge graph completion
TLDR
This work proposes a new approach that needs no training phase and can provide interpretable explanations for each inference, relying on the computation of Concepts of Nearest Neighbours to identify clusters of similar entities based on common graph patterns.
Simplified Representation Learning Model Based on Parameter-Sharing for Knowledge Graph Completion
TLDR
This paper investigates how to enhance the simplicity of KGC models and achieve a reasonable balance between accuracy and complexity, and shows that the proposed framework improves the performance of current representation learning models on the KGC task.
Revisiting the Evaluation Protocol of Knowledge Graph Completion Methods for Link Prediction
TLDR
This paper contributes to the evaluation of link prediction by proposing a variation of the mean rank that considers the number of negative counterparts, defining the anomaly coefficient of a predicate, and integrating that coefficient into the protocol.
Efficient Relation-aware Scoring Function Search for Knowledge Graph Embedding
TLDR
This paper proposes to encode the search space as a supernet and introduces an efficient alternating minimization algorithm to search through the supernet in a one-shot manner; experiments demonstrate that the proposed method efficiently searches for relation-aware scoring functions and achieves better embedding performance than state-of-the-art methods.

References

(Showing 1–10 of 20 references)
ProjE: Embedding Projection for Knowledge Graph Completion
TLDR
This work presents a shared-variable neural network model called ProjE that fills in missing information in a knowledge graph by learning joint embeddings of the knowledge graph's entities and edges, through subtle but important changes to the standard loss function.
Efficient and Expressive Knowledge Base Completion Using Subgraph Feature Extraction
TLDR
It is shown that the random walk probabilities computed by PRA provide no discernible benefit to performance on this task, so they can safely be dropped, and this allows a simpler algorithm for generating feature matrices from graphs, which is called subgraph feature extraction (SFE).
Modeling Relational Data with Graph Convolutional Networks
TLDR
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
Knowledge Graph Embedding by Translating on Hyperplanes
TLDR
This paper proposes TransH, which models a relation as a hyperplane together with a translation operation on it and preserves the above mapping properties of relations with almost the same model complexity as TransE.
TransG: A Generative Model for Knowledge Graph Embedding
TLDR
This paper proposes a novel generative model (TransG) to address the issue of multiple relation semantics, i.e., that a relation may have multiple meanings revealed by the entity pairs associated with the corresponding triples.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
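The bilinear objective referred to here is commonly known as DistMult: a bilinear score with a diagonal relation matrix, which reduces to an elementwise triple product of the head, relation, and tail vectors. A minimal sketch (function name and vectors are illustrative), which also exposes DistMult's well-known limitation of scoring (h, r, t) and (t, r, h) identically:

```python
def distmult_score(h, r, t):
    """DistMult bilinear score with a diagonal relation matrix:
    sum_i h_i * r_i * t_i (symmetric in h and t)."""
    return sum(hi * ri * ti for hi, ri, ti in zip(h, r, t))
```

Because the score is a product of three aligned components, swapping head and tail leaves it unchanged, so DistMult cannot model asymmetric relations.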
Knowledge Base Completion: Baselines Strike Back
TLDR
It is shown that the accuracy of almost all models published on FB15k can be outperformed by an appropriately tuned baseline: the authors' reimplementation of the DistMult model.
Convolutional 2D Knowledge Graph Embeddings
TLDR
ConvE, a multi-layer convolutional network model for link prediction, is introduced and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across most datasets.
Translating Embeddings for Modeling Multi-relational Data
TLDR
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities, which proves to be powerful since extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
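The translation idea in TransE is that a valid triple (h, r, t) should satisfy h + r ≈ t in embedding space, so the score is the negative distance between h + r and t. A minimal sketch of that scoring function (function name and vectors are illustrative):

```python
def transe_score(h, r, t, norm=2):
    """TransE-style score: negative Lp distance between h + r and t.
    Higher (closer to zero) means the triple is more plausible."""
    diff = [hi + ri - ti for hi, ri, ti in zip(h, r, t)]
    return -sum(abs(d) ** norm for d in diff) ** (1.0 / norm)
```

A triple whose embeddings satisfy the translation exactly scores 0, the maximum; training then pushes observed triples toward 0 and corrupted triples away from it via a margin-based loss.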