Regularizing Knowledge Graph Embeddings via Equivalence and Inversion Axioms

@inproceedings{Minervini2017RegularizingKG,
  title={Regularizing Knowledge Graph Embeddings via Equivalence and Inversion Axioms},
  author={Pasquale Minervini and Luca Costabello and Emir Mu{\~n}oz and V{\'i}t Nov{\'a}cek and Pierre-Yves Vandenbussche},
  booktitle={ECML/PKDD},
  year={2017}
}
Learning embeddings of entities and relations using neural architectures is an effective method of performing statistical learning on large-scale relational data, such as knowledge graphs. In this paper, we consider the problem of regularizing the training of neural knowledge graph embeddings by leveraging external background knowledge. We propose a principled and scalable method for leveraging equivalence and inversion axioms during the learning process, by imposing a set of model-dependent… 
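
As a hedged illustration of the idea (a minimal sketch assuming a ComplEx-style scoring function; the paper derives model-dependent terms for several models, and the function names below are illustrative):

```python
import numpy as np

def complex_score(h, r, t):
    # ComplEx scoring function: Re(<h, r, conj(t)>).
    return np.real(np.sum(h * r * np.conj(t)))

def equivalence_penalty(r_p, r_q, lam=0.1):
    # Soft constraint for an equivalence axiom p == q:
    # the two relation embeddings should coincide.
    return lam * np.sum(np.abs(r_p - r_q) ** 2)

def inversion_penalty(r_p, r_q, lam=0.1):
    # Soft constraint for an inversion axiom p == q^{-1}: under ComplEx,
    # score(s, p, o) = score(o, conj(p), s), so exact inversion holds
    # when r_p = conj(r_q); we penalise the squared distance instead.
    return lam * np.sum(np.abs(r_p - np.conj(r_q)) ** 2)
```

Adding such penalties to the training loss softly encourages the learned relation embeddings to satisfy the axioms without hard-wiring them into the model.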

Embedding cardinality constraints in neural link predictors

A new regularisation approach is proposed to incorporate relation cardinality constraints into any existing neural link predictor without affecting its efficiency or scalability, structuring the embedding space to respect commonsense cardinality assumptions and resulting in better representations.

Fantastic Knowledge Graph Embeddings and How to Find the Right Space for Them

The concept of a solution space is developed as a factor that has a direct influence on the practical performance of knowledge graph embedding models as well as their capability to infer relational patterns.

LogicENN: A Neural Based Knowledge Graphs Embedding Model with Logical Rules

It is proved that LogicENN can learn every ground truth of the encoded rules in a knowledge graph, and a new neural embedding model (LogicENN) is presented that outperforms existing models in link prediction.

Injecting Background Knowledge into Embedding Models for Predictive Tasks on Knowledge Graphs

Methods for injecting available background knowledge (schema axioms) to further improve the quality of the embeddings are proposed and implemented in new releases of the authors' systems.

Data-Dependent Learning of Symmetric/Antisymmetric Relations for Knowledge Base Completion

This work proposes a new L1 regularizer for Complex Embeddings (ComplEx), one of the state-of-the-art embedding-based methods for KBC, and shows that the proposed method outperforms the original ComplEx and other baseline methods on the FB15k dataset.
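
A minimal sketch of the mechanism, assuming the regularizer acts on the imaginary parts of ComplEx relation embeddings (function and parameter names are illustrative):

```python
import numpy as np

def l1_imaginary_penalty(r, lam=0.01):
    # L1 penalty on the imaginary part of a ComplEx relation embedding.
    # If Im(r) = 0, the score Re(<h, r, conj(t)>) is symmetric in h and t,
    # so this sparsity-inducing term lets the data decide which relations
    # become symmetric and which keep a non-zero (antisymmetric) component.
    return lam * np.sum(np.abs(np.imag(r)))
```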

SimplE Embedding for Link Prediction in Knowledge Graphs

It is proved that SimplE is fully expressive, a bound on the size of its embeddings for full expressivity is derived, and it is shown empirically that, despite its simplicity, SimplE outperforms several state-of-the-art tensor factorization techniques.
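
For reference, SimplE's scoring function averages two CP-style trilinear products, tying each relation to its inverse; a small sketch under the usual notation (names are illustrative):

```python
import numpy as np

def simple_score(h_e1, t_e1, h_e2, t_e2, v_r, v_r_inv):
    # SimplE keeps two vectors per entity (head role h, tail role t) and
    # two per relation (forward v_r, inverse v_r_inv); the score averages
    # the two CP trilinear products, coupling the two entity roles.
    return 0.5 * (np.sum(h_e1 * v_r * t_e2) + np.sum(h_e2 * v_r_inv * t_e1))
```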

DOLORES: Deep Contextualized Knowledge Graph Embeddings

This work introduces a new method DOLORES for learning knowledge graph embeddings that effectively captures contextual cues and dependencies among entities and relations and shows that these representations can very easily be incorporated into existing models to significantly advance the state of the art on several knowledge graph prediction tasks.

Rule-Guided Compositional Representation Learning on Knowledge Graphs

This paper proposes a novel Rule and Path-based Joint Embedding (RPJE) scheme, which takes full advantage of the explainability and accuracy of logic rules, the generalization of KG embedding, and the supplementary semantic structure of paths.

Iteratively Learning Embeddings and Rules for Knowledge Graph Reasoning

A novel framework, IterE, that iteratively learns embeddings and rules is proposed, in which rules are learned from embeddings with a proper pruning strategy and embeddings are learned from existing triples and new triples inferred by rules.

Embedding Models for Knowledge Graphs Induced by Clusters of Relations and Background Knowledge

Embedding models have been successfully exploited for predictive tasks on Knowledge Graphs (KGs). We propose TRANSROWL-HRS, which aims at making KG embeddings more semantically aware by exploiting clusters of relations and the available background knowledge.

References

Injecting Logical Background Knowledge into Embeddings for Relation Extraction

This paper introduces a paradigm for learning low-dimensional embeddings of entity-pairs and relations that combine the advantages of matrix factorization with first-order logic domain knowledge, and shows that this method is able to learn accurate extractors with little or no distant supervision alignments, while at the same time generalizing to textual patterns that do not appear in the formulae.
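
One common differentiable relaxation of such a logical constraint, given here as a hedged sketch rather than the paper's exact formulation (`p_body` and `p_head` are assumed model probabilities):

```python
def implication_loss(p_body, p_head):
    # Relaxation of a first-order rule body => head: penalise the model
    # whenever it assigns more belief to the body than to the head.
    return max(0.0, p_body - p_head)
```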

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
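
A hedged sketch of the bilinear objective and of the matrix-multiplication view of composition (RESCAL-style notation; names are illustrative):

```python
import numpy as np

def bilinear_score(h, M_r, t):
    # Bilinear score: h^T M_r t, with one matrix M_r per relation.
    return h @ M_r @ t

# The paper's observation: composing relations r1 then r2 behaves like a
# single relation whose matrix is (approximately) the product M_r1 @ M_r2,
# since h^T (M_r1 @ M_r2) t chains the two bilinear maps.
```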

Type-Constrained Representation Learning in Knowledge Graphs

This work integrates prior knowledge in the form of type constraints into various state-of-the-art latent variable approaches and shows that prior knowledge on relation types significantly improves these models, by up to 77% in link-prediction tasks.
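
The mechanism can be sketched as filtering the candidate (and negatively sampled) entities by the relation's declared domain and range; a minimal sketch with illustrative names:

```python
def type_compatible_candidates(entities, entity_types, allowed_types):
    # Keep only entities whose type satisfies the relation's type constraint
    # (domain for the subject slot, range for the object slot).
    return [e for e in entities if entity_types[e] in allowed_types]
```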

Knowledge Base Completion Using Embeddings and Rules

This paper proposes a novel approach which incorporates rules seamlessly into embedding models for KB completion, and formulates inference as an integer linear programming (ILP) problem, with the objective function generated from embedding models and the constraints translated from rules.
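
A hedged, toy sketch of that formulation using SciPy's MILP solver (the scores and the rule constraint are made-up illustrations, not the paper's data):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Hypothetical embedding-model scores for three candidate triples.
scores = np.array([0.9, 0.4, 0.7])

# Illustrative rule constraint: accepting triple 0 forces accepting
# triple 2 (z0 - z2 <= 0), as would follow from an implication rule.
rule = LinearConstraint(np.array([[1.0, 0.0, -1.0]]), -np.inf, 0.0)

res = milp(c=-scores,                # milp minimises, so negate the scores
           integrality=np.ones(3),   # all decision variables are integers
           bounds=Bounds(0, 1),      # ... and binary
           constraints=[rule])
accepted = res.x.round().astype(bool)  # which triples enter the completed KB
```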

Translating Embeddings for Modeling Multi-relational Data

TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities, which proves to be powerful since extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
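
A minimal sketch of the translation idea (the margin-based ranking loss shown is the standard pairing; norm and margin values here are assumptions):

```python
import numpy as np

def transe_score(h, r, t):
    # TransE models a relation as a translation: h + r should be close to t,
    # so the (negative) distance serves as the plausibility score.
    return -np.linalg.norm(h + r - t)

def margin_loss(pos_score, neg_score, margin=1.0):
    # Margin-based ranking loss over a positive and a corrupted triple.
    return max(0.0, margin - pos_score + neg_score)
```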

Traversing Knowledge Graphs in Vector Space

It is demonstrated that compositional training acts as a novel form of structural regularization, reliably improving performance across all base models (reducing errors by up to 43%) and achieving new state-of-the-art results.
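
Under a TransE-style base model, compositional (path) training reduces to adding relation vectors along the path; a hedged sketch:

```python
import numpy as np

def path_query(h, relations):
    # Compositional query under a TransE-style model: traverse the path
    # s -r1-> . -r2-> ... by summing the relation translations, then rank
    # candidate tail entities by distance to the resulting point.
    return h + np.sum(relations, axis=0)
```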

Leveraging the schema in latent factor models for knowledge graph completion

This work proposes a unified method for leveraging additional schema knowledge in latent factor models, with the aim of learning more accurate link prediction models, and experimental evaluations show the effectiveness of the proposed method.

A Review of Relational Machine Learning for Knowledge Graphs

This paper provides a review of how statistical models can be “trained” on large knowledge graphs, and then used to predict new facts about the world (which is equivalent to predicting new edges in the graph) and how such statistical models of graphs can be combined with text-based information extraction methods for automatically constructing knowledge graphs from the Web.

Factorizing YAGO: scalable machine learning for linked data

This work presents an efficient approach to relational learning on LOD data, based on the factorization of a sparse tensor that scales to data consisting of millions of entities, hundreds of relations and billions of known facts, and shows how ontological knowledge can be incorporated into the factorization to improve learning results and how computation can be distributed across multiple nodes.

A semantic matching energy function for learning with multi-relational data

A new neural network architecture designed to embed multi-relational graphs into a flexible continuous vector space in which the original data is kept and enhanced, demonstrating that it can scale up to tens of thousands of nodes and thousands of types of relation.
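
A hedged sketch of the linear form of the semantic matching energy (parameter names are illustrative assumptions; the paper also proposes a bilinear variant):

```python
import numpy as np

def sme_linear_score(e_h, e_r, e_t, Wl1, Wl2, bl, Wr1, Wr2, br):
    # Linear semantic matching energy: combine each entity embedding with
    # the relation embedding, then match the two halves by dot product.
    left = Wl1 @ e_h + Wl2 @ e_r + bl
    right = Wr1 @ e_t + Wr2 @ e_r + br
    return left @ right
```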