Corpus ID: 3875633

KBlrn: End-to-End Learning of Knowledge Base Representations with Latent, Relational, and Numerical Features

@article{GarcaDurn2018KBlrnEL,
  title={KBlrn: End-to-End Learning of Knowledge Base Representations with Latent, Relational, and Numerical Features},
  author={Alberto Garc{\'i}a-Dur{\'a}n and Mathias Niepert},
  journal={ArXiv},
  year={2018},
  volume={abs/1709.04676}
}
We present KBLRN, a framework for end-to-end learning of knowledge base representations from latent, relational, and numerical features. [...] We contribute novel data sets enriching commonly used knowledge base completion benchmarks with numerical features. The data sets are available under a permissive BSD-3 license. We also investigate the impact numerical features have on the KB completion performance of KBLRN.
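The abstract does not spell out how the three feature types are combined, so the following is only a rough sketch under assumed choices: each feature type contributes its own score for a candidate triple (head, relation, tail) and the scores are summed. The function names, shapes, and the radial-basis treatment of numerical attributes are illustrative assumptions, not the authors' exact model.

import numpy as np

# Hypothetical combination of the three feature types mentioned in the abstract.
def latent_score(e_h, w_r, e_t):
    # assumed DistMult-style trilinear product over learned embeddings
    return np.sum(e_h * w_r * e_t)

def relational_score(phi, w_rel):
    # assumed linear model over sparse, hand-engineered relational features
    return np.dot(w_rel, phi)

def numerical_score(num_h, num_t, centers, widths, w_num):
    # assumed radial-basis activation of the difference between the entities'
    # numerical attributes (e.g. birth years), one basis function per attribute
    rbf = np.exp(-((num_h - num_t - centers) ** 2) / (widths ** 2))
    return np.dot(w_num, rbf)

def triple_score(e_h, w_r, e_t, phi, w_rel, num_h, num_t, centers, widths, w_num):
    # the per-feature-type scores are simply summed (assumption)
    return (latent_score(e_h, w_r, e_t)
            + relational_score(phi, w_rel)
            + numerical_score(num_h, num_t, centers, widths, w_num))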
Embedding Multimodal Relational Data for Knowledge Base Completion
TLDR
This paper proposes multimodal knowledge base embeddings (MKBE) that use different neural encoders for the variety of observed data and combine them with existing relational models to learn embeddings of the entities and multimodal data.
Embedding Multimodal Relational Data
TLDR
This work proposes a new approach to represent relational triples, consisting of a subject entity, a relation, and an object entity, by estimating fixed, low-dimensional representations for each entity and relation from observations, thus encoding the uncertainty and inferring missing facts accurately and efficiently.
An overview of embedding models of entities and relationships for knowledge base completion
TLDR
This paper serves as a comprehensive overview of embedding models of entities and relationships for knowledge base completion, summarizing up-to-date experimental results on standard benchmark datasets.
TransBidiFilter: Knowledge Embedding Based on a Bidirectional Filter
TLDR
A knowledge embedding model based on a bidirectional filter, TransBidiFilter, achieves better automatic completion by modifying the standard translation-based loss function and outperforms state-of-the-art semantic discrimination baselines on most metrics across several datasets.
Learning Numerical Attributes in Knowledge Bases
TLDR
This work argues that the numerical values associated with entities explain, to some extent, the relational structure of the knowledge base, and leverages knowledge base embedding methods to learn representations that are useful predictors for the numerical attributes.
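As a purely hypothetical illustration of that idea, one could fit a simple regressor per numerical attribute on top of pre-trained entity embeddings; the ridge-regression choice, the dimensions, and the random data below are made up for the sketch and are not the paper's setup.

import numpy as np

# Toy sketch: predict a numerical attribute (e.g. population) from KB embeddings.
rng = np.random.default_rng(0)
entity_emb = rng.normal(size=(1000, 64))             # learned entity embeddings (assumed given)
known_idx = rng.choice(1000, size=300, replace=False)
known_vals = rng.normal(size=300)                    # attribute values observed for some entities

# closed-form ridge regression from embeddings to the attribute
X, y, lam = entity_emb[known_idx], known_vals, 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(64), X.T @ y)

predicted = entity_emb @ w                           # attribute predictions for all entities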
Differentiable Reasoning on Large Knowledge Bases and Natural Language
TLDR
Greedy NTPs are proposed, an extension to NTPs that addresses their complexity and scalability limitations and makes them applicable to real-world datasets, together with a novel approach for jointly reasoning over KBs and textual mentions by embedding logic facts and natural language sentences in a shared embedding space.
Application of concepts of neighbours to knowledge graph completion
TLDR
This work proposes a new approach that does not need a training phase and can provide interpretable explanations for each inference; it relies on the computation of Concepts of Nearest Neighbours to identify clusters of similar entities based on common graph patterns.
Embedding cardinality constraints in neural link predictors
TLDR
A new regularisation approach is proposed to incorporate relation cardinality constraints into any existing neural link predictor without affecting its efficiency or scalability; structuring the embedding space to respect commonsense cardinality assumptions results in better representations.
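One plausible reading of such a constraint, sketched purely under assumptions (the hinge form, the use of an expected count, and all values below are illustrative, not the paper's formulation): penalise the model whenever the expected number of predicted tails for a (head, relation) pair exceeds that relation's known maximum cardinality.

import numpy as np

def cardinality_penalty(tail_probs, max_cardinality):
    # tail_probs: a link predictor's probabilities for every candidate tail of one (head, relation)
    expected_count = np.sum(tail_probs)
    return max(0.0, expected_count - max_cardinality) ** 2

probs = np.array([0.9, 0.8, 0.7, 0.1])                  # toy predictor outputs
print(cardinality_penalty(probs, max_cardinality=1))    # penalised: ~2.5 expected tails for a 1-to-1 relation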
Text-Enhanced Knowledge Representation Learning Based on Gated Convolutional Networks
TLDR
A text-enhanced KG model based on a gated convolutional network (GConvTE) learns entity descriptions and symbolic triples jointly through feature fusion and achieves better link prediction performance than previous state-of-the-art embedding models on two benchmark datasets.
Rule-Guided Compositional Representation Learning on Knowledge Graphs
TLDR
This paper proposes a novel Rule and Path-based Joint Embedding (RPJE) scheme, which takes full advantage of the explainability and accuracy of logic rules, the generalization of KG embedding as well as the supplementary semantic structure of paths.

References

Showing 1-10 of 39 references
Embedding Multimodal Relational Data
TLDR
This work proposes a new approach to represent relational triples, consisting of a subject entity, a relation, and an object entity, by estimating fixed, low-dimensional representations for each entity and relation from observations, thus encoding the uncertainty and inferring missing facts accurately and efficiently.
Discriminative Gaifman Models
TLDR
Gaifman models sample neighborhoods of knowledge bases so as to make the learned relational models more robust to missing objects and relations, which is a common situation in open-world KBs.
Representing Text for Joint Embedding of Text and Knowledge Bases
TLDR
A model is proposed that captures the compositional structure of textual relations and jointly optimizes entity, knowledge base, and textual relation representations; it significantly improves performance over a model that does not share parameters among textual relations with common sub-structure.
Combining Two And Three-Way Embeddings Models for Link Prediction in Knowledge Bases
TLDR
This paper proposes TATEC, a happy medium obtained by complementing a high-capacity model with a simpler one, both pre-trained separately and then combined; this approach outperforms existing methods on different types of relationships, achieving state-of-the-art results on four benchmarks from the literature.
Reasoning With Neural Tensor Networks for Knowledge Base Completion
TLDR
An expressive neural tensor network suitable for reasoning over relationships between two entities given a subset of the knowledge base is introduced; performance improves further when entities are represented as an average of their constituent word vectors.
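A minimal sketch of a neural tensor network score in the spirit of that summary; the dimensions, the tanh nonlinearity, and the random parameters are assumptions chosen only for illustration.

import numpy as np

d, k = 8, 4                                    # embedding size, number of tensor slices (assumed)
rng = np.random.default_rng(1)
W = rng.normal(size=(k, d, d))                 # relation-specific bilinear tensor
V = rng.normal(size=(k, 2 * d))                # relation-specific linear map
b = rng.normal(size=k)
u = rng.normal(size=k)

def entity_vec(word_vecs):
    # entity representation as the average of its constituent word vectors
    return np.mean(word_vecs, axis=0)

def ntn_score(e1, e2):
    # bilinear tensor term plus a standard feed-forward term
    bilinear = np.array([e1 @ W[i] @ e2 for i in range(k)])
    return u @ np.tanh(bilinear + V @ np.concatenate([e1, e2]) + b)

e1 = entity_vec(rng.normal(size=(2, d)))       # e.g. a two-word entity name
e2 = entity_vec(rng.normal(size=(1, d)))
print(ntn_score(e1, e2))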
Modeling Relational Data with Graph Convolutional Networks
TLDR
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
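A rough sketch of that encoder/decoder split, with basis decomposition, self-loop details, and layer stacking omitted; everything here is an illustrative assumption rather than the paper's exact architecture.

import numpy as np

def rgcn_layer(H, adj_per_relation, W_self, W_rel):
    # H: (num_entities, d) input features; adj_per_relation: one (n, n) adjacency matrix per relation
    out = H @ W_self
    for A, W_r in zip(adj_per_relation, W_rel):
        deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)   # per-relation neighbour normalisation
        out += (A / deg) @ H @ W_r
    return np.maximum(out, 0.0)                                # ReLU

def distmult_score(H, head, rel_diag, tail):
    # DistMult decoder applied to the encoder's entity representations
    return np.sum(H[head] * rel_diag * H[tail])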
Learning Multi-Relational Semantics Using Neural-Embedding Models
TLDR
The results show several interesting findings, enabling the design of a simple embedding model that achieves new state-of-the-art performance on a popular knowledge base completion task evaluated on Freebase.
Knowledge Base Completion: Baselines Strike Back
TLDR
It is shown that almost all models published on the FB15k dataset can be outperformed by an appropriately tuned baseline: the authors' reimplementation of the DistMult model.
Translating Embeddings for Modeling Multi-relational Data
TLDR
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities; extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
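The translation idea in that summary maps directly to a scoring function: a plausible triple (h, r, t) should satisfy h + r ≈ t. The sketch below uses the usual L2 distance and a margin ranking loss against a corrupted triple; the specific margin value is an arbitrary illustrative choice.

import numpy as np

def transe_score(h, r, t):
    # higher score = more plausible; the relation acts as a translation from head to tail
    return -np.linalg.norm(h + r - t)

def margin_loss(pos, neg, margin=1.0):
    # rank the observed triple above a corrupted one by at least the margin
    (h, r, t), (h_n, r_n, t_n) = pos, neg
    return max(0.0, margin - transe_score(h, r, t) + transe_score(h_n, r_n, t_n))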