Corpus ID: 14941970

Translating Embeddings for Modeling Multi-relational Data

@inproceedings{Bordes2013TranslatingEF,
  title={Translating Embeddings for Modeling Multi-relational Data},
  author={Antoine Bordes and Nicolas Usunier and Alberto Garc{\'i}a-Dur{\'a}n and Jason Weston and Oksana Yakhnenko},
  booktitle={NIPS},
  year={2013}
}
We consider the problem of embedding entities and relationships of multi-relational data in low-dimensional vector spaces. Our objective is to propose a canonical model which is easy to train, contains a reduced number of parameters and can scale up to very large databases. Hence, we propose TransE, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities. Despite its simplicity, this assumption proves to be powerful since extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases. Besides, it can be successfully trained on a large scale data set with 1M entities, 25k relationships and more than 17M training samples.
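The abstract's translation idea is simple enough to sketch directly: a triplet (h, r, t) is scored by the distance d(h + r, t), and training minimizes a margin-based ranking loss against corrupted triplets. The NumPy sketch below is illustrative only; the toy sizes, the initialization scale and the single-negative loss are assumptions, not the paper's exact training setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes; the paper scales this to 1M entities and 25k relationships.
n_entities, n_relations, dim = 1000, 50, 20

# Embedding tables for entities and relations (initialization scheme assumed).
E = rng.uniform(-6 / np.sqrt(dim), 6 / np.sqrt(dim), (n_entities, dim))
R = rng.uniform(-6 / np.sqrt(dim), 6 / np.sqrt(dim), (n_relations, dim))

def energy(h, r, t):
    """TransE energy d(h + r, t): small when tail ~ head + relation."""
    return np.linalg.norm(E[h] + R[r] - E[t], ord=2)

def margin_loss(pos, neg, margin=1.0):
    """Margin-based ranking loss over one positive triplet and one
    corrupted triplet (head or tail replaced by a random entity)."""
    h, r, t = pos
    h2, _, t2 = neg
    return max(0.0, margin + energy(h, r, t) - energy(h2, r, t2))

# Example: score a true triplet against a corrupted tail.
pos = (0, 3, 42)
neg = (0, 3, rng.integers(n_entities))  # corrupted tail
print(margin_loss(pos, neg))
```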

Citations

Learning Multi-Relational Semantics Using Neural-Embedding Models
TLDR
The results show several interesting findings, enabling the design of a simple embedding model that achieves new state-of-the-art performance on a popular knowledge base completion task evaluated on Freebase.
Topic-Based Embeddings for Learning from Large Knowledge Graphs
TLDR
A scalable probabilistic framework for learning from multi-relational data, given in the form of entity-relation-entity triplets with a potentially massive number of entities and relations; it yields excellent predictive performance, and the interpretability of the topic-based embedding framework enables easy qualitative analyses.
Leveraging Lexical Resources for Learning Entity Embeddings in Multi-Relational Data
TLDR
A simple approach that leverages the descriptions of entities or phrases available in lexical resources, in conjunction with distributional semantics, to derive a better initialization for training relational models, resulting in significant new state-of-the-art performance on the WordNet dataset.
Initializing Entity Representations in Relational Models
TLDR
This work proposes a simple trick that leverages the descriptions of entities or phrases available in lexical resources, in conjunction with distributional semantics, in order to derive a better initialization for training relational models, and applies it to the TransE model.
Analogical Inference for Multi-relational Embeddings
TLDR
This paper proposes a novel framework for optimizing the latent representations with respect to the analogical properties of the embedded entities and relations, and offers an elegant unification of several well-known methods in multi-relational embedding.
Knowledge Graph Embedding for Hyper-Relational Data
TLDR
The novel model TransHR is proposed, which transforms the hyper-relations in a pair of entities into an individual vector serving as a translation between them, and significantly outperforms Trans(E, H, R) and CTransR, especially for hyper-relational data.
Knowledge Graph Completion for Hyper-relational Data
TLDR
A novel model named TransHR is proposed, which transforms the vectors of hyper-relations between a pair of entities into an individual vector acting as a translation between them, and significantly outperforms Trans(E, H, R) and CTransR, especially for hyper-relational data.
Embedding Multimodal Relational Data for Knowledge Base Completion
TLDR
This paper proposes multimodal knowledge base embeddings (MKBE) that use different neural encoders for this variety of observed data and combine them with existing relational models to learn embeddings of the entities and multimodal data.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics, and that the composition of relations is characterized by matrix multiplication (see the sketch after this list).
TransEdge: Translating Relation-Contextualized Embeddings for Knowledge Graphs
TLDR
A novel edge-centric embedding model TransEdge is proposed, which contextualizes relation representations in terms of specific head-tail entity pairs and interprets them as translations between entity embeddings.
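To make the bilinear objective in "Embedding Entities and Relations for Learning and Inference in Knowledge Bases" concrete: each relation is represented by a matrix, a triplet is scored as h^T M_r t, and composing two relations corresponds to multiplying their matrices. The sketch below is a minimal illustration under assumed shapes and random data, not that paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 20

# Entity vectors and one matrix per relation (illustrative shapes).
h, t = rng.normal(size=dim), rng.normal(size=dim)
M_r1 = rng.normal(size=(dim, dim))
M_r2 = rng.normal(size=(dim, dim))

def bilinear_score(h, M, t):
    """Bilinear score h^T M t; higher means the triplet is more plausible."""
    return h @ M @ t

# Composing relation r1 followed by r2 amounts to multiplying their
# matrices, which is the observation the TLDR above refers to.
M_composed = M_r1 @ M_r2
print(bilinear_score(h, M_r1, t), bilinear_score(h, M_composed, t))
```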

References

SHOWING 1-10 OF 18 REFERENCES
A Three-Way Model for Collective Learning on Multi-Relational Data
TLDR
This work presents a novel approach to relational learning based on the factorization of a three-way tensor, which is able to perform collective learning via the latent components of the model, and provides an efficient algorithm to compute the factorization.
A semantic matching energy function for learning with multi-relational data
TLDR
A new neural network architecture designed to embed multi-relational graphs into a flexible continuous vector space in which the original data is kept and enhanced, demonstrating that it can scale up to tens of thousands of nodes and thousands of types of relation.
Connecting Language and Knowledge Bases with Embedding Models for Relation Extraction
This paper proposes a novel approach for relation extraction from free text which is trained to jointly use information from the text and from existing knowledge. Our model is based on scoring …
Learning New Facts From Knowledge Bases With Neural Tensor Networks and Semantic Word Vectors
TLDR
A neural tensor network (NTN) model is introduced which predicts new relationship entries that can be added to the database and can classify unseen relationships in WordNet with an accuracy of 75.8%.
Factorizing YAGO: scalable machine learning for linked data
TLDR
This work presents an efficient approach to relational learning on LOD data, based on the factorization of a sparse tensor that scales to data consisting of millions of entities, hundreds of relations and billions of known facts, and shows how ontological knowledge can be incorporated in the factorizations to improve learning results and how computation can be distributed across multiple nodes.
Learning Structured Embeddings of Knowledge Bases
TLDR
A learning process based on an innovative neural network architecture is designed to embed these symbolic representations into a more flexible continuous vector space in which the original knowledge is kept and enhanced, allowing data from any KB to be easily used in recent machine learning methods for prediction and information retrieval.
Relational learning via collective matrix factorization
TLDR
This model generalizes several existing matrix factorization methods, and therefore yields new large-scale optimization algorithms for these problems, which can handle any pairwise relational schema and a wide variety of error models.
Modelling Relational Data using Bayesian Clustered Tensor Factorization
TLDR
The Bayesian Clustered Tensor Factorization (BCTF) model is introduced, which embeds a factorized representation of relations in a nonparametric Bayesian clustering framework that is fully Bayesian but scales well to large data sets.
Nonparametric Latent Feature Models for Link Prediction
TLDR
This work pursues a similar approach with a richer kind of latent variable, latent features, using a Bayesian nonparametric approach to simultaneously infer the number of features and learn which entities have each feature, and combines these inferred features with known covariates in order to perform link prediction.
Distributed Representations of Words and Phrases and their Compositionality
TLDR
This paper presents a simple method for finding phrases in text, and shows that learning good vector representations for millions of phrases is possible and describes a simple alternative to the hierarchical softmax called negative sampling.
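The negative sampling mentioned in the entry above is close in spirit to TransE's corrupted triplets: both replace an expensive normalization with a few sampled negatives. Below is a minimal sketch of the skip-gram negative-sampling loss; the initialization scale and the uniform noise distribution are simplifying assumptions (the paper samples from a smoothed unigram distribution).

```python
import numpy as np

rng = np.random.default_rng(2)
vocab, dim, k = 5000, 50, 5  # vocabulary size, embedding dim, negatives per pair

# Input and output embedding tables (initialization scale assumed).
W_in = rng.normal(scale=0.1, size=(vocab, dim))
W_out = rng.normal(scale=0.1, size=(vocab, dim))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_loss(center, context, negatives):
    """Negative-sampling objective for one (center, context) pair:
    push the true pair's score up and k noise words' scores down."""
    pos = np.log(sigmoid(W_out[context] @ W_in[center]))
    neg = np.sum(np.log(sigmoid(-W_out[negatives] @ W_in[center])))
    return -(pos + neg)

# Example: one skip-gram pair with k noise words drawn uniformly.
print(neg_sampling_loss(10, 42, rng.integers(vocab, size=k)))
```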