Translating Embeddings for Modeling Multi-relational Data
@inproceedings{Bordes2013TranslatingEF,
  title     = {Translating Embeddings for Modeling Multi-relational Data},
  author    = {Antoine Bordes and Nicolas Usunier and Alberto Garc{\'i}a-Dur{\'a}n and Jason Weston and Oksana Yakhnenko},
  booktitle = {NIPS},
  year      = {2013}
}
We consider the problem of embedding entities and relationships of multi-relational data in low-dimensional vector spaces. Key Result: the model can be successfully trained on a large-scale data set with 1M entities, 25k relationships and more than 17M training samples.
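The translation idea behind TransE can be sketched in a few lines: a true triple (head, relation, tail) should satisfy head + relation ≈ tail, and training minimizes a margin-based ranking loss against corrupted triples. This is a minimal illustration with made-up dimensions and margin, not the paper's hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 50  # embedding dimension (illustrative)

def score(h, r, t):
    """Dissimilarity d(h + r, t); lower means more plausible (L2 here)."""
    return np.linalg.norm(h + r - t)

# A "true" triple constructed so that h + r == t exactly,
# and a corrupted tail drawn at random.
h = rng.normal(size=k)
r = rng.normal(size=k)
t = h + r
corrupt_t = rng.normal(size=k)

# Margin-based ranking loss: push the positive score below the
# negative score by at least the margin.
margin = 1.0
loss = max(0.0, margin + score(h, r, t) - score(h, r, corrupt_t))
```

Here the positive triple scores 0 by construction, so the hinge loss is already zero; in training, gradient steps move real embeddings toward this regime.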
3,935 Citations
Learning Multi-Relational Semantics Using Neural-Embedding Models
- Computer Science, ArXiv
- 2014
The results show several interesting findings, enabling the design of a simple embedding model that achieves the new state-of-the-art performance on a popular knowledge base completion task evaluated on Freebase.
Topic-Based Embeddings for Learning from Large Knowledge Graphs
- Computer Science, AISTATS
- 2016
A scalable probabilistic framework for learning from multi-relational data, given in the form of entity-relation-entity triplets with a potentially massive number of entities and relations, which yields excellent predictive performance; the interpretability of the topic-based embedding framework also enables easy qualitative analyses.
Leveraging Lexical Resources for Learning Entity Embeddings in Multi-Relational Data
- Computer Science, ACL
- 2016
A simple approach that leverages the descriptions of entities or phrases available in lexical resources, in conjunction with distributional semantics, to derive a better initialization for training relational models, resulting in significant new state-of-the-art performance on the WordNet dataset.
Initializing Entity Representations in Relational Models
- Computer Science
- 2016
This work proposes a simple trick that leverages the descriptions of entities or phrases available in lexical resources, in conjunction with distributional semantics, in order to derive a better initialization for training relational models, and applies it to the TransE model.
Analogical Inference for Multi-relational Embeddings
- Computer Science, ICML
- 2017
This paper proposes a novel framework for optimizing the latent representations with respect to the analogical properties of the embedded entities and relations, and offers an elegant unification of several well-known methods in multi-relational embedding.
Knowledge Graph Embedding for Hyper-Relational Data
- Computer Science
- 2017
The novel model TransHR is proposed, which transforms the hyper-relations in a pair of entities into an individual vector, serving as a translation between them, and significantly outperforms Trans(E, H, R) and CTransR, especially for hyper-relational data.
Knowledge Graph Completion for Hyper-relational Data
- Computer Science, BigCom
- 2016
A novel model named TransHR is proposed, which transforms the vectors of hyper-relations between a pair of entities into an individual vector acting as a translation between them, which significantly outperforms Trans(E, H, R) and CTransR especially for hyper-relational data.
Embedding Multimodal Relational Data for Knowledge Base Completion
- Computer Science, EMNLP
- 2018
This paper proposes multimodal knowledge base embeddings (MKBE) that use different neural encoders for this variety of observed data, and combines them with existing relational models to learn embeddings of the entities and multimodal data.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
- Computer Science, ICLR
- 2015
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
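The bilinear objective mentioned above scores a triple as h^T M_r t, and the finding that relation composition behaves like matrix multiplication can be sketched directly; this is an illustrative toy (dimensions and names made up), not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
k = 4  # embedding dimension (illustrative)

def bilinear_score(h, M_r, t):
    """Score a triple as h^T M_r t; higher means more plausible."""
    return h @ M_r @ t

h = rng.normal(size=k)
t = rng.normal(size=k)
M_r1 = rng.normal(size=(k, k))  # relation r1 as a k x k matrix
M_r2 = rng.normal(size=(k, k))  # relation r2 as a k x k matrix

# Composing r1 with r2 is characterized by multiplying their
# matrices: the composed relation is scored with M_r1 @ M_r2.
M_composed = M_r1 @ M_r2
s = bilinear_score(h, M_composed, t)
```

The paper's DistMult variant restricts M_r to a diagonal matrix, trading expressiveness for far fewer parameters.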
TransEdge: Translating Relation-Contextualized Embeddings for Knowledge Graphs
- Computer Science, SEMWEB
- 2019
A novel edge-centric embedding model TransEdge is proposed, which contextualizes relation representations in terms of specific head-tail entity pairs and interprets them as translations between entity embeddings.
References
Showing 1-10 of 18 references
A latent factor model for highly multi-relational data
- Computer Science, NIPS
- 2012
This paper proposes a method for modeling large multi-relational datasets, with possibly thousands of relations, based on a bilinear structure, which captures various orders of interaction of the data and also shares sparse latent factors across different relations.
A Three-Way Model for Collective Learning on Multi-Relational Data
- Computer Science, ICML
- 2011
This work presents a novel approach to relational learning based on the factorization of a three-way tensor that is able to perform collective learning via the latent components of the model and provide an efficient algorithm to compute the factorizations.
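The three-way factorization described above approximates each relation's adjacency slice X_k by A R_k Aᵀ, sharing one entity factor matrix A across all relations; that sharing is what enables collective learning. A minimal sketch of the reconstruction, with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 6, 2  # n entities, rank-r latent factors (made up)

A = rng.normal(size=(n, r))    # one latent vector per entity, shared
R_k = rng.normal(size=(r, r))  # one core matrix per relation k

# Reconstructed slice for relation k: entry (i, j) predicts whether
# entity i is related to entity j under relation k.
X_k_hat = A @ R_k @ A.T
```

Because A is shared, evidence observed under one relation refines the entity factors used to predict every other relation.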
A semantic matching energy function for learning with multi-relational data
- Computer Science, Machine Learning
- 2013
A new neural network architecture designed to embed multi-relational graphs into a flexible continuous vector space in which the original data is kept and enhanced, demonstrating that it can scale up to tens of thousands of nodes and thousands of types of relation.
Connecting Language and Knowledge Bases with Embedding Models for Relation Extraction
- Computer Science, EMNLP
- 2013
This paper proposes a novel approach for relation extraction from free text which is trained to jointly use information from the text and from existing knowledge. Our model is based on scoring…
Learning New Facts From Knowledge Bases With Neural Tensor Networks and Semantic Word Vectors
- Computer Science, ICLR
- 2013
A neural tensor network (NTN) model is introduced which predicts new relationship entries that can be added to the database and can classify unseen relationships in WordNet with an accuracy of 75.8%.
Factorizing YAGO: scalable machine learning for linked data
- Computer Science, WWW
- 2012
This work presents an efficient approach to relational learning on LOD data, based on the factorization of a sparse tensor that scales to data consisting of millions of entities, hundreds of relations and billions of known facts, and shows how ontological knowledge can be incorporated in the factorizations to improve learning results and how computation can be distributed across multiple nodes.
Learning Structured Embeddings of Knowledge Bases
- Computer Science, AAAI
- 2011
A learning process based on an innovative neural network architecture embeds these symbolic representations into a more flexible continuous vector space in which the original knowledge is kept and enhanced, allowing data from any KB to be easily used in recent machine learning methods for prediction and information retrieval.
Relational learning via collective matrix factorization
- Computer Science, KDD
- 2008
This model generalizes several existing matrix factorization methods, and therefore yields new large-scale optimization algorithms for these problems, which can handle any pairwise relational schema and a wide variety of error models.
Modelling Relational Data using Bayesian Clustered Tensor Factorization
- Computer Science, NIPS
- 2009
The Bayesian Clustered Tensor Factorization (BCTF) model is introduced, which embeds a factorized representation of relations in a nonparametric Bayesian clustering framework that is fully Bayesian but scales well to large data sets.
Nonparametric Latent Feature Models for Link Prediction
- Computer Science, NIPS
- 2009
This work pursues a similar approach with a richer kind of latent variable, latent features, using a Bayesian nonparametric approach to simultaneously infer the number of features and learn which entities have each feature, and combines these inferred features with known covariates in order to perform link prediction.