Corpus ID: 2949428

Learning Entity and Relation Embeddings for Knowledge Graph Completion

@inproceedings{Lin2015LearningEA,
  title={Learning Entity and Relation Embeddings for Knowledge Graph Completion},
  author={Yankai Lin and Zhiyuan Liu and Maosong Sun and Yang Liu and Xuan Zhu},
  booktitle={AAAI},
  year={2015}
}
Knowledge graph completion aims to perform link prediction between entities. [...] Key method: afterwards, we learn embeddings by first projecting entities from the entity space into the corresponding relation space and then building translations between the projected entities. In experiments, we evaluate our models on three tasks: link prediction, triple classification and relational fact extraction. Experimental results show significant and consistent improvements compared to state-of-the-art baselines including…
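The projection-then-translation scheme described above can be read as a scoring function: each relation has its own space and a projection matrix mapping entity embeddings into it, and a plausible triple (h, r, t) should satisfy h·M_r + r ≈ t·M_r. The numpy sketch below illustrates that score; the variable names, dimensions and the squared L2 norm are assumptions for illustration, not the paper's reference implementation.

import numpy as np

# Minimal sketch of a TransR-style score (names and dimensions are assumptions).
# h, t: entity embeddings, shape (k,); r: relation embedding, shape (d,);
# M_r: relation-specific projection matrix, shape (k, d). Lower score = more plausible.
def transr_score(h, t, r, M_r):
    h_r = h @ M_r                        # project head entity into the relation space
    t_r = t @ M_r                        # project tail entity into the relation space
    return np.sum((h_r + r - t_r) ** 2)  # squared L2 distance after translation

# Toy usage with random vectors (illustrative only).
rng = np.random.default_rng(0)
k, d = 50, 30
h, t = rng.normal(size=k), rng.normal(size=k)
r, M_r = rng.normal(size=d), rng.normal(size=(k, d))
print(transr_score(h, t, r, M_r))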
Separate Spaces Framework for Entities and Relations Embeddings in Knowledge Graph
TLDR: A new relation embedding that distinguishes head and tail entities is proposed, making it possible for an entity to carry different semantic meanings in the head and tail positions; the approach performs well on typical knowledge graph tasks such as link prediction.
Cross-Projection for Embedding Translation in Knowledge Graph Completion
TLDR: A new model named Cross-projected Translation Embedding (CpTE) is proposed based on translation theory; it assumes that a relation is affected by its related entities and transmits effects between the two end entities.
Incorporating Domain and Range of Relations for Knowledge Graph Completion
TLDR: Experimental results show that the TransX_C method outperforms the corresponding translation-based model, indicating the effectiveness of incorporating the domain and range of relations into link prediction.
TransT: Type-Based Multiple Embedding Representations for Knowledge Graph Completion
TLDR: This work proposes an approach that integrates structured information with entity types, which describe the categories of entities, and utilizes type-based semantic similarity of the related entities and relations to capture prior distributions of entities and relations.
Translating Embeddings for Knowledge Graph Completion Utilizing Type Correlations
Knowledge graph embedding aims to project entities and relations into a continuous low-dimensional space. Typical structure-based embedding methods, such as TransE, are widely used on knowledge graph…
Entity-Context and Relation-Context Combined Knowledge Graph Embeddings
  • Yong Wu, Wei Li, Xiaoming Fan, Binjun Wang
  • Computer Science
  • Arabian Journal for Science and Engineering
  • 2021
TLDR: A novel knowledge graph embedding model, Entity-context and Relation-context combined Knowledge Graph Embeddings (ERKE), is proposed, in which each relation is defined as a rotation with variable modulus from the source entity to the target entity in the polar coordinate system.
ProjR: Embedding Structure Diversity for Knowledge Graph Completion
TLDR: A new embedding method, ProjR, is proposed which combines TransR and ProjE to achieve diverse representations for entities in different relation contexts and entity positions by defining a unique combination operator for each relation.
TransMS: Knowledge Graph Embedding for Complex Relations by Multidirectional Semantics
TLDR: This model translates and transmits multidirectional semantics from head/tail entities and relations to tail/head entities with nonlinear functions, and from entities to relations with linear bias vectors, which results in better scalability on large-scale knowledge graphs.
Learning to Compose Relational Embeddings in Knowledge Graphs
TLDR: This work introduces relation composition as the task of inferring embeddings for unseen relations by combining existing relations in a knowledge graph, and proposes a supervised method to compose relational embeddings for novel relations using pre-trained relation embeddings for existing relations.
Knowledge Graph Embedding by Dynamic Translation
TLDR: This paper proposes a novel dynamic translation principle which supports flexible translation between the embeddings of entities and relations, and uses this principle to improve the TransE, TransR and TranSparse models, building new models named TransE-DT, TransR-DT and TranSparse-DT respectively.

References

Showing 1-10 of 23 references
Modeling Relations and Their Mentions without Labeled Text
TLDR: A novel approach to distant supervision that alleviates the problem of noisy patterns hurting precision, using a factor graph and constraint-driven semi-supervision to train the model without any knowledge of which sentences express the relations in the training KB.
Reasoning With Neural Tensor Networks for Knowledge Base Completion
TLDR: An expressive neural tensor network suitable for reasoning over relationships between two entities given a subset of the knowledge base is introduced; performance can be improved further when entities are represented as the average of their constituent word vectors.
Translating Embeddings for Modeling Multi-relational Data
TLDR: TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities; extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
Knowledge Graph Embedding by Translating on Hyperplanes
TLDR: This paper proposes TransH, which models a relation as a hyperplane together with a translation operation on it and preserves the mapping properties of relations with almost the same model complexity as TransE.
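For comparison with the projection-based approach above, the two translation-based baselines summarized in these two references can be sketched as scoring functions: TransE treats the relation as a translation in a single shared space, while TransH first projects the entities onto a relation-specific hyperplane before translating. The numpy sketch below uses assumed variable names and an L2 norm; it only illustrates the usual formulations of these scores.

import numpy as np

# TransE: relation vector r acts as a translation in the shared embedding space.
def transe_score(h, r, t):
    return np.linalg.norm(h + r - t)          # lower = more plausible

# TransH: project entities onto the hyperplane with normal w_r, then translate by d_r.
def transh_score(h, t, w_r, d_r):
    w = w_r / np.linalg.norm(w_r)             # unit normal of the relation hyperplane
    h_p = h - (h @ w) * w                     # head projected onto the hyperplane
    t_p = t - (t @ w) * w                     # tail projected onto the hyperplane
    return np.linalg.norm(h_p + d_r - t_p)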
Multi-instance Multi-label Learning for Relation Extraction
TLDR: This work proposes a novel approach to multi-instance multi-label learning for relation extraction, which jointly models all the instances of an entity pair in text and all their labels using a graphical model with latent variables, and performs competitively on two difficult domains.
Connecting Language and Knowledge Bases with Embedding Models for Relation Extraction
This paper proposes a novel approach for relation extraction from free text which is trained to jointly use information from the text and from existing knowledge. Our model is based on scoring…
Distant supervision for relation extraction without labeled data
TLDR: This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms, and allowing the use of corpora of any size.
A semantic matching energy function for learning with multi-relational data
TLDR: A new neural network architecture designed to embed multi-relational graphs into a flexible continuous vector space in which the original data is kept and enhanced; it is demonstrated to scale up to tens of thousands of nodes and thousands of relation types.
Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations
TLDR: A novel approach to multi-instance learning with overlapping relations that combines a sentence-level extraction model with a simple, corpus-level component for aggregating the individual facts is presented.
Learning Structured Embeddings of Knowledge Bases
TLDR: A learning process based on a neural network architecture designed to embed symbolic representations into a more flexible continuous vector space in which the original knowledge is kept and enhanced; this would allow data from any KB to be easily used in recent machine learning methods for prediction and information retrieval.