# A Joint Framework for Inductive Representation Learning and Explainable Reasoning in Knowledge Graphs

```bibtex
@article{Bhowmik2020AJF,
  title   = {A Joint Framework for Inductive Representation Learning and Explainable Reasoning in Knowledge Graphs},
  author  = {Rajarshi Bhowmik and Gerard de Melo},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2005.00637}
}
```

Despite their large-scale coverage, existing cross-domain knowledge graphs invariably suffer from inherent incompleteness and sparsity, necessitating link prediction: inferring a target entity given a source entity and a query relation. Recent approaches broadly fall into two categories: embedding-based approaches and path-based approaches. In contrast to embedding-based approaches, which operate in an uninterpretable latent semantic vector space of entities and…
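To make the link prediction task concrete, here is a minimal sketch of how an embedding-based approach answers a query (source entity, relation, ?): score every candidate target and rank. The scoring function below is TransE-style translation (one of the simplest embedding models, used here only for illustration); the entity and relation names are invented, and the embeddings are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy embedding tables; entity/relation names are illustrative, not from the paper.
entities = {name: rng.normal(size=dim) for name in ["paris", "france", "berlin", "germany"]}
relations = {"capital_of": rng.normal(size=dim)}

def transe_score(source, relation, target):
    """Lower is better: distance between the translated source and the target."""
    return np.linalg.norm(entities[source] + relations[relation] - entities[target])

def predict_target(source, relation):
    """Answer (source, relation, ?) by ranking all candidate entities."""
    return min(entities, key=lambda t: transe_score(source, relation, t))

print(predict_target("paris", "capital_of"))
```

With trained embeddings, the top-ranked candidate would ideally be the missing target entity; with random ones, the ranking is arbitrary, but the query-answering mechanics are the same.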

#### References

Showing 1–10 of 32 references

Compositional Learning of Embeddings for Relation Paths in Knowledge Base and Text

- Computer Science
- ACL
- 2016

This paper proposes the first exact dynamic programming algorithm that enables efficient incorporation of all relation paths of bounded length, while modeling both relation types and intermediate nodes in the compositional path representations.

Modeling Relational Data with Graph Convolutional Networks

- Computer Science, Mathematics
- ESWC
- 2018

It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.

Compositional Vector Space Models for Knowledge Base Completion

- Computer Science, Mathematics
- ACL
- 2015

This paper presents an approach that reasons about conjunctions of multi-hop relations non-atomically, composing the implications of a path using a recurrent neural network (RNN) that takes as input vector embeddings of the binary relations in the path.
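The composition idea in this summary can be sketched as folding the relation embeddings of a path through an RNN cell to obtain a single path representation. The cell below is a plain untrained `tanh` recurrence, an assumption for illustration rather than the exact architecture of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 6

# Illustrative relation embeddings along a two-hop path (e.g. born_in -> located_in).
path = [rng.normal(size=dim), rng.normal(size=dim)]

# Simple untrained RNN cell: h_t = tanh(W h_{t-1} + U r_t).
W = rng.normal(scale=0.1, size=(dim, dim))
U = rng.normal(scale=0.1, size=(dim, dim))

def compose_path(relation_embs):
    """Fold the relation embeddings of a path into one composite vector."""
    h = np.zeros(dim)
    for r in relation_embs:
        h = np.tanh(W @ h + U @ r)
    return h

composite = compose_path(path)
print(composite.shape)  # a single vector representing the whole path
```

The composite vector can then be compared against single-relation embeddings to decide whether the path implies a direct relation between its endpoints.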

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

- Computer Science
- ICLR
- 2015

It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
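Both claims in this summary can be checked on paper with a tiny numeric sketch: a DistMult-style bilinear score with a diagonal relation matrix, where composing two relations by matrix multiplication reduces to an elementwise product of their parameters. All vectors here are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)
dim = 5

s, t = rng.normal(size=dim), rng.normal(size=dim)    # subject/object entity embeddings
r1, r2 = rng.normal(size=dim), rng.normal(size=dim)  # diagonal relation parameters

def distmult_score(s, r, t):
    """Bilinear score with a diagonal relation matrix: s^T diag(r) t."""
    return float(np.sum(s * r * t))

# Composition by matrix multiplication: diag(r1) @ diag(r2) == diag(r1 * r2),
# so the composed relation is just the elementwise product of the parameters.
composed = r1 * r2
assert np.isclose(distmult_score(s, composed, t),
                  float(s @ (np.diag(r1) @ np.diag(r2)) @ t))
```

This diagonal structure is what makes DistMult cheap to train, at the cost of scoring symmetric patterns for all relations.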

Learning Entity and Relation Embeddings for Knowledge Graph Completion

- Computer Science
- AAAI
- 2015

TransR is proposed to build entity and relation embeddings in separate entity and relation spaces, learning translations between projected entities; the models are evaluated on three tasks: link prediction, triple classification, and relational fact extraction.
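A minimal sketch of the TransR scoring function, assuming random untrained parameters: each relation carries its own projection matrix that maps entities from entity space into the relation's space, where a translation distance is measured.

```python
import numpy as np

rng = np.random.default_rng(3)
ent_dim, rel_dim = 6, 4

h, t = rng.normal(size=ent_dim), rng.normal(size=ent_dim)  # entity-space embeddings
M_r = rng.normal(size=(rel_dim, ent_dim))                  # relation-specific projection
r = rng.normal(size=rel_dim)                               # relation-space translation

def transr_score(h, r, t, M_r):
    """Project both entities into the relation space, then measure ||h_r + r - t_r||."""
    h_r, t_r = M_r @ h, M_r @ t
    return float(np.linalg.norm(h_r + r - t_r))

print(transr_score(h, r, t, M_r))
```

The per-relation projection lets entities take different representations under different relations, which a single shared space (as in TransE) cannot express.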

Traversing Knowledge Graphs in Vector Space

- Computer Science, Mathematics
- EMNLP
- 2015

It is demonstrated that compositional training acts as a novel form of structural regularization, reliably improving performance across all base models (reducing errors by up to 43%) and achieving new state-of-the-art results.

Knowledge Transfer for Out-of-Knowledge-Base Entities: A Graph Neural Network Approach

- Computer Science
- ArXiv
- 2017

This paper addresses the out-of-knowledge-base (OOKB) entity problem in knowledge base completion, using graph neural networks (Graph-NNs) to compute the embeddings of OOKB entities from the limited auxiliary knowledge provided at test time.

Improving Learning and Inference in a Large Knowledge-Base using Latent Syntactic Cues

- Computer Science
- EMNLP
- 2013

For the first time, it is demonstrated that adding edges labeled with latent features mined from a large dependency-parsed corpus of 500 million Web documents can significantly outperform previous PRA-based approaches on the KB inference task.

An overview of embedding models of entities and relationships for knowledge base completion

- Computer Science
- ArXiv
- 2017

This paper serves as a comprehensive overview of embedding models of entities and relationships for knowledge base completion, summarizing up-to-date experimental results on standard benchmark datasets.

Inductive Representation Learning on Large Graphs

- Computer Science, Mathematics
- NIPS
- 2017

GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
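The inductive mechanism behind this summary can be sketched in a few lines: because GraphSAGE learns a shared aggregation function over node features rather than a per-node embedding table, the same layer applies to nodes unseen at training time. The graph, features, and weights below are toy stand-ins, and only the mean aggregator (one of several in the paper) is shown.

```python
import numpy as np

rng = np.random.default_rng(4)
dim = 8

# Toy graph: random node features and an adjacency list; values are illustrative.
features = {n: rng.normal(size=dim) for n in "abcd"}
neighbors = {"a": ["b", "c"], "b": ["a"], "c": ["a", "d"], "d": ["c"]}

W = rng.normal(scale=0.1, size=(dim, 2 * dim))  # shared (untrained) layer weight

def sage_layer(node):
    """One GraphSAGE step: mean-aggregate neighbours, concatenate with the node's
    own features, apply a shared linear transform and a ReLU. Because W is shared
    across nodes, the layer generalizes to nodes it never saw during training."""
    agg = np.mean([features[n] for n in neighbors[node]], axis=0)
    return np.maximum(0.0, W @ np.concatenate([features[node], agg]))

emb = sage_layer("a")
print(emb.shape)
```

Adding a new node only requires its features and its neighbor list; no retraining or embedding-table lookup is needed, which is exactly the property the joint framework in this paper relies on for inductive representation learning.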