Learning Attention-based Embeddings for Relation Prediction in Knowledge Graphs

Deepak Nathani, Jatin Chauhan, Charu Sharma and Manohar Kaul
The recent proliferation of knowledge graphs (KGs), coupled with incomplete or partial information in the form of missing relations (links) between entities, has fueled a lot of research on knowledge base completion (also known as relation prediction). The model additionally encapsulates relation clusters and multi-hop relations. The empirical study offers insights into the efficacy of the attention-based model and shows marked performance gains in comparison to state-of-the-art methods.
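
The triple-level attention idea can be sketched as follows. This is a minimal numpy illustration of attending over a node's neighboring (relation, entity) pairs, not the authors' exact formulation; the projection matrix `W`, scoring vector `a`, and all shapes are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend_neighbors(h, neighbors, W, a, slope=0.2):
    """Aggregate an entity's neighboring triples with learned attention.
    neighbors: list of (relation_emb, tail_emb) pairs; W: (d, 3d); a: (d,).
    All names and shapes are hypothetical, for illustration only."""
    # project each concatenated triple [h; r; t] to a feature vector
    feats = np.stack([W @ np.concatenate([h, r, t]) for r, t in neighbors])
    # LeakyReLU scoring followed by softmax over the neighborhood
    scores = feats @ a
    scores = np.where(scores > 0, scores, slope * scores)
    att = softmax(scores)
    # attention-weighted sum yields the updated entity representation
    return att @ feats
```

The softmax makes the neighbor weights sum to one, so unreliable neighbors can be down-weighted rather than averaged in uniformly.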

Knowledge Graph Embedding via Graph Attenuated Attention Networks

Graph Attenuated Attention networks (GAATs), a novel representation method, are proposed; they integrate an attenuated attention mechanism to assign different weights to different relation paths and acquire information from the neighborhoods, so that entities and relations can be learned from any neighbor.

Knowledge Embedding Based Graph Convolutional Network

This paper proposes a novel framework, the Knowledge Embedding based Graph Convolutional Network (KE-GCN), which combines the power of GCNs in graph-based belief propagation with the strengths of advanced knowledge embedding methods.

Relational Graph Neural Network with Hierarchical Attention for Knowledge Graph Completion

This work proposes a Relational Graph neural network with Hierarchical ATtention (RGHAT) for the KGC task and extensively validates its superiority against various state-of-the-art baselines.

Hierarchical Neighbor Propagation With Bidirectional Graph Attention Network for Relation Prediction

In this study, a novel bidirectional graph attention network (BiGAT) is proposed to learn hierarchical neighbor propagation; it achieves competitive results in comparison to other state-of-the-art methods.

A structure distinguishable graph attention network for knowledge base completion

The empirical research provides an effective solution to increase the discriminative power of graph attention networks, and the proposed SD-GAT shows significant improvement compared to the state-of-the-art methods on standard FB15K-237 and WN18RR datasets.

Meta-Learning Based Hyper-Relation Feature Modeling for Out-of-Knowledge-Base Embedding

A two-stage learning model, referred to as the Hyper-Relation Feature Learning Network (HRFN), is proposed for effective out-of-knowledge-base embedding of knowledge graph entities, using hyper-relation features meta-learned from the training set.

KRACL: Contrastive Learning with Graph Context Modeling for Sparse Knowledge Graph Completion

This work proposes a novel framework KRACL to alleviate the widespread sparsity in KGs with graph context and contrastive learning, and proposes the Knowledge Relational Attention Network (KRAT) to leverage the graph context by simultaneously projecting neighboring triples to different latent spaces and jointly aggregating messages with the attention mechanism.

A Birds Eye View on Knowledge Graph Embeddings, Software Libraries, Applications and Challenges

Existing KGC approaches are discussed, including the state-of-the-art Knowledge Graph Embeddings (KGE), not only on static graphs but also for the latest trends such as multimodal, temporal, and uncertain knowledge graphs.

Convolutional 2D Knowledge Graph Embeddings

ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
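
ConvE's core idea can be sketched in a few lines: the head and relation embeddings are reshaped and stacked into a 2D "image", convolved, and the flattened feature map is projected back to embedding space and matched against the tail. This is a simplified single-filter sketch, not the full model (no batch norm, dropout, or 1-N scoring); the dimensions (d=8 reshaped to 2x4) and weight shapes are illustrative assumptions.

```python
import numpy as np

def conve_score(h, r, t, conv_filter, W):
    """Simplified ConvE-style score. h, r, t: (8,); conv_filter: (3, 3);
    W: (8, 4). Shapes are illustrative, not the paper's configuration."""
    # stack reshaped head and relation embeddings into a 4x4 "image"
    img = np.concatenate([h.reshape(2, 4), r.reshape(2, 4)], axis=0)
    kh, kw = conv_filter.shape
    # valid 2D convolution with a single filter
    out = np.array([[np.sum(img[i:i + kh, j:j + kw] * conv_filter)
                     for j in range(img.shape[1] - kw + 1)]
                    for i in range(img.shape[0] - kh + 1)])
    feat = np.maximum(out.ravel(), 0.0)          # ReLU feature map
    return float(np.maximum(W @ feat, 0.0) @ t)  # project and match tail
```

Reshaping into 2D lets the convolution capture interactions between head and relation dimensions that a 1D concatenation would keep apart.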

Modeling Relational Data with Graph Convolutional Networks

It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
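R-GCN's relation-specific message passing can be sketched for a single node update: each relation type has its own weight matrix, messages from same-relation neighbors are normalized, and a self-loop term is added. This is a minimal sketch of the layer's structure (mean normalization, no basis decomposition); the dictionary-based data layout is an assumption for illustration.

```python
import numpy as np

def rgcn_update(h_self, neighbors_by_rel, W_rel, W_self):
    """One simplified R-GCN node update.
    neighbors_by_rel: {rel_id: [neighbor embeddings]} (assumed layout);
    W_rel: {rel_id: (d, d) matrix}; W_self: (d, d) self-loop matrix."""
    msg = W_self @ h_self  # self-loop term
    for rel, neigh in neighbors_by_rel.items():
        # relation-specific transform, normalized over same-relation neighbors
        msg = msg + sum(W_rel[rel] @ n for n in neigh) / len(neigh)
    return np.maximum(msg, 0.0)  # ReLU activation
```

Stacking such layers is what lets the encoder accumulate evidence over multiple inference steps before a decoder like DistMult scores the triple.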

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
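The bilinear objective with a diagonal relation matrix (DistMult) reduces to a three-way elementwise product, which can be written directly:

```python
import numpy as np

def distmult_score(h, r, t):
    """DistMult bilinear score with a diagonal relation matrix:
    f(h, r, t) = sum_i h_i * r_i * t_i."""
    return float(np.sum(h * r * t))
```

Note that the diagonal form is inherently symmetric in head and tail, one reason later models (e.g. ComplEx) moved to complex-valued embeddings.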

Modeling Relation Paths for Representation Learning of Knowledge Bases

This model considers relation paths as translations between entities for representation learning and addresses two key challenges: (1) since not all relation paths are reliable, it designs a path-constraint resource allocation algorithm to measure the reliability of relation paths, and (2) it represents relation paths via semantic composition of relation embeddings.
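The "paths as translations" intuition is simple to write down: in a translation-based model, a reliable two-hop path r1 then r2 should behave like a single translation, h + (r1 + r2) ≈ t. A minimal sketch of that error, using additive composition of the relation embeddings:

```python
import numpy as np

def path_translation_error(h, path_rels, t):
    """Translation error of a relation path under additive composition:
    || h + sum(path_rels) - t ||. Lower error suggests a more reliable path."""
    return float(np.linalg.norm(h + np.sum(path_rels, axis=0) - t))
```

The paper also considers other composition operators; addition is shown here as the simplest case.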

Complex Embeddings for Simple Link Prediction

This work makes use of complex-valued embeddings to solve the link prediction problem through latent factorization, using the Hermitian dot product, the complex counterpart of the standard dot product between real vectors.
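The ComplEx scoring function is a one-liner with complex arithmetic: the real part of the trilinear product with the conjugated tail embedding.

```python
import numpy as np

def complex_score(h, r, t):
    """ComplEx score: Re(<h, r, conj(t)>) over complex-valued embeddings."""
    return float(np.real(np.sum(h * r * np.conj(t))))
```

Because of the conjugate on the tail, the score is asymmetric whenever the relation embedding has a nonzero imaginary part, which is what lets ComplEx model antisymmetric relations that DistMult cannot.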

Holographic Embeddings of Knowledge Graphs

Holographic embeddings (HolE) are proposed to learn compositional vector space representations of entire knowledge graphs; HolE outperforms state-of-the-art methods for link prediction on knowledge graph and relational learning benchmark datasets.
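HolE's compositional operator is circular correlation, which can be computed efficiently in the Fourier domain; the score is then the dot product of the relation embedding with the correlated pair.

```python
import numpy as np

def hole_score(h, r, t):
    """HolE score: r . (h * t) where * is circular correlation,
    computed via the FFT identity corr(h, t) = ifft(conj(fft(h)) * fft(t))."""
    corr = np.fft.ifft(np.conj(np.fft.fft(h)) * np.fft.fft(t)).real
    return float(r @ corr)
```

Circular correlation compresses the full d x d outer product of head and tail into a d-dimensional vector, keeping the interaction expressive while the score stays linear in the embedding dimension (up to the FFT's log factor).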

A Novel Embedding Model for Knowledge Base Completion Based on Convolutional Neural Network

The model ConvKB advances state-of-the-art models by employing a convolutional neural network, so that it can capture global relationships and transitional characteristics between entities and relations in knowledge bases.
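ConvKB's convolution operates over the triple itself rather than a 2D reshaping: the three embeddings are stacked as columns of a d x 3 matrix and convolved row-wise with 1x3 filters. A minimal sketch, with filter count and weight shapes as illustrative assumptions:

```python
import numpy as np

def convkb_score(h, r, t, filters, w):
    """ConvKB-style score: stack [h, r, t] as a d x 3 matrix, apply each
    1x3 filter row-wise, ReLU, concatenate the maps, and dot with w.
    filters: list of (3,) vectors; w: (d * len(filters),)."""
    A = np.stack([h, r, t], axis=1)                    # d x 3
    maps = [np.maximum(A @ f, 0.0) for f in filters]   # each map: (d,)
    return float(np.concatenate(maps) @ w)
```

Each filter sees one embedding dimension across all three triple positions at once, which is how the model captures the "transitional characteristics" between entities and relations.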

DeepPath: A Reinforcement Learning Method for Knowledge Graph Reasoning

A novel reinforcement learning framework for learning multi-hop relational paths is described: it uses a policy-based agent with continuous states based on knowledge graph embeddings, reasoning in the KG vector space by sampling the most promising relation to extend its path.

Reasoning With Neural Tensor Networks for Knowledge Base Completion

An expressive neural tensor network suitable for reasoning over relationships between two entities, given a subset of the knowledge base, is introduced; performance improves further when entities are represented as an average of their constituent word vectors.
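The NTN scoring function combines a bilinear tensor term with a standard linear layer: g(e1, R, e2) = u^T tanh(e1^T W_R e2 + V_R [e1; e2] + b_R), where W_R is a d x d x k tensor. A minimal numpy sketch:

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Neural Tensor Network score. e1, e2: (d,); W: (d, d, k) relation
    tensor; V: (k, 2d); b: (k,); u: (k,)."""
    # k bilinear terms: each slice W[:, :, i] gives e1^T W[:, :, i] e2
    bilinear = np.einsum('i,ijk,j->k', e1, W, e2)
    hidden = np.tanh(bilinear + V @ np.concatenate([e1, e2]) + b)
    return float(u @ hidden)
```

The tensor slices let each hidden unit model a different multiplicative interaction between the two entity vectors, which is what makes the model more expressive than a plain single-layer network on the concatenation.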

Question Answering over Knowledge Base with Neural Attention Combining Global Knowledge Information

A neural attention-based model is presented that represents questions dynamically according to the different focuses of various candidate answer aspects; it also leverages the global knowledge inside the underlying KB, aiming to integrate rich KB information into the answer representations.