End-to-end Structure-Aware Convolutional Networks for Knowledge Base Completion

@article{Shang2019EndtoendSC,
  title={End-to-end Structure-Aware Convolutional Networks for Knowledge Base Completion},
  author={Chao Shang and Yun Tang and Jing Huang and Jinbo Bi and Xiaodong He and Bowen Zhou},
  journal={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2019},
  volume={33},
  pages={3060--3067}
}
  • Published 11 November 2018
  • Computer Science
Knowledge graph embedding has been an active research topic for knowledge base completion, with progressive improvements from the initial TransE, TransH, and DistMult to the current state-of-the-art ConvE. ConvE uses 2D convolution over embeddings and multiple layers of nonlinear features to model knowledge graphs. The proposed end-to-end Structure-Aware Convolutional Network (SACN) combines the benefits of GCN and ConvE. Key Method: SACN consists of an encoder, a weighted graph convolutional network (WGCN), and a decoder, a convolutional network called Conv-TransE. WGCN utilizes knowledge graph node…
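The WGCN encoder described above performs graph convolution in which each relation type carries a learned weight. A minimal pure-Python sketch of one propagation step, under the assumption that the per-relation weights `alpha` are given (the real layer also applies a shared linear transform and a nonlinearity, omitted here; all names are illustrative):

```python
def wgcn_step(node_feats, edges, alpha):
    """One simplified WGCN propagation step: each node keeps its own
    feature vector and adds in each in-neighbour's features scaled by a
    per-relation-type weight alpha[rel]. A real WGCN layer would also
    apply a learned linear map and a nonlinearity."""
    out = {v: list(f) for v, f in node_feats.items()}
    for src, rel, dst in edges:
        out[dst] = [o + alpha[rel] * x
                    for o, x in zip(out[dst], node_feats[src])]
    return out

# toy graph: a --r--> b, relation type r weighted 0.5
feats = {"a": [1.0, 0.0], "b": [0.0, 1.0]}
out = wgcn_step(feats, [("a", "r", "b")], {"r": 0.5})
print(out["b"])  # [0.5, 1.0]
```

The per-relation scalar is what makes the encoder "weighted": it lets the network learn how much each edge type should contribute to a node's representation.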

Citations

Knowledge Graph Completion Based on Graph Representation and Probability Model

TLDR
A novel graph representation model based on SACN, the property graph convolutional network (PGCN), which treats the knowledge graph as a property graph, regarding the initial embedding vectors of entities and relations as properties of the nodes and edges.

End-to-end Relation-Enhanced Learnable Graph Self-attention Network for Knowledge Graphs Embedding

TLDR
The proposed method of end-to-end relation-enhanced learnable graph self-attention network for knowledge graphs embedding facilitates more comprehensive representation of knowledge information than the existing methods, in terms of Hits@10 and MRR.

Knowledge Embedding Based Graph Convolutional Network

TLDR
This paper proposes a novel framework, the Knowledge Embedding based Graph Convolutional Network (KE-GCN), which combines the power of GCNs in graph-based belief propagation with the strengths of advanced knowledge embedding methods, and goes beyond both.

Knowledge Transfer for Out-of-Knowledge-Base Entities: Improving Graph-Neural-Network-Based Embedding Using Convolutional Layers

TLDR
A parameter-efficient embedding model that combines the benefits of a graph neural network (GNN) and a convolutional neural network (CNN) to solve the KBC task with OOKB entities, and has learnable weights that adapt based on information from neighbors.

Graph Attention Networks With Local Structure Awareness for Knowledge Graph Completion

TLDR
This work proposes LSA-GAT, a graph attention network with a novel neighborhood aggregation strategy for knowledge graph completion that can take special local structures into account, and derive a sophisticated representation covering both the semantic and structural information.

A structure distinguishable graph attention network for knowledge base completion

TLDR
The empirical research provides an effective solution to increase the discriminative power of graph attention networks, and the proposed SD-GAT shows significant improvement compared to the state-of-the-art methods on standard FB15K-237 and WN18RR datasets.

Association Rules Enhanced Knowledge Graph Attention Network

ComDensE : Combined Dense Embedding of Relation-aware and Common Features for Knowledge Graph Completion

TLDR
This paper takes a different architectural view and proposes ComDensE, which combines relation-aware and common features using dense neural networks, and conducts an extensive ablation study to examine the effects of the relation-aware layer and the common layer of ComDensE.

Neighborhood aggregation based graph attention networks for open-world knowledge graph reasoning

TLDR
This work presents an attention-based method named NAKGR, which leverages neighborhood information to generate entity and relation representations and performs well on closed-world reasoning tasks.

A Knowledge Graph Embedding Method Based on Neural Network

TLDR
An effective KGE model based on a neural network that converts each triple of the KG into a sentence and can effectively improve the accuracy of link prediction, achieving better results compared with other baseline models.
...

References

SHOWING 1-10 OF 35 REFERENCES

Convolutional 2D Knowledge Graph Embeddings

TLDR
ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
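ConvE's distinctive step is reshaping the head-entity and relation embeddings into 2D grids and stacking them, so that a 2D convolution can mix their features. A hedged sketch of just that input-construction step (grid dimensions and names are illustrative; the convolution and projection layers are omitted):

```python
def conve_input(head, rel, rows, cols):
    """Reshape two length-(rows*cols) embedding vectors into 2D grids and
    stack them vertically, forming the (2*rows) x cols "image" that ConvE
    feeds to its 2D convolution (conv/projection layers omitted)."""
    to_grid = lambda v: [v[r * cols:(r + 1) * cols] for r in range(rows)]
    return to_grid(head) + to_grid(rel)

print(conve_input([1, 2, 3, 4], [5, 6, 7, 8], 2, 2))
# [[1, 2], [3, 4], [5, 6], [7, 8]]
```

Conv-TransE, SACN's decoder, keeps the convolutional idea but drops this reshaping so that the translational property between embeddings is preserved.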

Modeling Relational Data with Graph Convolutional Networks

TLDR
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.

A Novel Embedding Model for Knowledge Base Completion Based on Convolutional Neural Network

TLDR
The model ConvKB advances state-of-the-art models by employing a convolutional neural network, so that it can capture global relationships and transitional characteristics between entities and relations in knowledge bases.

Graph Convolutional Neural Networks for Web-Scale Recommender Systems

TLDR
A novel method based on highly efficient random walks to structure the convolutions and a novel training strategy that relies on harder-and-harder training examples to improve robustness and convergence of the model are developed.

Edge Attention-based Multi-Relational Graph Convolutional Networks

TLDR
The proposed GCN model, which is called edge attention-based multi-relational GCN (EAGCN), jointly learns attention weights and node features in graph convolution, and exploits correspondence between bonds in different molecules.

Knowledge Graph Embedding via Dynamic Mapping Matrix

TLDR
A more fine-grained model named TransD, an improvement of TransR/CTransR, which considers the diversity of not only relations but also entities, allowing it to be applied to large-scale graphs.

Inductive Representation Learning on Large Graphs

TLDR
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
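GraphSAGE's inductive trick is to build a node's representation from its own features plus an aggregate (here the mean) of its neighbours' features, so previously unseen nodes can still be embedded. A minimal sketch of the mean aggregator, with the learned linear map and nonlinearity of the real layer omitted (names are illustrative):

```python
def sage_mean(node, feats, neighbors):
    """GraphSAGE-style mean aggregation: concatenate a node's own feature
    vector with the element-wise mean of its neighbours' features. A real
    layer would follow this with a learned linear map and nonlinearity."""
    nbrs = neighbors[node]
    dim = len(feats[node])
    mean = [sum(feats[n][i] for n in nbrs) / len(nbrs) for i in range(dim)]
    return feats[node] + mean  # list concatenation = vector concatenation here

feats = {"a": [1.0, 2.0], "b": [3.0, 4.0], "c": [5.0, 6.0]}
print(sage_mean("a", feats, {"a": ["b", "c"]}))  # [1.0, 2.0, 4.0, 5.0]
```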

Learning Entity and Relation Embeddings for Knowledge Graph Completion

TLDR
TransR is proposed to build entity and relation embeddings in separate entity space and relation spaces by first projecting entities from entity space to corresponding relation space and then building translations between projected entities.
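TransR's scoring idea can be sketched directly: project head and tail entities into the relation's space with a relation-specific matrix, then measure how well head + relation ≈ tail there. A pure-Python sketch with toy dimensions (in practice the projection matrix and all embeddings are learned):

```python
def matvec(M, v):
    """Matrix-vector product over plain lists."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def transr_distance(h, r, t, M_r):
    """Squared L2 distance || M_r h + r - M_r t ||^2 in relation space;
    TransR scores a triple by the negative of this (smaller = better fit)."""
    h_r, t_r = matvec(M_r, h), matvec(M_r, t)
    return sum((a + b - c) ** 2 for a, b, c in zip(h_r, r, t_r))

I2 = [[1.0, 0.0], [0.0, 1.0]]  # identity projection, just for the demo
print(transr_distance([1.0, 0.0], [1.0, 1.0], [2.0, 1.0], I2))  # 0.0 (perfect fit)
```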

Towards Understanding the Geometry of Knowledge Graph Embeddings

TLDR
This work begins a study to analyze the geometry of KG embeddings and correlate it with task performance and other hyperparameters, and is likely the first study of its kind.

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
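The bilinear objective referred to here is DistMult's scoring function: a three-way inner product of head, relation, and tail embeddings, i.e. a bilinear form whose relation matrix is diagonal. A minimal sketch with toy 3-dimensional vectors (real embeddings are learned and typically a few hundred dimensions):

```python
def distmult_score(h, r, t):
    """DistMult bilinear score: sum_i h_i * r_i * t_i."""
    return sum(hi * ri * ti for hi, ri, ti in zip(h, r, t))

print(distmult_score([1.0, 0.0, 2.0], [0.5, 1.0, 1.0], [2.0, 3.0, 1.0]))  # 3.0
```

The diagonal restriction is what makes relation composition correspond to (diagonal) matrix multiplication, as the summary notes.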