Corpus ID: 4328400

Convolutional 2D Knowledge Graph Embeddings

@inproceedings{Dettmers2018Convolutional2K,
  title={Convolutional 2D Knowledge Graph Embeddings},
  author={Tim Dettmers and Pasquale Minervini and Pontus Stenetorp and Sebastian Riedel},
  booktitle={AAAI},
  year={2018}
}
Link prediction for knowledge graphs is the task of predicting missing relationships between entities. Previous work on link prediction has focused on shallow, fast models which can scale to large knowledge graphs. However, these models learn less expressive features than deep, multi-layer models -- which potentially limits performance. In this work, we introduce ConvE, a multi-layer convolutional network model for link prediction, and report state-of-the-art results for several established…
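The 2D-convolution scoring idea the abstract describes can be sketched in a few lines. This is a minimal NumPy sketch under my own assumptions (a single 3x3 filter, 4x4 reshapes, illustrative variable names), not the authors' implementation:

```python
import numpy as np

H, W = 4, 4            # each embedding of dim d is reshaped to an H x W grid
K = 3                  # convolution filter size (one filter here, for brevity)
d = H * W

def conv2d_valid(x, k):
    """Naive 'valid'-mode 2D cross-correlation."""
    h, w = x.shape
    kh, kw = k.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def conve_score(e_s, r_r, e_o, filt, W_proj):
    """ConvE-style score: convolve over stacked 2D reshapes of the
    subject and relation embeddings, project the feature map back to
    d dimensions, then take a dot product with the object embedding."""
    stacked = np.concatenate([e_s.reshape(H, W), r_r.reshape(H, W)], axis=0)  # (2H, W)
    feat = np.maximum(conv2d_valid(stacked, filt), 0.0)   # ReLU'd feature map
    hidden = np.maximum(feat.ravel() @ W_proj, 0.0)       # project to d dims
    return float(hidden @ e_o)

# Usage: score one random (untrained, purely illustrative) triple.
rng = np.random.default_rng(0)
e_s, r_r, e_o = rng.normal(size=(3, d))
filt = rng.normal(size=(K, K))
W_proj = rng.normal(size=((2 * H - K + 1) * (W - K + 1), d))
score = conve_score(e_s, r_r, e_o, filt, W_proj)
```

Reshaping to 2D before convolving is what lets the filters capture interactions between subject and relation dimensions that a 1D convolution over concatenated vectors would miss.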
Citations

Learning Attention-based Embeddings for Relation Prediction in Knowledge Graphs
This paper proposes a novel attention-based feature embedding that captures both entity and relation features in any given entity's neighborhood, and encapsulates relation clusters and multi-hop relations in the model.
Decompressing Knowledge Graph Representations for Link Prediction
This paper proposes DeCom, a simple but effective mechanism to boost the performance of existing link predictors such as DistMult and ComplEx by extracting more expressive features while preventing overfitting, adding just a few extra parameters.
Convolutional Hypercomplex Embeddings for Link Prediction
A composition of convolution operations with hypercomplex multiplications for link prediction, using quaternion and octonion extensions of previous state-of-the-art approaches, including DistMult and ComplEx.
Hypernetwork Knowledge Graph Embeddings
It is demonstrated that convolution simply offers a convenient computational means of introducing sparsity and parameter tying to find an effective trade-off between non-linear expressiveness and the number of parameters to learn.
CNN-based Dual-Chain Models for Knowledge Graph Learning
A new convolutional neural network (CNN)-based dual-chain model, which incorporates descriptions of entities and learns a second set of entity embeddings from the descriptions, enabling it to effectively handle zero-shot problems.
TransGCN: Coupling Transformation Assumptions with Graph Convolutional Networks for Link Prediction
This work proposes a unified GCN framework, named TransGCN, to address link prediction, and represents heterogeneous neighborhoods by imposing transformation assumptions on the relationship between the subject, the relation, and the object of a triple.
Modeling Relational Data with Graph Convolutional Networks
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
Attention Relational Graph Convolution Networks for Relation Prediction in Knowledge Graphs
An attention-weighted relational graph convolutional network (denoted AWR-GCN) is proposed and used as the encoder of an encoder-decoder model for relation prediction; the decoder is a linear model.
Neighborhood Aggregation Embedding Model for Link Prediction in Knowledge Graphs
NAE (neighborhood aggregation embedding model) is proposed, a novel approach for link prediction that outperforms several state-of-the-art methods on benchmark datasets; a highly parameter-efficient variant, NAE-S, obtained by simplifying the predictor, achieves competitive performance with fewer parameters.
NASE: Learning Knowledge Graph Embedding for Link Prediction via Neural Architecture Search
This paper proposes a novel Neural Architecture Search (NAS) framework for the link prediction task, which enables it to take advantage of several mainstream model families and thus can potentially achieve better performance.

References

Showing 1-10 of 47 references
Modeling Relational Data with Graph Convolutional Networks
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
Type-Constrained Representation Learning in Knowledge Graphs
This work integrates prior knowledge in the form of type constraints into various state-of-the-art latent variable approaches, and shows that prior knowledge on relation types improves these models by up to 77% in link-prediction tasks.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
Translating Embeddings for Modeling Multi-relational Data
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities; extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
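The translation idea behind TransE fits in one line. A minimal sketch, not the authors' code (`transe_score` is a hypothetical name; the L1 norm is one of the distances used in practice):

```python
import numpy as np

def transe_score(h, r, t, p=1):
    """TransE plausibility: the negative L_p distance ||h + r - t||_p.
    A triple is plausible when the relation vector r translates the
    head embedding h (approximately) onto the tail embedding t, so
    higher (less negative) scores mean more plausible triples."""
    return -np.linalg.norm(h + r - t, ord=p)

# A triple that exactly satisfies h + r = t attains the maximum score, 0;
# corrupted triples land further away and score strictly lower.
h = np.array([1.0, 0.0])
r = np.array([0.0, 1.0])
good = transe_score(h, r, h + r)
bad = transe_score(h, r, np.array([5.0, 5.0]))
```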
Complex Embeddings for Simple Link Prediction
This work makes use of complex-valued embeddings to solve the link prediction problem through latent factorization, using the Hermitian dot product, the complex counterpart of the standard dot product between real vectors.
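The Hermitian product mentioned in the summary can be sketched directly. A minimal NumPy sketch of the ComplEx-style trilinear score (my own function name and toy vectors, not the authors' code):

```python
import numpy as np

def complex_score(e_s, r, e_o):
    """ComplEx-style score: Re(<r, e_s, conj(e_o)>), the real part of the
    trilinear product with the object embedding conjugated. Because of the
    conjugate, swapping subject and object can change the score, which is
    what lets complex embeddings represent asymmetric relations."""
    return float(np.real(np.sum(r * e_s * np.conj(e_o))))

# Toy 1-dimensional complex embeddings illustrating the asymmetry:
e_a = np.array([1 + 1j])
rel = np.array([0 + 1j])
e_b = np.array([1 + 0j])
forward = complex_score(e_a, rel, e_b)   # score of (a, rel, b)
backward = complex_score(e_b, rel, e_a)  # score of (b, rel, a)
```

With a purely real-valued model such as DistMult the two directions would score identically; here they differ.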
A Review of Relational Machine Learning for Knowledge Graphs
This paper provides a review of how statistical models can be “trained” on large knowledge graphs, and then used to predict new facts about the world (which is equivalent to predicting new edges in the graph), and how such statistical models of graphs can be combined with text-based information extraction methods for automatically constructing knowledge graphs from the Web.
Holographic Embeddings of Knowledge Graphs
Holographic embeddings are proposed to learn compositional vector space representations of entire knowledge graphs, outperforming state-of-the-art methods for link prediction on knowledge graph and relational learning benchmark datasets.
Analogical Inference for Multi-relational Embeddings
This paper proposes a novel framework for optimizing the latent representations with respect to the analogical properties of the embedded entities and relations, and offers an elegant unification of several well-known methods in multi-relational embedding.
STransE: a novel embedding model of entities and relationships in knowledge bases
STransE is a simple combination of the SE and TransE models, but it obtains better link prediction performance on two benchmark datasets than previous embedding models, and can serve as a new baseline for the more complex models in the link prediction task.