A shallow neural model for relation prediction

@inproceedings{Demir2021ASN,
  title={A shallow neural model for relation prediction},
  author={Caglar Demir and Diego Moussallem and Axel-Cyrille Ngonga Ngomo},
  booktitle={2021 IEEE 15th International Conference on Semantic Computing (ICSC)},
  year={2021},
  pages={179-182}
}
Knowledge graph completion refers to predicting missing triples. Most approaches achieve this goal by predicting entities, given an entity and a relation. We predict missing triples via relation prediction. To this end, we frame relation prediction as a multi-label classification problem and propose a shallow neural model (SHALLOM) that accurately infers missing relations from entities. SHALLOM is analogous to C-BOW, as both approaches predict a central token (p) given…
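The multi-label framing in the abstract can be sketched as follows. This is an illustrative toy sketch, not the paper's exact architecture: the sizes, the random initialisation, and the two-layer ReLU/sigmoid network are all assumptions for demonstration; a real model would train the parameters on known triples with a binary cross-entropy loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical, not from the paper)
num_entities, num_relations, emb_dim, hidden_dim = 5, 3, 4, 8

# Parameters, randomly initialised here; training is omitted
E  = rng.normal(size=(num_entities, emb_dim))      # entity embeddings
W1 = rng.normal(size=(2 * emb_dim, hidden_dim))    # hidden layer weights
W2 = rng.normal(size=(hidden_dim, num_relations))  # output layer weights

def predict_relations(head, tail):
    """Score every relation for the pair (head, tail) as a multi-label problem."""
    x = np.concatenate([E[head], E[tail]])   # concatenate the two entity embeddings
    h = np.maximum(0.0, x @ W1)              # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2)))  # one sigmoid probability per relation

scores = predict_relations(0, 1)
print(scores.shape)  # (3,): a probability for each candidate relation
```

Because each relation gets its own independent sigmoid, several relations can plausibly hold between the same entity pair, which is exactly what the multi-label framing allows.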

Citations of this paper

Out-of-Vocabulary Entities in Link Prediction
TLDR
This work identifies a further common limitation of three benchmarks commonly used for evaluating link prediction approaches: out-of-vocabulary entities in the test and validation sets. It provides an implementation of an approach for detecting and removing such entities, together with corrected versions of the datasets WN18RR, FB15K-237, and YAGO3-10.
DRILL- Deep Reinforcement Learning for Refinement Operators in ALC
TLDR
This work proposes Drill, a novel class expression learning approach that uses a convolutional deep Q-learning model to steer its search; results suggest that Drill converges to goal states at least 2.7× faster than state-of-the-art models on all benchmark datasets.
Convolutional Complex Knowledge Graph Embeddings
TLDR
This paper presents a new approach called ConEx, which infers missing links by composing a 2D convolution with a Hermitian inner product of complex-valued embedding vectors.

References

SHOWING 1-10 OF 33 REFERENCES
Relation prediction in knowledge graph by Multi-Label Deep Neural Network
TLDR
A simple architecture with emphasis on relation prediction using a multi-label deep neural network (DNN), named KGML, is developed; it is more accurate than TransE and TransR and faster than PTransE.
Convolutional 2D Knowledge Graph Embeddings
TLDR
ConvE, a multi-layer convolutional network model for link prediction, is introduced; it achieves state-of-the-art mean reciprocal rank across most datasets.
Predicting relations of embedded RDF entities by Deep Neural Network
TLDR
Experimental results showed that predictions by RDFDNN are more accurate than those by TransE and TransR, and that its accuracy is comparable to that of DKRL, which uses both RDF triples and entity descriptions for learning.
Representation Learning of Knowledge Graphs with Entity Descriptions
TLDR
Experimental results on real-world datasets show that the proposed representation learning method for knowledge graphs outperforms other baselines on the two tasks, especially under the zero-shot setting, indicating that the method can build representations for novel entities from their descriptions.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
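The bilinear score this summary refers to can be sketched in a few lines. The numbers below are illustrative toy values, not learned embeddings; the diagonal relation matrix is the DistMult-style special case of the general bilinear form h^T M_r t.

```python
import numpy as np

def bilinear_score(h, M_r, t):
    """Bilinear triple score h^T M_r t; DistMult restricts M_r to a diagonal matrix."""
    return h @ M_r @ t

# Toy embeddings (illustrative values only)
h   = np.array([1.0, 2.0, 0.5])
t   = np.array([0.5, 1.0, 2.0])
M_r = np.diag([1.0, 0.5, 2.0])  # DistMult-style diagonal relation matrix

print(bilinear_score(h, M_r, t))  # → 3.5

# Composition of relations as matrix multiplication: the matrix for
# "r1 followed by r2" is approximated by the product M_r1 @ M_r2.
M_r1, M_r2 = np.diag([2.0, 1.0, 1.0]), np.diag([1.0, 3.0, 1.0])
M_composed = M_r1 @ M_r2
```

With diagonal matrices the composed relation is again diagonal, which is why chains of relations stay cheap to score under this restriction.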
SSP: Semantic Space Projection for Knowledge Graph Embedding with Text Descriptions
TLDR
A semantic representation method for knowledge graphs (KSR) is proposed, which imposes a two-level hierarchical generative process that globally extracts many aspects and then locally assigns a specific category in each aspect for every triple.
Translating Embeddings for Modeling Multi-relational Data
TLDR
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities; extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
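The translation idea behind TransE amounts to a simple distance score. A minimal sketch, with illustrative hand-picked embeddings rather than learned ones:

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE distance ||h + r - t||: smaller means a more plausible triple."""
    return np.linalg.norm(h + r - t, ord=norm)

# Toy embeddings (illustrative values): t is exactly h + r, so the triple fits.
h = np.array([1.0, 2.0, 3.0])
r = np.array([4.0, 1.0, -1.0])
t = np.array([5.0, 3.0, 2.0])

print(transe_score(h, r, t))            # → 0.0
print(transe_score(h, r, np.zeros(3)))  # a mismatched tail scores worse: 10.0
```

Training pushes observed triples toward small distances and corrupted triples toward larger ones, typically with a margin-based ranking loss.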
Type-Constrained Representation Learning in Knowledge Graphs
TLDR
This work integrates prior knowledge in the form of type constraints into various state-of-the-art latent variable approaches and shows that prior knowledge on relation types improves these models by up to 77% in link-prediction tasks.
You CAN Teach an Old Dog New Tricks! On Training Knowledge Graph Embeddings
TLDR
It is found that, when trained appropriately, the relative performance differences between various model architectures often shrink and sometimes even reverse compared to prior results, and that many of the more advanced architectures and techniques proposed in the literature should be revisited to reassess their individual benefits.
ProjE: Embedding Projection for Knowledge Graph Completion
TLDR
This work presents a shared-variable neural network model called ProjE that fills in missing information in a knowledge graph by learning joint embeddings of the knowledge graph's entities and edges, together with subtle but important changes to the standard loss function.