Improving Inductive Link Prediction Using Hyper-Relational Facts

@inproceedings{Ali2021ImprovingIL,
  title={Improving Inductive Link Prediction Using Hyper-Relational Facts},
  author={Mehdi Ali and Max Berrendorf and Mikhail Galkin and Veronika Thost and Tengfei Ma and Volker Tresp and Jens Lehmann},
  booktitle={The Semantic Web -- ISWC 2021},
  year={2021}
}
For many years, link prediction on knowledge graphs has been a purely transductive task, not allowing for reasoning on unseen entities. Recently, increasing efforts have been put into exploring semi- and fully inductive scenarios, enabling inference over unseen and emerging entities. Still, all these approaches only consider triple-based KGs, whereas their richer counterparts, hyper-relational KGs (e.g., Wikidata), have not yet been properly studied. In this work, we classify different inductive…

A Few-Shot Inductive Link Prediction Model in Knowledge Graphs

TLDR
A new inductive link prediction model, MILP, is proposed, which uses meta-learning to predict unseen entities from few-shot data and achieves better results than the baseline models, demonstrating the effectiveness of MILP.

Facing Changes: Continual Entity Alignment for Growing Knowledge Graphs

TLDR
A continual alignment method that reconstructs an entity’s representation based on entity adjacency, enabling it to generate embeddings for new entities quickly and inductively using their existing neighbors, and is more effective than baselines based on retraining or inductive learning.

An Open Challenge for Inductive Link Prediction on Knowledge Graphs

TLDR
This work introduces ILPC 2022, a novel open challenge on KG inductive link prediction, and constructed two new datasets based on Wikidata with various sizes of training and inference graphs that are much larger than existing inductive benchmarks.

Low-resource Learning with Knowledge Graphs: A Comprehensive Survey

TLDR
This survey comprehensively reviews over 90 papers on KG-aware research for two major low-resource learning settings: zero-shot learning (ZSL), where new classes for prediction have never appeared in training, and few-shot learning (FSL), where new classes for prediction have only a small number of labeled samples available.

References

SHOWING 1-10 OF 35 REFERENCES

Out-of-Sample Representation Learning for Knowledge Graphs

TLDR
This paper studies the out-of-sample representation learning problem for non-attributed knowledge graphs, creates benchmark datasets for this task, develops several models and baselines, and provides empirical analyses and comparisons of the proposed models and baselines.

Inductive Entity Representations from Text via Link Prediction

TLDR
This work proposes a holistic evaluation protocol for entity representations learned via a link prediction objective, and evaluates an architecture based on a pretrained language model that exhibits strong generalization to entities not observed during training and outperforms related state-of-the-art methods.

Inductive Learning on Commonsense Knowledge Graph Completion

TLDR
A novel learning framework named InductivE is developed that densifies CKGs by adding edges among semantically related entities, providing more supportive information for unseen entities and leading to better generalization of entity embeddings for unseen entities.

MLMLM: Link Prediction with Mean Likelihood Masked Language Model

TLDR
This work introduces MLMLM, Mean Likelihood Masked Language Model, an approach comparing the mean likelihood of generating the different entities to perform link prediction in a tractable manner, and obtains convincing results on link prediction on previously unseen entities.

Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph Link Prediction

TLDR
This work introduces a realistic problem of few-shot out-of-graph link prediction, which considers not only links between seen and unseen nodes, as in conventional out-of-knowledge link prediction, but also links between unseen nodes, with only a few edges per node.

Inductive Relation Prediction by Subgraph Reasoning

TLDR
A graph neural network based relation prediction framework, GraIL, that reasons over local subgraph structures and has a strong inductive bias to learn entity-independent relational semantics is proposed.

Composition-based Multi-Relational Graph Convolutional Networks

TLDR
This paper proposes CompGCN, a novel Graph Convolutional framework which jointly embeds both nodes and relations in a relational graph and leverages a variety of entity-relation composition operations from Knowledge Graph Embedding techniques and scales with the number of relations.

A Three-Way Model for Collective Learning on Multi-Relational Data

TLDR
This work presents a novel approach to relational learning based on the factorization of a three-way tensor that is able to perform collective learning via the latent components of the model, and provides an efficient algorithm to compute the factorization.

pages 798–808. ACM / IW3C2, 2021.

In The Semantic Web – ISWC 2021, pages 74–92, Cham, 2021.