Corpus ID: 202775885

DRUM: End-To-End Differentiable Rule Mining On Knowledge Graphs

@inproceedings{Sadeghian2019DRUMED,
  title={DRUM: End-To-End Differentiable Rule Mining On Knowledge Graphs},
  author={Ali Sadeghian and Mohammadreza Armandpour and Patrick Ding and Daisy Zhe Wang},
  booktitle={NeurIPS},
  year={2019}
}
In this paper, we study the problem of learning probabilistic logical rules for inductive and interpretable link prediction. Despite the importance of inductive link prediction, most previous works focused on transductive link prediction and cannot handle previously unseen entities. Moreover, they are black-box models that are not easily explainable to humans. We propose DRUM, a scalable and differentiable approach for mining first-order logical rules from knowledge graphs that resolves these…
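The abstract describes scoring chain-like first-order rules end-to-end. A common formulation in this line of work (e.g. Neural LP, which DRUM builds on) scores a rule body as a product of relation adjacency matrices, with learned soft weights mixing relations at each hop. Below is a minimal NumPy sketch of that idea, not DRUM's actual architecture; the toy graph, relation names, and weights are all hypothetical.

```python
import numpy as np

# Toy knowledge graph: 4 entities, 2 relations, each relation stored as an
# adjacency matrix A_r with A_r[i, j] = 1 iff (i, r, j) is a known fact.
n = 4
A = {
    "parent_of": np.array([[0, 1, 0, 0],
                           [0, 0, 1, 0],
                           [0, 0, 0, 0],
                           [0, 0, 0, 0]], dtype=float),
    "spouse_of": np.zeros((n, n)),
}

def rule_scores(w_steps):
    """Score a chain rule body r1(x, z) & r2(z, y) & ... for all entity pairs.

    w_steps: one dict per hop, mapping relation name -> soft weight
    (in DRUM-style models these weights are learned by gradient descent).
    """
    M = np.eye(n)
    for w in w_steps:
        # Soft mixture over relations at this hop, then chain via matmul.
        M = M @ sum(w[r] * A[r] for r in A)
    return M  # M[x, y] = soft support for the rule head holding between x and y

# Hypothetical learned weights for a two-hop rule, e.g. grandparent_of(x, y).
w_steps = [{"parent_of": 0.9, "spouse_of": 0.1},
           {"parent_of": 0.8, "spouse_of": 0.2}]
scores = rule_scores(w_steps)
# scores[0, 2] accumulates 0.9 * 0.8 = 0.72 via the path 0 -> 1 -> 2
```

Because the whole computation is a chain of differentiable matrix products, the weights can be trained by backpropagation, and high-weight relation sequences can afterwards be read off as mined rules.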

Citations

Inductive Relation Prediction by Subgraph Reasoning
TLDR
A graph neural network based relation prediction framework, GraIL, that reasons over local subgraph structures and has a strong inductive bias to learn entity-independent relational semantics is proposed.
SAFRAN: An interpretable, rule-based link prediction method outperforming embedding models
TLDR
The SAFRAN rule application framework is introduced, which uses a novel aggregation approach called Non-redundant Noisy-OR that detects and clusters redundant rules prior to aggregation, yielding new state-of-the-art results for fully interpretable link prediction on the established general-purpose benchmarks FB15k-237, WN18RR, and YAGO3-10.
Evaluating Logical Generalization in Graph Neural Networks
TLDR
The ability of models to generalize and adapt is strongly determined by the diversity of the logical rules they encounter during training, and the results highlight new challenges for the design of GNN models.
Subgraph-aware Few-Shot Inductive Link Prediction via Meta-Learning
  • Shuangjia Zheng, Sijie Mai, Ya Sun, Haifeng Hu, Yuedong Yang
  • Computer Science
  • ArXiv
  • 2021
TLDR
Meta-iKG is proposed, a novel subgraph-based meta-learner for few-shot inductive relation reasoning that utilizes local subgraphs to transfer subgraph-specific information, learns transferable patterns faster via meta gradients, and is evaluated on inductive benchmarks sampled from NELL and Freebase.
Retrofitting Soft Rules for Knowledge Representation Learning
TLDR
A retrofit framework that iteratively enhances the knowledge representation and confidence of soft rules and achieves state-of-the-art results on link prediction and triple classification tasks, brought by the fine-tuned soft rules.
A Topological View of Rule Learning in Knowledge Graphs
  • Zuoyu Yan, Tengfei Ma, Liangcai Gao, Zhi Tang, Chao Chen
  • Computer Science
  • ArXiv
  • 2021
TLDR
A novel GNN framework is built on the collected cycles to learn cycle representations and to predict the existence or non-existence of a relation, achieving state-of-the-art performance on benchmarks.
Learning First-Order Rules with Relational Path Contrast for Inductive Relation Reasoning
  • Yudai Pan, Jun Liu, Lingling Zhang, Xin Hu, Tianzhe Zhao, Qika Lin
  • Computer Science
  • ArXiv
  • 2021
TLDR
A novel graph convolutional network (GCN)-based approach for interpretable inductive reasoning with relational path contrast, named RPC-IR, which achieves outstanding performance compared with the latest inductive reasoning methods and can explicitly represent logical rules for interpretability.
Topology-Aware Correlations Between Relations for Inductive Link Prediction in Knowledge Graphs
TLDR
TACT is inspired by the observation that the semantic correlation between two relations is highly correlated with their topological structure in knowledge graphs; it categorizes all relation pairs into several topological patterns and proposes a Relational Correlation Network (RCN) to learn the importance of the different patterns for inductive link prediction.
Combining Rules and Embeddings via Neuro-Symbolic AI for Knowledge Base Completion
TLDR
It is shown that not all rule-based KBC models are the same, and two distinct approaches are proposed: one learns a mixture of relations, the other a mixture of paths.
Embedding Knowledge Graphs Attentive to Positional and Centrality Qualities
TLDR
This work proposes a novel KGE method named Graph Feature Attentive Neural Network (GFA-NN) that computes graph features of entities and achieves on-par or better results than state-of-the-art KGE solutions.

References

Showing 1–10 of 49 references
Modeling Relational Data with Graph Convolutional Networks
TLDR
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
Complex Embeddings for Simple Link Prediction
TLDR
This work makes use of complex valued embeddings to solve the link prediction problem through latent factorization, and uses the Hermitian dot product, the complex counterpart of the standard dot product between real vectors.
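The Hermitian trilinear product mentioned here is compact enough to write out directly. A minimal sketch with random toy embeddings (the dimension and values are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension, arbitrary for this toy example

# Complex-valued embeddings for a head entity, relation, and tail entity.
h = rng.normal(size=d) + 1j * rng.normal(size=d)
r = rng.normal(size=d) + 1j * rng.normal(size=d)
t = rng.normal(size=d) + 1j * rng.normal(size=d)

def complex_score(h, r, t):
    # ComplEx score: Re(sum_i r_i * h_i * conj(t_i)), the real part of the
    # trilinear Hermitian product of relation, head, and conjugated tail.
    return float(np.real(np.sum(r * h * np.conj(t))))

score = complex_score(h, r, t)
# Swapping head and tail changes the score when r has a nonzero imaginary
# part, which is what lets ComplEx model antisymmetric relations; with a
# purely real r the score is symmetric in h and t.
```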
Temporal Reasoning Over Event Knowledge Graphs
Many advances in computer science, such as semantic search, recommendation systems, question answering, and natural language processing, are carried out with the help of large-scale knowledge…
Scalable Rule Learning via Learning Representation
TLDR
This paper presents RLvLR, a new approach to learning rules from KGs that combines embedding-based representation learning with a new sampling method, and shows that the system outperforms several state-of-the-art systems.
Relational Representation Learning for Dynamic (Knowledge) Graphs: A Survey
TLDR
This survey describes existing models for representation learning on dynamic graphs from an encoder-decoder perspective, categorizes these encoders and decoders based on the techniques they employ, and analyzes the approaches in each category.
Traversing Knowledge Graphs in Vector Space
TLDR
It is demonstrated that compositional training acts as a novel form of structural regularization, reliably improving performance across all base models (reducing errors by up to 43%) and achieving new state-of-the-art results.
Reasoning With Neural Tensor Networks for Knowledge Base Completion
TLDR
An expressive neural tensor network suitable for reasoning over relationships between two entities given a subset of the knowledge base is introduced, and performance can be improved when entities are represented as an average of their constituent word vectors.
Structure Learning via Parameter Learning
TLDR
This paper presents a novel structure-learning method for a new, scalable probabilistic logic called ProPPR that builds on the recent success of meta-interpretive learning methods in Inductive Logic Programming and extends it to a framework that enables robust and efficient structure learning of logic programs on graphs.
Translating Embeddings for Modeling Multi-relational Data
TLDR
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities, which proves to be powerful since extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
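The translation idea can be shown in a few lines: TransE scores a triple (h, r, t) by how close t is to h + r in embedding space, with a lower distance meaning a more plausible fact. A toy sketch with hand-picked 2-D embeddings; the entity names and vectors are made up for illustration:

```python
import numpy as np

# Hypothetical 2-D entity embeddings.
emb = {
    "Paris":  np.array([1.0, 0.0]),
    "France": np.array([1.0, 1.0]),
    "Tokyo":  np.array([4.0, 0.0]),
}
# Hypothetical relation embedding: a translation vector.
rel = {"capital_of": np.array([0.0, 1.0])}

def transe_distance(h, r, t):
    # TransE energy: ||h + r - t||; lower means the triple is more plausible.
    return float(np.linalg.norm(emb[h] + rel[r] - emb[t]))

good = transe_distance("Paris", "capital_of", "France")  # 0.0: triple fits
bad = transe_distance("Tokyo", "capital_of", "France")   # 3.0: implausible
```

In training, these embeddings are learned so that observed triples get small distances and corrupted (negative) triples get large ones, via a margin-based ranking loss.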
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
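The bilinear objective referred to here (DistMult) restricts the relation matrix to a diagonal, so the score reduces to an elementwise triple product. A minimal sketch with hypothetical embedding values:

```python
import numpy as np

def distmult_score(h, r, t):
    # DistMult bilinear score with diagonal relation matrix:
    # score(h, r, t) = h^T diag(r) t = sum_i h_i * r_i * t_i
    return float(np.sum(h * r * t))

# Hypothetical 3-D embeddings for one triple.
h = np.array([1.0, 2.0, 0.5])
r = np.array([0.5, 1.0, 2.0])
t = np.array([2.0, 0.5, 1.0])

s = distmult_score(h, r, t)
# The elementwise product makes the score symmetric in h and t, so DistMult
# cannot model antisymmetric relations -- a limitation ComplEx addresses
# by moving to complex-valued embeddings.
```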