Corpus ID: 245334481

DegreEmbed: incorporating entity embedding into logic rule learning for knowledge graph reasoning

@article{Wei2021DegreEmbedIE,
  title={DegreEmbed: incorporating entity embedding into logic rule learning for knowledge graph reasoning},
  author={Yuliang Wei and Haotian Li and Yao Wang and Guodong Xin and Hongri Liu},
  journal={ArXiv},
  year={2021},
  volume={abs/2112.09933}
}
Knowledge graphs (KGs), as structured representations of real-world facts, are intelligent databases incorporating human knowledge that can help machines imitate human problem solving. However, KGs are usually huge and inevitably contain missing facts, undermining applications such as question answering and recommender systems that rely on knowledge graph reasoning. Link prediction for knowledge graphs is the task of completing missing facts by reasoning… 
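As an illustration of the link prediction task described in the abstract, the sketch below completes a query (head, relation, ?) by ranking candidate tail entities. The toy entities, relations, and the TransE-style scoring function are all illustrative assumptions, not the paper's model:

```python
# Minimal sketch of KG link prediction: given a query (head, relation, ?),
# rank all entities as candidate tails by a score. Entity/relation names
# and the scoring function here are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
entities = ["alice", "bob", "paris", "france"]
relations = ["born_in", "capital_of"]

dim = 8
ent_emb = {e: rng.normal(size=dim) for e in entities}
rel_emb = {r: rng.normal(size=dim) for r in relations}

def score(h, r, t):
    # TransE-style score: -||h + r - t||, higher is better for a true triple.
    return -np.linalg.norm(ent_emb[h] + rel_emb[r] - ent_emb[t])

def predict_tail(h, r):
    # Complete (h, r, ?) by ranking all entities as candidate tails.
    return sorted(entities, key=lambda t: score(h, r, t), reverse=True)

ranking = predict_tail("alice", "born_in")
print(ranking)  # best-scoring candidate first
```

With trained embeddings, the true tail should appear near the top of the ranking; metrics such as mean reciprocal rank and Hits@k are computed from exactly this kind of ranked list.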


References

Showing 1–10 of 43 references

Probabilistic Logic Neural Networks for Reasoning

The Probabilistic Logic Neural Network (pLogicNet) combines the advantages of both methods and defines the joint distribution of all possible triplets using a Markov logic network with first-order logic, which can be efficiently optimized with the variational EM algorithm.

Iteratively Learning Embeddings and Rules for Knowledge Graph Reasoning

A novel framework, IterE, is proposed for iteratively learning embeddings and rules, in which rules are learned from embeddings with a proper pruning strategy and embeddings are learned from existing triples and new triples inferred by rules.

A Survey on Knowledge Graphs: Representation, Acquisition, and Applications

A comprehensive review of knowledge graphs covering four research topics: 1) knowledge graph representation learning; 2) knowledge acquisition and completion; 3) temporal knowledge graphs; and 4) knowledge-aware applications; it also summarizes recent breakthroughs and perspective directions to facilitate future research.

Differentiable learning of numerical rules in knowledge graphs

This work extends Neural LP to learn rules with numerical values and extracts more expressive rules with aggregates, which are of higher quality and yield more accurate predictions compared to rules learned by the state-of-the-art methods, as shown by the experiments on synthetic and real-world datasets.

Traversing Knowledge Graphs in Vector Space

It is demonstrated that compositional training acts as a novel form of structural regularization, reliably improving performance across all base models (reducing errors by up to 43%) and achieving new state-of-the-art results.
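The compositional training summarized above answers path queries by applying each relation's operation to a query vector in turn. The sketch below uses a TransE-style additive traversal operator; the entity and relation names, and the additive operator itself, are illustrative assumptions:

```python
# Sketch of path-query traversal in vector space: answer a multi-hop query
# like ("alice", works_for / headquartered_in, ?) by composing relation
# operations, then returning the nearest entity. All data here is toy.
import numpy as np

rng = np.random.default_rng(2)
dim = 8
entities = {e: rng.normal(size=dim) for e in ["alice", "acme", "paris"]}
relations = {r: rng.normal(size=dim) for r in ["works_for", "headquartered_in"]}

def traverse(start, path):
    q = entities[start].copy()
    for r in path:
        q = q + relations[r]  # each hop adds the relation's vector
    return q

def answer(start, path):
    q = traverse(start, path)
    # Nearest entity to the composed query vector wins.
    return min(entities, key=lambda e: np.linalg.norm(q - entities[e]))

result = answer("alice", ["works_for", "headquartered_in"])
print(result)
```

With random embeddings the answer is arbitrary; the point of compositional training is to fit the embeddings so that multi-hop traversals like this land near the correct entity.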

Compositional Vector Space Models for Knowledge Base Completion

This paper presents an approach that reasons about conjunctions of multi-hop relations non-atomically, composing the implications of a path using a recurrent neural network (RNN) that takes as input vector embeddings of the binary relations in the path.

DRUM: End-To-End Differentiable Rule Mining On Knowledge Graphs

DRUM is proposed, a scalable and differentiable approach for mining first-order logical rules from knowledge graphs that resolves the problem of learning probabilistic logical rules for inductive and interpretable link prediction.
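The core idea behind differentiable rule mining in the Neural LP / DRUM family is that a chain rule such as parent(x,z) ∧ parent(z,y) ⇒ grandparent(x,y) corresponds to a product of relation adjacency matrices. A minimal sketch with a toy graph (the relation names and facts are illustrative, not from the paper):

```python
# Sketch: composing a chain rule as an adjacency-matrix product.
# A nonzero entry (i, j) in the product means the rule body's path
# connects entity i to entity j, supporting the rule head.
import numpy as np

n = 4  # entities 0..3

def adj(edges):
    m = np.zeros((n, n))
    for h, t in edges:
        m[h, t] = 1.0
    return m

A_parent = adj([(0, 1), (1, 2)])     # parent(0,1), parent(1,2)
A_grandparent = A_parent @ A_parent  # body: parent(x,z) AND parent(z,y)

# Entry (0, 2) > 0: the two-hop parent path connects 0 to 2,
# supporting grandparent(0, 2).
print(A_grandparent[0, 2])  # 1.0
```

Differentiable rule miners relax this by taking weighted sums over relations at each hop, so the rule weights can be learned by gradient descent end to end.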

Knowledge Aware Conversation Generation with Explainable Reasoning over Augmented Graphs

This work proposes a knowledge-aware chatting machine with three components: an augmented knowledge graph with both triples and texts, a knowledge selector, and a knowledge-aware response generator; it improves a state-of-the-art reasoning algorithm with machine reading comprehension technology for knowledge selection on the graph.

Embedding Entities and Relations for Learning and Inference in Knowledge Bases

It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
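The observation above — that under a bilinear objective, composing relations corresponds to matrix multiplication — can be sketched numerically. The dimensions and random values below are illustrative assumptions:

```python
# Sketch of bilinear scoring, score(h, r, t) = h^T M_r t, where composing
# two relations along a path corresponds to multiplying their matrices.
# Toy dimensions and random values; not the paper's trained model.
import numpy as np

rng = np.random.default_rng(1)
dim = 5
h, t = rng.normal(size=dim), rng.normal(size=dim)
M_r1, M_r2 = rng.normal(size=(dim, dim)), rng.normal(size=(dim, dim))

def bilinear_score(head, M, tail):
    # Bilinear form: head^T · M · tail
    return head @ M @ tail

# Scoring the composed relation r1∘r2 via the matrix product M_r1 @ M_r2...
composed = bilinear_score(h, M_r1 @ M_r2, t)
# ...equals traversing: map the head through M_r1, then score with M_r2.
traversed = bilinear_score(h @ M_r1, M_r2, t)
print(np.isclose(composed, traversed))  # True (matrix product is associative)
```

This associativity is what lets multi-hop inference reuse the same learned relation matrices hop by hop instead of learning a separate parameter for every path.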

RNNLogic: Learning Logic Rules for Reasoning on Knowledge Graphs

An EM-based algorithm optimizes a probabilistic model called RNNLogic, which treats logic rules as a latent variable and simultaneously trains a rule generator as well as a reasoning predictor with logic rules.