Corpus ID: 235422273

Neural Bellman-Ford Networks: A General Graph Neural Network Framework for Link Prediction

@inproceedings{zhu2021neural,
  title={Neural Bellman-Ford Networks: A General Graph Neural Network Framework for Link Prediction},
  author={Zhaocheng Zhu and Zuobai Zhang and Louis-Pascal Xhonneux and Jian Tang},
  booktitle={Neural Information Processing Systems},
}
Link prediction is a fundamental task on graphs. Inspired by traditional path-based methods, in this paper we propose a general and flexible representation learning framework based on paths for link prediction. Specifically, we define the representation of a pair of nodes as the generalized sum of all path representations between the nodes, with each path representation as the generalized product of the edge representations along the path. Motivated by the Bellman-Ford algorithm for solving the… 
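The abstract's formulation can be made concrete with a minimal sketch of the generalized Bellman-Ford iteration: the pair representation h(u, v) is the generalized sum (ADD) over paths of the generalized product (MUL) of edge representations along each path. Here edge representations are scalars and (ADD, MUL) = (+, ×), so h(u, v) reduces to a damped path count; the graph, function, and parameter names are illustrative, not from the paper's released code.

```python
def neural_bellman_ford(edges, num_nodes, source, num_steps,
                        ADD=sum, MUL=lambda a, b: a * b):
    # h[v] approximates the pair representation h(source, v).
    h = [1.0 if v == source else 0.0 for v in range(num_nodes)]  # boundary condition
    for _ in range(num_steps):
        new_h = []
        for v in range(num_nodes):
            # Generalized sum over incoming edges (x, v) of h(source, x) MUL w(x, v).
            msgs = [MUL(h[x], w) for (x, y, w) in edges if y == v]
            if v == source:
                msgs.append(1.0)  # keep the zero-length path source -> source alive
            new_h.append(ADD(msgs) if msgs else 0.0)
        h = new_h
    return h

# Tiny graph: 0 -> 1 -> 2 plus a shortcut 0 -> 2, each edge damped by 0.5.
edges = [(0, 1, 0.5), (1, 2, 0.5), (0, 2, 0.5)]
scores = neural_bellman_ford(edges, num_nodes=3, source=0, num_steps=3)
# scores[2] sums both paths: 0.5 (direct) + 0.25 (via node 1) = 0.75
```

Swapping ADD/MUL for learned message and aggregation functions is what turns this classical recursion into the neural framework the abstract describes.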

ReFactorGNNs: Revisiting Factorisation-based Models from a Message-Passing Perspective

This work bridges the gap between FMs and GNNs by proposing ReFactorGNNs, a new architecture that achieves comparable transductive performance to FMs, and state-of-the-art inductive performance while using an order of magnitude fewer parameters.

Two-Dimensional Weisfeiler-Lehman Graph Neural Networks for Link Prediction

A completely different approach that directly obtains node-pair (link) representations based on two-dimensional Weisfeiler-Lehman (2-WL) tests is studied, and the power of using 2-WL tests to directly obtain link representations is demonstrated.

Neural-Symbolic Models for Logical Queries on Knowledge Graphs

Experiments show that GNN-QE significantly improves over previous state-of-the-art models in answering FOL queries, can predict the number of answers without explicit supervision, and provides visualizations for intermediate variables.

Learning Adaptive Propagation for Knowledge Graph Reasoning

This paper revisits exemplar works through the lens of the propagation path and designs an incremental sampling mechanism, in which close and promising targets are preserved, together with a learning-based sampling distribution that identifies targets while involving fewer entities.

Lifelong Embedding Learning and Transfer for Growing Knowledge Graphs

Experimental results show that the proposed model outperforms the state-of-the-art inductive and lifelong embedding baselines.

EurNet: Efficient Multi-Range Relational Modeling of Spatial Multi-Relational Data

The results demonstrate the strength of EurNet in modeling spatial multi-relational data from various domains; the experiments follow the augmentation functions and mixup strategies used in Swin Transformer.

Instance-based Learning for Knowledge Base Completion

Surprisingly, despite occupying only a small portion of the rule space, IBL rules outperform non-IBL rules on all four benchmarks and provide new insights into how rule-based models work and how to interpret their rules.

DPB-NBFnet: Using neural Bellman-Ford networks to predict DNA-protein binding

This work puts Neural Bellman-Ford networks (NBFNets) to use in building pair representations of DNA and protein to predict the existence of DNA-protein binding (DPB), and indicates that the performance of DPB-NBFnet is competitive with the baseline models.

Few-shot Relational Reasoning via Connection Subgraph Pretraining

The key to CSR is that it explicitly models a shared connection subgraph between support and query triplets, inspired by the principle of eliminative induction; it can make predictions for the target few-shot task directly, without pre-training on a human-curated set of training tasks.

Linkless Link Prediction via Relational Distillation

This work proposes a relational KD framework, Linkless Link Prediction (LLP), which boosts the link prediction performance of MLPs by significant margins, and even outperforms the teacher GNNs on 6 out of 9 benchmarks.

Principal Neighbourhood Aggregation for Graph Nets

This work proposes Principal Neighbourhood Aggregation (PNA), a novel architecture combining multiple aggregators with degree-scalers (which generalize the sum aggregator), and compares the capacity of different models to capture and exploit graph structure via a novel benchmark containing multiple tasks taken from classical graph theory.
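The combination of aggregators and degree-scalers that this summary describes can be sketched in a few lines: several statistics of a node's incoming messages are computed, then each is rescaled by a logarithmic function of the node's degree. The choice of aggregators, the `delta` normalizer, and all names here are assumptions for illustration, not the paper's exact parameterization.

```python
import math

def pna_aggregate(messages, delta=1.0):
    # Multiple aggregators over the neighbour messages: mean, max, min, std.
    d = len(messages)
    mean = sum(messages) / d
    var = sum((m - mean) ** 2 for m in messages) / d
    aggregators = [mean, max(messages), min(messages), math.sqrt(var)]
    # Degree-scalers: identity, amplification, and attenuation, where the
    # scale grows with log(degree + 1) relative to a graph-level constant delta.
    s = math.log(d + 1) / delta
    scalers = [1.0, s, 1.0 / s]
    # Outer product of scalers and aggregators -> 3 * 4 = 12 features per node.
    return [sc * ag for sc in scalers for ag in aggregators]

features = pna_aggregate([1.0, 2.0, 3.0])
```

Using several aggregators at once is what lets the network distinguish neighbourhoods that any single aggregator (e.g. mean alone) would conflate.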

Inductive Relation Prediction by Subgraph Reasoning

GraIL, a graph neural network based relation prediction framework that reasons over local subgraph structures and has a strong inductive bias toward learning entity-independent relational semantics, is proposed.

Entity Context and Relational Paths for Knowledge Graph Completion

PathCon, a knowledge graph completion method, is developed; it harnesses four novel insights to outperform existing methods and provides interpretable explanations by identifying the relational contexts and paths that are important for a given predicted relation.

DRUM: End-To-End Differentiable Rule Mining On Knowledge Graphs

DRUM, a scalable and differentiable approach for mining first-order logical rules from knowledge graphs, is proposed; it addresses the problem of learning probabilistic logical rules for inductive and interpretable link prediction.

A new status index derived from sociometric analysis

A new method of computation is presented that takes into account who chooses as well as how many choose, introducing the concept of attenuation in influence transmitted through intermediaries.
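The attenuated-influence idea summarized above (the Katz status index) admits a short sketch: a choice received counts directly, and choices received by one's choosers count with an attenuation factor alpha per intermediary step. The adjacency convention (`A[i][j] = 1` if person i chooses person j) and all names are assumptions for this illustration.

```python
def katz_status(A, alpha=0.5, num_terms=20):
    n = len(A)
    # contrib[j] is the influence each node passes along at the current step;
    # everyone starts by distributing one unit of choice weight.
    contrib = [1.0] * n
    status = [0.0] * n
    for _ in range(num_terms):
        # One attenuation step: propagate contributions along choices,
        # damped by alpha. Summing the series status[j] = sum_k alpha^k * (#
        # of length-k chains of choices ending at j) gives the status index.
        new_contrib = [alpha * sum(A[i][j] * contrib[i] for i in range(n))
                       for j in range(n)]
        status = [status[j] + new_contrib[j] for j in range(n)]
        contrib = new_contrib
    return status

# Three people: 0 and 1 both choose 2; 2 chooses nobody.
A = [[0, 0, 1],
     [0, 0, 1],
     [0, 0, 0]]
scores = katz_status(A, alpha=0.5)
```

The truncated power series used here converges for alpha smaller than the reciprocal of the adjacency matrix's largest eigenvalue, which is the condition Katz's closed-form matrix-inverse version also requires.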

How Powerful are Graph Neural Networks?

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.

The link-prediction problem for social networks

Experiments on large coauthorship networks suggest that information about future interactions can be extracted from network topology alone, and that fairly subtle measures for detecting node proximity can outperform more direct measures.

Differentiable Learning of Logical Rules for Knowledge Base Reasoning

A framework, Neural Logic Programming, is proposed that combines the parameter and structure learning of first-order logical rules in an end-to-end differentiable model and outperforms prior work on multiple knowledge base benchmark datasets, including Freebase and WikiMovies.