The Power of the Weisfeiler-Leman Algorithm for Machine Learning with Graphs

@article{Morris2021ThePO,
  title={The Power of the Weisfeiler-Leman Algorithm for Machine Learning with Graphs},
  author={Christopher Morris and Matthias Fey and Nils M. Kriege},
  journal={arXiv preprint arXiv:2105.05911},
  year={2021}
}
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a powerful tool for (supervised) machine learning with graphs and relational data. Here, we give a comprehensive overview of the algorithm's use in a machine learning setting. We discuss the theoretical background, show how to use it for supervised graph and node classification, discuss recent extensions, and its connection to neural…
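As a quick illustration of the heuristic the survey builds on, here is a minimal sketch of 1-dimensional Weisfeiler-Leman color refinement. The function name and the dict-of-adjacency-lists representation are choices made here for illustration, not taken from the paper:

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL color refinement (sketch). adj maps each node to a list of
    neighbors. Returns the histogram of node colors after `rounds` rounds;
    differing histograms certify non-isomorphism (equal ones prove nothing)."""
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        # New color of v = (old color of v, sorted multiset of neighbor colors)
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures into small integer color ids
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())
```

For example, the test distinguishes a path from a star, but a 6-cycle and two disjoint triangles get identical histograms, the classic failure case motivating the higher-order extensions the survey discusses.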

Citations

A systematic approach to random data augmentation on graph neural networks
TLDR
A new comprehensive framework is proposed that captures all previous RDA techniques and formally proves that under natural conditions all instantiations of this framework are universal.
Graph Filtration Kernels
TLDR
This work proposes a family of graph kernels that incorporate existence intervals of features, empirically validates their expressive power, and shows significant improvements over state-of-the-art graph kernels in terms of predictive performance on various real-world benchmark datasets.
Reconstruction for Powerful Graph Representations
TLDR
This work shows the extent to which graph reconstruction—reconstructing a graph from its subgraphs—can mitigate the theoretical and practical problems currently faced by GRL architectures and demonstrates how it boosts state-of-the-art GNNs' performance across nine real-world benchmark datasets.

References

SHOWING 1-10 OF 115 REFERENCES
A Persistent Weisfeiler-Lehman Procedure for Graph Classification
TLDR
This work leverages propagated node label information and transforms unweighted graphs into metric ones to augment the subtree features with topological information obtained using persistent homology, a concept from topological data analysis.
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks
TLDR
It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs is proposed, so-called $k$-dimensional GNNs ($k$-GNNs), which can take higher-order graph structures at multiple scales into account.
Coloring graph neural networks for node disambiguation
TLDR
This paper introduces a graph neural network called Colored Local Iterative Procedure (CLIP) that uses colors to disambiguate identical node attributes, and shows that this representation is a universal approximator of continuous functions on graphs with node attributes.
Provably Powerful Graph Networks
TLDR
This paper proposes a simple model that interleaves standard multilayer perceptrons (MLPs) applied to the feature dimension with matrix multiplication, and shows that a reduced 2-order network containing just a scaled identity operator, augmented with a single quadratic operation (matrix multiplication), has provable 3-WL expressive power.
Weisfeiler-Lehman Neural Machine for Link Prediction
In this paper, we propose a next-generation link prediction method, Weisfeiler-Lehman Neural Machine (WLNM), which learns topological features in the form of graph patterns that promote the formation…
Characterizing the Expressive Power of Invariant and Equivariant Graph Neural Networks
TLDR
The first approximation guarantees for practical GNNs are proved, paving the way for a better understanding of their generalization.
Inductive Representation Learning on Large Graphs
TLDR
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
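The mean-aggregation variant of the inductive framework summarized above can be sketched in a few lines of NumPy. This is a simplified single layer; the weight matrices, ReLU, and normalization shown here follow the published method only loosely:

```python
import numpy as np

def sage_mean_layer(features, adj_list, W_self, W_neigh):
    """One simplified GraphSAGE-style layer with mean aggregation:
    h_v = ReLU(W_self @ x_v + W_neigh @ mean_{u in N(v)} x_u),
    followed by L2 normalization of each node embedding."""
    dim = features.shape[1]
    out = []
    for v, neighbors in enumerate(adj_list):
        # Mean of neighbor features (zero vector for isolated nodes)
        neigh_mean = features[neighbors].mean(axis=0) if neighbors else np.zeros(dim)
        h = W_self @ features[v] + W_neigh @ neigh_mean
        out.append(np.maximum(h, 0.0))  # ReLU
    out = np.stack(out)
    # L2-normalize embeddings so they live on the unit sphere
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.clip(norms, 1e-12, None)
```

Because the layer only reads a node's own features and its neighbors' features, it applies unchanged to nodes unseen during training, which is the inductive property the TLDR refers to.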
Wasserstein Weisfeiler-Lehman Graph Kernels
TLDR
A novel method is proposed that relies on the Wasserstein distance between the node feature vector distributions of two graphs, which makes it possible to find subtler differences between data sets by considering graphs as high-dimensional objects rather than simple means.
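A simplified version of the idea: compute per-node WL feature vectors, then compare two graphs' node-feature clouds with an optimal-transport distance. For equal-size graphs with uniform node weights, the 1-Wasserstein distance reduces to an assignment problem; this sketch is a simplification for illustration, not the paper's exact kernel construction:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def wl_node_features(adj, rounds=2):
    """Stack each node's WL color over iterations into a feature vector."""
    colors = {v: 0 for v in adj}
    feats = {v: [0] for v in adj}
    for _ in range(rounds):
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in adj}
        for v in adj:
            feats[v].append(colors[v])
    return np.array([feats[v] for v in sorted(adj)], dtype=float)

def wasserstein_graph_distance(adj_a, adj_b):
    """1-Wasserstein distance between the two graphs' node-feature clouds.
    Assumes both graphs have the same number of nodes, so the transport
    plan is a perfect matching found by the Hungarian algorithm."""
    A, B = wl_node_features(adj_a), wl_node_features(adj_b)
    cost = np.abs(A[:, None, :] - B[None, :, :]).sum(-1)  # pairwise L1 costs
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].sum() / len(A)
```

The distance is zero for identical graphs and strictly positive for, e.g., a path versus a star on the same number of nodes.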
Scalable classification for large dynamic networks
  • Yibo Yao, L. Holder
  • 2015 IEEE International Conference on Big Data (Big Data), 2015
TLDR
An online version of an existing graph kernel is introduced to incrementally compute the kernel matrix for an unbounded stream of extracted subgraphs, and a kernel perceptron is adopted to learn a discriminative classifier and predict the class labels of the target nodes with their corresponding subgraphs.
Dynamic Edge-Conditioned Filters in Convolutional Neural Networks on Graphs
TLDR
This work generalizes the convolution operator from regular grids to arbitrary graphs while avoiding the spectral domain, which allows us to handle graphs of varying size and connectivity.