Graph Neural Networks Are More Powerful Than We Think

@article{Kanatsoulis2022GraphNN,
  title={Graph Neural Networks Are More Powerful Than We Think},
  author={Charilaos I. Kanatsoulis and Alejandro Ribeiro},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.09801}
}
Graph Neural Networks (GNNs) are powerful convolutional architectures that have shown remarkable performance in various node-level and graph-level tasks. Despite their success, the common belief is that the expressive power of GNNs is limited and that they are at most as discriminative as the Weisfeiler-Lehman (WL) algorithm. In this paper we argue the opposite and show that the WL algorithm is the upper bound only when the input to the GNN is the vector of all ones. In this direction, we… 
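As a toy illustration of this claim (not the paper's actual construction), consider two 1-WL-equivalent graphs in NumPy: with the all-ones input a sum-aggregation layer cannot separate them, while a simple output statistic under white random inputs can. The graphs, nonlinearity, and statistic below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Two non-isomorphic 2-regular graphs on 6 nodes: a hexagon vs. two disjoint
# triangles. 1-WL, and hence a GNN driven by the all-ones input, cannot
# separate them.
def cycle(n):
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    return A

A_hex = cycle(6)
A_tri = np.zeros((6, 6))
A_tri[:3, :3] = cycle(3)
A_tri[3:, 3:] = cycle(3)

ones = np.ones(6)
print(np.tanh(A_hex @ ones), np.tanh(A_tri @ ones))  # identical embeddings

# With white random inputs, second-order output statistics differ: adjacent
# nodes share a common neighbor in a triangle but not in a hexagon, so
# E[tanh(Ax)^T A tanh(Ax)] is positive for the triangles and ~0 otherwise.
xs = rng.standard_normal((20000, 6))

def stat(A):
    vals = []
    for x in xs:
        y = np.tanh(A @ x)
        vals.append(y @ A @ y)
    return np.mean(vals)

print(stat(A_hex), stat(A_tri))  # ~0 vs. clearly positive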
Learning by Transference: Training Graph Neural Networks on Growing Graphs
TLDR
A novel algorithm for learning GNNs on large-scale graphs is proposed: starting from a moderate number of nodes, it successively increases the size of the graph during training. Benchmarked on a decentralized control problem, it retains performance comparable to its large-scale counterpart at a reduced computational cost.
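A minimal sketch of such a growing-graph schedule, with an illustrative closed-form filter fit standing in for the paper's actual GNN training step; the graph, the size schedule, and the subsampling rule are assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(1)

# A 512-node random graph; the task is to recover the tap h_true of a graph
# filter y = h_true * (A @ x) from noisy observations on the graph.
N, h_true = 512, 0.7
A = np.triu((rng.random((N, N)) < 0.05).astype(float), 1)
A = A + A.T
x = rng.standard_normal(N)

def fit_tap(An, xn):
    # One closed-form least-squares fit (a stand-in for an SGD training step).
    zn = An @ xn
    yn = h_true * zn + 0.01 * rng.standard_normal(len(xn))  # noisy targets
    return (zn @ yn) / (zn @ zn)

# Train on successively larger induced subgraphs instead of starting on the
# full graph: the parameter learned at each size transfers to the next.
for n in (64, 128, 256, 512):
    print(n, round(fit_tap(A[:n, :n], x[:n]), 3))  # all estimates ~= h_true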

References

The Surprising Power of Graph Neural Networks with Random Node Initialization
TLDR
This paper proves that GNNs with RNI are universal, the first such result for GNNs that does not rely on computationally demanding higher-order properties, and empirically analyzes the effect of RNI on GNNs, finding that the results support the superior performance of GNNs with RNI over standard GNNs.
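A minimal sketch of the RNI idea itself, assuming a toy one-layer sum-aggregation step; the graph and the number of random channels are illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Two disjoint triangles: with constant input features, every node is
# indistinguishable to message passing.
A = np.zeros((6, 6))
for a, b in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[a, b] = A[b, a] = 1.0
X = np.ones((6, 1))

def with_rni(X, d_rand=4):
    # RNI: append d_rand i.i.d. random channels to each node's features.
    return np.concatenate([X, rng.standard_normal((X.shape[0], d_rand))], axis=1)

H_plain = np.tanh(A @ X)            # one sum-aggregation step
H_rni = np.tanh(A @ with_rni(X))
print(len(np.unique(H_plain.round(6), axis=0)))  # 1: all embeddings identical
print(len(np.unique(H_rni.round(6), axis=0)))    # 6: the symmetry is broken

Since the random channels are resampled per run, invariance holds only in distribution, which is why the universality statement is probabilistic.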
Provably Powerful Graph Networks
TLDR
This paper proposes a simple model that interleaves multilayer perceptrons (MLPs) applied along the feature dimension with matrix multiplication, and shows that a reduced 2-order network containing just a scaled identity operator, augmented with a single quadratic operation (matrix multiplication), has provable 3-WL expressive power.
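A minimal sketch of one such block, assuming an n x n x d tensor representation and ReLU MLPs; shapes and parameters are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def mlp(T, W1, W2):
    # Pointwise MLP applied along the feature (last) axis of an n x n x d tensor.
    return np.maximum(T @ W1, 0.0) @ W2

def ppgn_block(T, params):
    W1a, W2a, W1b, W2b = params
    M1 = mlp(T, W1a, W2a)                      # n x n x dh
    M2 = mlp(T, W1b, W2b)                      # n x n x dh
    # Channel-wise matrix multiplication over the node axes: the single
    # quadratic operation that lifts the block to 3-WL expressive power.
    prod = np.einsum('ikd,kjd->ijd', M1, M2)
    return np.concatenate([T, prod], axis=-1)  # skip connection via concat

n, d, dh = 5, 3, 4
T = rng.standard_normal((n, n, d))
params = tuple(rng.standard_normal(s) for s in [(d, dh), (dh, dh), (d, dh), (dh, dh)])
print(ppgn_block(T, params).shape)             # (5, 5, 7)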
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks
TLDR
It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs, the so-called $k$-dimensional GNNs ($k$-GNNs), is proposed, which can take higher-order graph structures at multiple scales into account.
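For reference, the 1-WL color refinement that standard message-passing GNNs match fits in a few lines; a minimal sketch, with hashing replaced by an explicit relabeling table, run on the standard hexagon / two-triangles pair it fails to separate.

hexagon = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}

def wl_histogram(adj, rounds=3):
    # 1-WL refinement: repeatedly relabel each node by (own color, sorted
    # multiset of neighbor colors), then return the final color histogram.
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v]))) for v in adj}
        table = {s: k for k, s in enumerate(sorted(set(sig.values())))}
        colors = {v: table[sig[v]] for v in adj}
    return sorted(colors.values())

# Both graphs are 2-regular, so refinement never splits the uniform coloring:
print(wl_histogram(hexagon) == wl_histogram(two_triangles))  # True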
Expressive Power of Invariant and Equivariant Graph Neural Networks
TLDR
The first approximation guarantees for practical GNNs are proved, paving the way for a better understanding of their generalization.
Random Features Strengthen Graph Neural Networks
TLDR
It is proved that random features enable GNNs to learn almost optimal polynomial-time approximation algorithms, in terms of approximation ratio, for the minimum dominating set and maximum matching problems.
Benchmarking Graph Neural Networks
TLDR
A reproducible GNN benchmarking framework is introduced, with the facility for researchers to add new models conveniently for arbitrary datasets, together with a principled investigation of the recent Weisfeiler-Lehman GNNs (WL-GNNs) compared to message-passing graph convolutional networks (GCNs).
Graphon Neural Networks and the Transferability of Graph Neural Networks
TLDR
This paper introduces graphon NNs as limit objects of GNNs and proves a bound on the difference between the output of a GNN and that of its limiting graphon NN when the graph convolutional filters are bandlimited in the graph spectral domain.
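A minimal sketch of the limit object, assuming a made-up smooth graphon and an illustrative spectral statistic; the normalization and convergence check below are not the paper's bound.

import numpy as np

rng = np.random.default_rng(0)

def sample_graph(W, n):
    # Sample latent positions, connect node pairs with probability W(u_i, u_j).
    u = rng.random(n)
    P = W(u[:, None], u[None, :])
    A = np.triu((rng.random((n, n)) < P).astype(float), 1)
    return (A + A.T) / n   # graphon-style normalization of the adjacency

W = lambda u, v: 0.8 * np.exp(-3.0 * np.abs(u - v))  # a smooth toy graphon

# Spectral statistics of the normalized adjacency stabilize as n grows,
# one face of the GNN-to-graphon-NN transferability the paper quantifies.
for n in (100, 300, 900):
    print(n, round(float(np.linalg.eigvalsh(sample_graph(W, n)).max()), 4))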
Stability Properties of Graph Neural Networks
TLDR
This work proves that graph convolutions with integral Lipschitz filters, in combination with the frequency-mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies.
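The object under analysis is the polynomial graph convolutional filter; a minimal perturbation check, with the graph, taps, and perturbation size all illustrative (the integral Lipschitz condition constrains the filter's frequency response and is not verified here).

import numpy as np

rng = np.random.default_rng(0)

def graph_filter(S, x, taps):
    # Polynomial graph filter: y = sum_k taps[k] * S^k x.
    y, Skx = np.zeros_like(x), x.copy()
    for h in taps:
        y += h * Skx
        Skx = S @ Skx
    return y

n = 50
S = np.triu((rng.random((n, n)) < 0.1).astype(float), 1)
S = S + S.T                                     # graph shift operator
E = 0.01 * rng.standard_normal((n, n))
E = (E + E.T) / 2                               # small symmetric perturbation
x = rng.standard_normal(n)
taps = [1.0, 0.5, 0.25]

y, y_pert = graph_filter(S, x, taps), graph_filter(S + E, x, taps)
print(np.linalg.norm(y - y_pert) / np.linalg.norm(y))  # small relative change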
On the equivalence between graph isomorphism testing and function approximation with GNNs
TLDR
It is proved that order-2 graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree, and this motivates a new architecture, Ring-GNN, which succeeds in distinguishing these graphs and provides improvements on real-world social network datasets.
Relational Pooling for Graph Representations
TLDR
This work generalizes graph neural networks (GNNs) beyond those based on the Weisfeiler-Lehman (WL) algorithm, graph Laplacians, and diffusions to provide a framework with maximal representation power for graphs.
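A minimal sketch of the relational pooling recipe, averaging a deliberately permutation-sensitive base readout (built on one-hot node IDs) over random node orderings; the base function and Monte Carlo size are illustrative.

import numpy as np

rng = np.random.default_rng(0)

n = 6
A = np.triu((rng.random((n, n)) < 0.5).astype(float), 1)
A = A + A.T

def rp_readout(A, n_perm=2000):
    # Relational pooling: average a permutation-SENSITIVE base function
    # over random node orderings to recover an (approximately) invariant one.
    n = A.shape[0]
    outs = []
    for _ in range(n_perm):
        p = rng.permutation(n)
        Ap = A[np.ix_(p, p)]
        outs.append(np.tanh(Ap @ np.eye(n)).sum(axis=0))  # order-dependent
    return np.mean(outs, axis=0)

# Relabeling the graph barely changes the pooled readout, and the gap
# shrinks as n_perm grows.
q = rng.permutation(n)
print(np.linalg.norm(rp_readout(A) - rp_readout(A[np.ix_(q, q)])))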
...