Random Features Strengthen Graph Neural Networks

@inproceedings{Sato2021RandomFS,
  title={Random Features Strengthen Graph Neural Networks},
  author={R. Sato and Makoto Yamada and Hisashi Kashima},
  booktitle={SDM},
  year={2021}
}
Graph neural networks (GNNs) are powerful machine learning models for various graph learning tasks. Recently, the limitations of the expressive power of various GNN models have been revealed. For example, GNNs cannot distinguish some pairs of non-isomorphic graphs, and they cannot learn efficient graph algorithms; several GNN models have been proposed to overcome these limitations. In this paper, we demonstrate that GNNs become powerful just by adding a random feature to each node. We prove that the…
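The core recipe is simple enough to sketch in a few lines. Below is a minimal illustration in PyTorch (our sketch, not the authors' reference implementation) of the idea: concatenate a freshly drawn random feature to every node's input before message passing, here with a dense-adjacency, GIN-style layer; all class and variable names are our own.

import torch
import torch.nn as nn

class RandomFeatureGNN(nn.Module):
    # Minimal GIN-style layer that concatenates a random feature to every
    # node before aggregation; a sketch of the paper's recipe, not the
    # authors' reference code.
    def __init__(self, in_dim, hidden_dim, rand_dim=1):
        super().__init__()
        self.rand_dim = rand_dim
        self.mlp = nn.Sequential(
            nn.Linear(in_dim + rand_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, x, adj):
        # x: (n, in_dim) node features; adj: (n, n) dense adjacency matrix.
        # A fresh random feature per node on every forward pass makes
        # structurally identical neighbourhoods (almost surely) distinct.
        r = torch.randn(x.size(0), self.rand_dim)
        h = torch.cat([x, r], dim=1)
        # GIN-style update: the node itself plus the sum over neighbours.
        return self.mlp(h + adj @ h)

# Toy usage on a 6-cycle; with constant node features, 1-WL (and hence a
# plain GIN) cannot tell a 6-cycle from two disjoint triangles.
n = 6
adj = torch.zeros(n, n)
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0
model = RandomFeatureGNN(in_dim=3, hidden_dim=8)
print(model(torch.ones(n, 3), adj).shape)  # torch.Size([6, 8])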

Citations

The Surprising Power of Graph Neural Networks with Random Node Initialization
TLDR
This paper proves that GNNs with RNI are universal, a first such result for GNNs not relying on computationally demanding higher-order properties, and empirically analyzes the effect of RNI on GNNs, finding that the empirical results support the superior performance of GNNs with RNI over standard GNNs.
A Survey on The Expressive Power of Graph Neural Networks
TLDR
This survey provides a comprehensive overview of the expressive power of GNNs and provably powerful variants of GNNs.
On Graph Neural Networks versus Graph-Augmented MLPs
TLDR
This work compares multi-layer Graph Neural Networks with a simplified alternative that is called Graph-Augmented Multi-Layer Perceptrons (GA-MLPs), which first augments node features with certain multi-hop operators on the graph and then applies an MLP in a node-wise fashion.
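The GA-MLP recipe in the entry above is concrete enough to sketch. The following is our illustration, not the paper's code: it augments node features with powers of a row-normalised adjacency operator and then applies an ordinary node-wise MLP; the choice of operator and hop count here are illustrative assumptions.

import torch
import torch.nn as nn

def ga_mlp_features(x, adj, num_hops=3):
    # Augment node features with multi-hop operators: concatenate
    # [X, AX, A^2 X, ...], where A is the row-normalised adjacency.
    # A sketch of the recipe described above, not the paper's code.
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    a = adj / deg                       # row-normalised adjacency
    feats, h = [x], x
    for _ in range(num_hops):
        h = a @ h                       # one further hop of propagation
        feats.append(h)
    return torch.cat(feats, dim=1)      # (n, (num_hops + 1) * d)

# The graph enters only through this fixed preprocessing; learning is
# then a purely node-wise MLP over the augmented features.
adj = torch.tensor([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
z = ga_mlp_features(torch.randn(3, 4), adj, num_hops=2)
mlp = nn.Sequential(nn.Linear(z.size(1), 16), nn.ReLU(), nn.Linear(16, 2))
print(mlp(z).shape)  # torch.Size([3, 2])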
How hard is to distinguish graphs with graph neural networks?
TLDR
This study derives hardness results for the classification variant of graph isomorphism in the message-passing model (MPNN), which encompasses the majority of graph neural networks used today and is universal when nodes are given unique features.
Generalization and Representational Limits of Graph Neural Networks
TLDR
This work proves that several important graph properties cannot be computed by GNNs that rely entirely on local information, and provides the first data-dependent generalization bounds for message-passing GNNs.
Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting
TLDR
This work proposes a novel topologically-aware message passing scheme based on substructure encoding that allows incorporating domain-specific inductive biases and is strictly more expressive than the Weisfeiler-Lehman graph isomorphism test.
How hard is graph isomorphism for graph neural networks?
TLDR
This study derives the first hardness results for graph isomorphism in the message-passing model (MPNN), which encompasses the majority of graph neural networks used today and is universal in the limit when nodes are given unique features.
The Expressive Power of Graph Neural Networks as a Query Language
TLDR
It is proved that the unary FOC2 formulas that can be captured by an AC-GNN are exactly those that can be expressed in its guarded fragment, which in turn corresponds to graded modal logic.
Principal Neighbourhood Aggregation for Graph Nets
TLDR
This work proposes Principal Neighbourhood Aggregation (PNA), a novel architecture combining multiple aggregators with degree-scalers (which generalize the sum aggregator) and compares the capacity of different models to capture and exploit the graph structure via a novel benchmark containing multiple tasks taken from classical graph theory.
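As a rough sketch of the combination described in the entry above (ours, not the authors' implementation), the aggregation step can be written as several neighbour aggregators concatenated and rescaled by degree-based scalers; delta here stands in for the paper's normalisation constant, the mean log-degree over the training graphs.

import torch

def pna_aggregate(h, adj, delta=1.0):
    # Sketch (ours) of PNA-style aggregation: several neighbour
    # aggregators, concatenated and rescaled by degree-based scalers.
    # Assumes no isolated nodes; delta stands in for the paper's
    # normalisation constant (mean log-degree over the training graphs).
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)        # (n, 1)
    mean = (adj @ h) / deg
    sq_mean = (adj @ h.pow(2)) / deg
    std = (sq_mean - mean.pow(2)).clamp(min=0.0).sqrt()
    mask = adj.unsqueeze(-1) > 0                             # (n, n, 1)
    neigh = h.unsqueeze(0).expand(adj.size(0), -1, -1)       # (n, n, d)
    mx = neigh.masked_fill(~mask, float('-inf')).amax(dim=1)
    mn = neigh.masked_fill(~mask, float('inf')).amin(dim=1)
    aggs = torch.cat([mean, std, mx, mn], dim=1)             # (n, 4d)
    s = torch.log(deg + 1.0) / delta
    # Scalers: identity, amplification, attenuation.
    return torch.cat([aggs, aggs * s, aggs / s], dim=1)      # (n, 12d)

adj = torch.tensor([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
print(pna_aggregate(torch.randn(3, 4), adj).shape)  # torch.Size([3, 48])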
Building powerful and equivariant graph neural networks with message-passing
TLDR
This work proposes a new message-passing framework that is powerful while preserving permutation equivariance, and propagates unique node identifiers in the form of a one-hot encoding in order to learn a local context around each node.
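The identifier trick itself is easy to picture; the toy sketch below (ours) seeds each node with a one-hot identifier and propagates it for two rounds, so each row ends up encoding a node's local context. The equivariance machinery that makes this sound in the paper's framework is omitted here.

import torch

# Toy sketch (ours): seed each node with a unique one-hot identifier
# and propagate it; the paper's equivariance machinery is omitted.
n = 5
adj = (torch.rand(n, n) < 0.4).float()
adj = torch.triu(adj, diagonal=1)
adj = adj + adj.t()               # undirected, no self-loops
h = torch.eye(n)                  # one-hot identifier per node
for _ in range(2):                # two rounds of propagation
    h = h + adj @ h               # accumulate nearby identifiers
print(h)                          # row i encodes node i's 2-hop context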

References

Showing 1-10 of 62 references
How Powerful are Graph Neural Networks?
TLDR
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
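That architecture is the Graph Isomorphism Network (GIN); its layer update, which uses an injective sum aggregator over multisets of neighbour features, is

$$h_v^{(k)} = \mathrm{MLP}^{(k)}\Big((1 + \epsilon^{(k)})\, h_v^{(k-1)} + \sum_{u \in \mathcal{N}(v)} h_u^{(k-1)}\Big),$$

where $\mathcal{N}(v)$ is the neighbourhood of $v$ and $\epsilon^{(k)}$ is a learnable or fixed scalar.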
On the equivalence between graph isomorphism testing and function approximation with GNNs
TLDR
It is proved that order-2 graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree, and the analysis is extended to a new family of architectures, Ring-GNNs, which succeed in distinguishing these graphs and provide improvements on real-world social network datasets.
The Logical Expressiveness of Graph Neural Networks
TLDR
The ability of graph neural networks (GNNs) to distinguish nodes in graphs has recently been characterized in terms of the Weisfeiler-Lehman (WL) test, but this work focuses on Boolean classifiers expressible as formulas in the logic FOC2, a well-studied fragment of first-order logic.
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks
TLDR
It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs is proposed, so-called $k$-dimensional GNNs ($k$-GNNs), which can take higher-order graph structures at multiple scales into account.
What graph neural networks cannot learn: depth vs width
TLDR
GNNmp are shown to be Turing universal under sufficient conditions on their depth, width, node attributes, and layer expressiveness, and it is discovered that GNNmp can lose a significant portion of their power when their depth and width are restricted.
Provably Powerful Graph Networks
TLDR
This paper proposes a simple model that interleaves applications of a standard multilayer perceptron (MLP), applied to the feature dimension, with matrix multiplication, and shows that a reduced order-2 network containing just a scaled identity operator, augmented with a single quadratic operation (matrix multiplication), has provable 3-WL expressive power.
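In code, one such block (our sketch under the description above, not the authors' implementation) operates on an $(n, n, d)$ tensor: two MLPs are applied along the feature dimension and their outputs are combined by channel-wise matrix multiplication over the node dimensions.

import torch
import torch.nn as nn

class PPGNBlock(nn.Module):
    # Sketch (ours) of the block described above: two MLPs applied along
    # the feature dimension of an (n, n, d) tensor, combined by matrix
    # multiplication over the node dimensions.
    def __init__(self, d_in, d_out):
        super().__init__()
        self.mlp1 = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())
        self.mlp2 = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())
        self.mlp3 = nn.Sequential(nn.Linear(d_in + d_out, d_out), nn.ReLU())

    def forward(self, x):
        # x: (n, n, d_in); the MLPs act on the last (feature) dimension.
        a = self.mlp1(x).permute(2, 0, 1)     # (d_out, n, n)
        b = self.mlp2(x).permute(2, 0, 1)     # (d_out, n, n)
        m = torch.bmm(a, b).permute(1, 2, 0)  # matrix product per channel
        return self.mlp3(torch.cat([x, m], dim=-1))

x = torch.randn(6, 6, 8)                      # e.g. adjacency plus edge features
print(PPGNBlock(8, 16)(x).shape)              # torch.Size([6, 6, 16])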
The Graph Neural Network Model
TLDR
A new neural network model, called the graph neural network (GNN) model, that extends existing neural network methods for processing data represented in graph domains and implements a function $\tau(G, n) \in \mathbb{R}^m$ that maps a graph $G$ and one of its nodes $n$ into an $m$-dimensional Euclidean space.
Gated Graph Sequence Neural Networks
TLDR
This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
Relational Pooling for Graph Representations
TLDR
This work generalizes graph neural networks (GNNs) beyond those based on the Weisfeiler-Lehman (WL) algorithm, graph Laplacians, and diffusions to provide a framework with maximal representation power for graphs.
A new model for learning in graph domains
TLDR
A new neural model, called the graph neural network (GNN), capable of directly processing graphs; it extends recursive neural networks and can be applied to most practically useful kinds of graphs, including directed, undirected, labelled, and cyclic graphs.