# On the Ability of Graph Neural Networks to Model Interactions Between Vertices

```bibtex
@article{Razin2022OnTA,
  title   = {On the Ability of Graph Neural Networks to Model Interactions Between Vertices},
  author  = {Noam Razin and Tom Verbin and Nadav Cohen},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2211.16494}
}
```

Graph neural networks (GNNs) are widely used for modeling complex interactions between entities represented as vertices of a graph. Despite recent efforts to theoretically analyze the expressive power of GNNs, a formal characterization of their ability to model interactions is lacking. The current paper aims to address this gap. Formalizing strength of interactions through an established measure known as separation rank, we quantify the ability of certain GNNs to model interaction between a…
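For context, the separation rank used in the abstract is a standard measure from the tensor-analysis literature (this is the commonly used definition, not a formula quoted from the paper itself): the separation rank of a function $f$ with respect to a partition of its input variables into disjoint sets $\mathcal{A}$ and $\mathcal{B}$ is the minimal number of separable (product) terms needed to express it:

```latex
\mathrm{sep}(f; \mathcal{A}, \mathcal{B})
  = \min\Big\{ R \in \mathbb{N} \;:\;
      f(\mathbf{x}) = \sum_{r=1}^{R} g_r(\mathbf{x}_{\mathcal{A}})\, h_r(\mathbf{x}_{\mathcal{B}})
    \Big\}
```

A separation rank of one means $f$ treats $\mathbf{x}_{\mathcal{A}}$ and $\mathbf{x}_{\mathcal{B}}$ independently (no interaction), while higher separation rank indicates stronger interaction between the two sides of the partition.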

## One Citation

### What Makes Data Suitable for a Locally Connected Neural Network? A Necessary and Sufficient Condition Based on Quantum Entanglement

- Computer Science
- 2023

This work states that a certain locally connected neural network is capable of accurate prediction over a data distribution if and only if the data distribution admits low quantum entanglement under certain canonical partitions of features.

## References

Showing 1-10 of 108 references

### How hard is to distinguish graphs with graph neural networks?

- Computer Science, NeurIPS
- 2020

This study derives hardness results for the classification variant of graph isomorphism in the message-passing neural network (MPNN) model, which encompasses the majority of graph neural networks used today and is universal when nodes are given unique features.

### On the Bottleneck of Graph Neural Networks and its Practical Implications

- Computer Science, ICLR
- 2021

It is shown that existing, extensively-tuned, GNN-based models suffer from over-squashing and that breaking the bottleneck improves state-of-the-art results without any hyperparameter tuning or additional weights.

### Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting

- Computer Science, IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2023

The expressive power of the proposed Graph Substructure Networks (GSN), a topologically-aware message passing scheme based on substructure encoding, is theoretically analysed: GSN is shown to be strictly more expressive than the WL test, and sufficient conditions for universality are provided.

### How Powerful are Graph Neural Networks?

- Computer Science, ICLR
- 2019

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
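The WL test that bounds the discriminative power of these GNNs is the 1-dimensional Weisfeiler-Lehman color refinement. A minimal sketch (graphs and helper names are illustrative, not from the cited paper) shows why message-passing GNNs cannot separate certain graphs: two disjoint triangles and a 6-cycle receive identical color histograms because every node is 2-regular.

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL color refinement on a graph given as an adjacency list.

    adj: dict mapping node -> list of neighbor nodes.
    Returns the multiset (Counter) of final node colors.
    """
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        # New color = signature of own color plus sorted multiset of neighbor colors
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures back to small integer color ids
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())

# Two disjoint triangles vs. a 6-cycle: both are 2-regular, so 1-WL
# (and hence any MPNN bounded by it) assigns identical color histograms.
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
six_cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(wl_colors(two_triangles) == wl_colors(six_cycle))  # True
```

The refinement stabilizes with a single color on both graphs, which is exactly the failure mode the paper's maximally expressive architecture (GIN) is designed to match, not exceed.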

### Graph Neural Networks with Local Graph Parameters

- Computer Science, NeurIPS
- 2021

This work describes local-graph-parameter-enabled GNNs as a framework for studying GNNs and their higher-order counterparts, and precisely characterizes their distinguishing power in terms of a variant of the WL test and of the graph structural properties they can take into account.

### The Logical Expressiveness of Graph Neural Networks

- Computer Science, ICLR
- 2020

The ability of graph neural networks to distinguish nodes in graphs has recently been characterized in terms of the Weisfeiler-Lehman (WL) test; this work instead focuses on Boolean classifiers expressible as formulas in the logic FOC2, a well-studied fragment of first-order logic.

### On the equivalence between graph isomorphism testing and function approximation with GNNs

- Computer Science, Mathematics, NeurIPS
- 2019

It is proved that order-2 graph G-invariant networks fail to distinguish non-isomorphic regular graphs of the same degree; this motivates a new architecture, Ring-GNN, which succeeds in distinguishing these graphs and provides improvements on real-world social network datasets.

### On Graph Neural Networks versus Graph-Augmented MLPs

- Computer Science, ICLR
- 2021

This work compares multi-layer graph neural networks with a simplified alternative called Graph-Augmented Multi-Layer Perceptrons (GA-MLPs), which first augment node features with certain multi-hop operators on the graph and then apply an MLP in a node-wise fashion.
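The two-step GA-MLP recipe described above can be sketched in a few lines of NumPy. This is a minimal illustration under assumed choices (powers of the adjacency matrix as the multi-hop operators, a one-hidden-layer ReLU MLP, arbitrary weight shapes); the cited paper's exact operators and architecture may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: a 4-node path, with adjacency matrix A and random node features X.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))

# Step 1 (graph augmentation): concatenate multi-hop propagated features
# [X, AX, A^2 X]. Any fixed family of graph operators could be used here.
augmented = np.concatenate([X, A @ X, A @ (A @ X)], axis=1)

# Step 2 (node-wise MLP): one hidden ReLU layer applied independently per
# node. Weight shapes are illustrative, not taken from the cited paper.
W1 = rng.normal(size=(augmented.shape[1], 8))
W2 = rng.normal(size=(8, 2))
out = np.maximum(augmented @ W1, 0.0) @ W2

print(out.shape)  # (4, 2)
```

Because the graph is only consulted in the fixed, parameter-free augmentation step, all learning happens node-wise, which is what makes GA-MLPs a useful baseline for isolating what message passing itself contributes.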

### Can graph neural networks count substructures?

- Computer Science, NeurIPS
- 2020

A local relational pooling approach inspired by Murphy et al. (2019) is proposed and demonstrated to be not only effective for substructure counting but also able to achieve competitive performance on real-world tasks.

### Analyzing the Expressive Power of Graph Neural Networks in a Spectral Perspective

- Computer Science, ICLR
- 2021

It is argued that a spectral analysis of GNN behavior can provide a complementary point of view for a deeper understanding of GNNs, and the work theoretically demonstrates an equivalence of the graph convolution process regardless of whether it is designed in the spatial or the spectral domain.