Corpus ID: 252780462

Graph Neural Networks as Gradient Flows: understanding graph convolutions via energy

@inproceedings{Giovanni2022GraphNN,
  title={Graph Neural Networks as Gradient Flows: understanding graph convolutions via energy},
  author={Francesco Di Giovanni and James R. Rowbottom and Benjamin Paul Chamberlain and Thomas Markovich and Michael Bronstein},
  year={2022}
}
Gradient flows are differential equations that minimize an energy functional and constitute the main descriptors of physical systems. We apply this formalism to Graph Neural Networks (GNNs) to develop new frameworks for learning on graphs as well as provide a better theoretical understanding of existing ones. We derive GNNs as a gradient flow equation of a parametric energy that provides a physics-inspired interpretation of GNNs as learning particle dynamics in the feature space. In particular, we… 
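To make the gradient-flow reading concrete, here is a minimal numerical sketch, assuming a Dirichlet-type parametric energy E(X) = 0.5 tr(X Omega X^T) - 0.5 tr(X^T A_hat X W) with symmetric channel-mixing matrices Omega and W; the matrix names, toy graph, and step size tau are illustrative choices, not the paper's exact parametrisation. Discretizing the flow dX/dt = -grad E(X) = -X Omega + A_hat X W with explicit Euler steps yields residual, GCN-like updates.

```python
import numpy as np

def gradient_flow_gnn(X, A_hat, Omega, W, tau=0.1, steps=10):
    """Explicit-Euler discretization of the gradient flow of
    E(X) = 0.5 * tr(X @ Omega @ X.T) - 0.5 * tr(X.T @ A_hat @ X @ W).
    For symmetric Omega, W, A_hat the negative gradient is
    -X @ Omega + A_hat @ X @ W, i.e. a residual, GCN-like update."""
    for _ in range(steps):
        X = X + tau * (-X @ Omega + A_hat @ X @ W)
    return X

# Toy usage on a 4-node path graph with symmetric normalization.
A = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], float)
d = A.sum(1)
A_hat = A / np.sqrt(np.outer(d, d))
X = np.random.default_rng(0).normal(size=(4, 3))
Omega, W = np.eye(3), 0.9 * np.eye(3)   # symmetric "channel-mixing" matrices
print(gradient_flow_gnn(X, A_hat, Omega, W))
```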

References

Showing 1-10 of 59 references

Dirichlet Energy Constrained Learning for Deep Graph Neural Networks

A novel deep GNN framework, Energetic Graph Neural Networks (EGNN), is designed to enforce lower and upper bounds on the Dirichlet energy at each layer, thereby avoiding over-smoothing.
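For reference, the quantity being constrained is straightforward to compute; a minimal sketch follows, assuming the symmetric normalization convention (EGNN's actual layer-wise constraints are not reproduced here):

```python
import numpy as np

def dirichlet_energy(X, A):
    """Dirichlet energy trace(X.T @ L_sym @ X), equivalently
    0.5 * sum_ij A_ij * ||x_i/sqrt(d_i) - x_j/sqrt(d_j)||^2,
    measuring how much features vary across edges (near zero = over-smoothed)."""
    d = np.maximum(A.sum(axis=1), 1e-12)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt
    return float(np.trace(X.T @ L_sym @ X))
```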

Graph Neural Networks Exponentially Lose Expressive Power for Node Classification

The theory relates the expressive power of GCNs to the topological information of the underlying graphs inherent in their spectra, and provides a principled guideline for weight normalization in graph neural networks.

PDE-GCN: Novel Architectures for Graph Neural Networks Motivated by Partial Differential Equations

This work proposes a family of architectures, motivated by numerical methods for solving partial differential equations (PDEs) on manifolds, that control the behaviour of graph neural networks by design, so that their behaviour can be explained by the corresponding PDE analysis.
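To give a flavour of the idea, here is a sketch of the two discretized dynamics that such PDE-inspired layers typically build on; the plain graph Laplacian L and the step size tau are illustrative, and PDE-GCN additionally learns the spatial operator, which this sketch omits:

```python
import numpy as np

def diffusion_step(X, L, tau=0.2):
    """Forward-Euler step of the graph heat equation dX/dt = -L X (smoothing)."""
    return X - tau * (L @ X)

def wave_step(X, X_prev, L, tau=0.2):
    """Leapfrog step of the graph wave equation d2X/dt2 = -L X: oscillatory
    dynamics that resist over-smoothing. Returns (X_next, X) so the caller
    can carry the previous state forward."""
    return 2 * X - X_prev - (tau ** 2) * (L @ X), X
```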

Analyzing the Expressive Power of Graph Neural Networks in a Spectral Perspective

It is argued that a spectral analysis of GNN behaviour provides a complementary point of view for deepening the understanding of GNNs, and the paper theoretically demonstrates the equivalence of the graph convolution process regardless of whether it is designed in the spatial or the spectral domain.

Diffusion Improves Graph Learning

This work removes the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC), which leverages generalized graph diffusion and alleviates the problem of noisy and often arbitrarily defined edges in real graphs.
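A compact sketch of generalized graph diffusion with personalized-PageRank coefficients theta_k = alpha * (1 - alpha)^k, truncated at K terms and sparsified by a threshold eps as GDC does; the dense-matrix implementation and the specific alpha, K, eps values are illustrative:

```python
import numpy as np

def gdc_ppr(A, alpha=0.15, K=32, eps=1e-4):
    """Graph diffusion convolution matrix S = sum_k theta_k * T^k with
    personalized-PageRank coefficients theta_k = alpha * (1 - alpha)^k
    and T the column-stochastic random-walk transition matrix A @ D^-1."""
    d = np.maximum(A.sum(axis=0), 1e-12)
    T = A / d                                # divide each column j by d_j
    S = np.zeros_like(A, dtype=float)
    T_k = np.eye(A.shape[0])
    for k in range(K):
        S += alpha * (1.0 - alpha) ** k * T_k
        T_k = T @ T_k
    S[S < eps] = 0.0                         # sparsify small entries, as in GDC
    return S
```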

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
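A single-head sketch of the masked attention at the core of a GAT layer, assuming A already contains self-loops so every node has at least one neighbor; multi-head concatenation, dropout, and the output nonlinearity are omitted:

```python
import numpy as np

def gat_attention(X, A, W, a, slope=0.2):
    """Single-head GAT-style layer: scores e_ij = LeakyReLU(a^T [W x_i || W x_j])
    are masked to the edges of A and softmax-normalized over each node's
    neighborhood; the output is the attention-weighted sum of projected features."""
    H = X @ W                                   # projected features, shape (n, d')
    dp = H.shape[1]
    src, dst = H @ a[:dp], H @ a[dp:]           # split a over the concatenation
    e = src[:, None] + dst[None, :]             # raw score for every pair (i, j)
    e = np.where(e > 0, e, slope * e)           # LeakyReLU
    e = np.where(A > 0, e, -np.inf)             # mask: attend only along edges
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)   # softmax over neighbors
    return alpha @ H
```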

A Note on Over-Smoothing for Graph Neural Networks

It is shown that when the weight matrices satisfy conditions determined by the spectrum of the augmented normalized Laplacian, the Dirichlet energy of the embeddings converges to zero, resulting in a loss of discriminative power.
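The phenomenon is easy to observe numerically; a toy check follows, assuming a contractive weight matrix and symmetric normalized propagation with self-loops (the 4-cycle graph and the specific W are illustrative, not the paper's exact conditions):

```python
import numpy as np

# With a contractive weight matrix, repeated propagation X <- A_hat @ X @ W
# drives the Dirichlet energy trace(X.T @ L_sym @ X) toward zero.
A = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]], float)
A_loop = A + np.eye(4)                        # augmented adjacency (self-loops)
d = A_loop.sum(1)
A_hat = A_loop / np.sqrt(np.outer(d, d))      # symmetric normalized propagation
L_sym = np.eye(4) - A_hat                     # augmented normalized Laplacian
X = np.random.default_rng(0).normal(size=(4, 2))
W = 0.5 * np.eye(2)                           # small singular values: contraction
for step in range(5):
    X = A_hat @ X @ W
    print(step, np.trace(X.T @ L_sym @ X))    # energy shrinks toward zero
```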

On the Bottleneck of Graph Neural Networks and its Practical Implications

It is shown that existing, extensively tuned GNN-based models suffer from over-squashing, and that breaking the bottleneck improves state-of-the-art results without any hyperparameter tuning or additional weights.
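The fix reported there is simple: keep the network unchanged except for the last layer, which aggregates over a complete graph ("fully adjacent") so distant nodes can exchange information in one hop. A minimal sketch, where the mean aggregation, tanh nonlinearity, and layer structure are illustrative choices:

```python
import numpy as np

def mean_aggregate(X, A):
    """Mean of neighbor features under adjacency A."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    return (A @ X) / deg

def gnn_with_fully_adjacent_last_layer(X, A, weights):
    """Message passing on the input graph for all layers except the last,
    which uses a complete graph to ease over-squashing."""
    n = A.shape[0]
    full = np.ones((n, n)) - np.eye(n)       # complete graph, no self-loops
    for i, W in enumerate(weights):
        adj = full if i == len(weights) - 1 else A
        X = np.tanh(mean_aggregate(X, adj) @ W)
    return X
```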

Simplifying Graph Convolutional Networks

This paper successively removes nonlinearities and collapses the weight matrices between consecutive layers, then theoretically analyzes the resulting linear model and shows that it corresponds to a fixed low-pass filter followed by a linear classifier.
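Concretely, the collapsed model precomputes the filtered features once and trains only a linear classifier on top; a small sketch, where K and the self-loop normalization follow the standard GCN convention:

```python
import numpy as np

def sgc_features(X, A, K=2):
    """SGC-style preprocessing: with nonlinearities removed, a K-layer GCN
    collapses to S^K @ X, where S = D^-1/2 (A + I) D^-1/2 is a fixed
    low-pass filter; a plain linear classifier is then trained on the result."""
    A_loop = A + np.eye(A.shape[0])
    d = A_loop.sum(1)
    S = A_loop / np.sqrt(np.outer(d, d))
    for _ in range(K):
        X = S @ X                            # fixed low-pass filtering
    return X
```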

Geom-GCN: Geometric Graph Convolutional Networks

The proposed aggregation scheme is permutation-invariant and consists of three modules: node embedding, structural neighborhood, and bi-level aggregation. An implementation of the scheme in graph convolutional networks, termed Geom-GCN, performs transductive learning on graphs.
...