# Stability Properties of Graph Neural Networks

```bibtex
@article{Gama2020StabilityPO,
  title   = {Stability Properties of Graph Neural Networks},
  author  = {Fernando Gama and Joan Bruna and Alejandro Ribeiro},
  journal = {IEEE Transactions on Signal Processing},
  year    = {2020},
  volume  = {68},
  pages   = {5680-5695}
}
```

Graph neural networks (GNNs) have emerged as a powerful tool for nonlinear processing of graph signals, exhibiting success in recommender systems, power outage prediction, and motion planning, among others. GNNs consist of a cascade of layers, each of which applies a graph convolution, followed by a pointwise nonlinearity. In this work, we study the impact that changes in the underlying topology have on the output of the GNN. First, we show that GNNs are permutation equivariant, which implies…
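The layer structure described above (graph convolution followed by a pointwise nonlinearity) and the permutation-equivariance claim can be illustrated with a minimal numpy sketch. Function names, the filter taps `h`, and the use of the adjacency matrix as the shift operator are illustrative choices, not taken from the paper:

```python
import numpy as np

def graph_conv(S, x, h):
    # Polynomial graph filter: y = sum_k h[k] * S^k @ x
    y = np.zeros_like(x)
    Sk = np.eye(S.shape[0])
    for hk in h:
        y += hk * (Sk @ x)
        Sk = Sk @ S
    return y

def gnn_layer(S, x, h):
    # One GNN layer: graph convolution followed by a pointwise ReLU
    return np.maximum(graph_conv(S, x, h), 0.0)

# Numerical check of permutation equivariance on a random graph
rng = np.random.default_rng(0)
n = 6
A = rng.integers(0, 2, (n, n))
S = np.triu(A, 1)
S = S + S.T                              # symmetric adjacency as shift operator
x = rng.standard_normal(n)
h = [0.5, 0.3, 0.2]                      # illustrative filter taps

P = np.eye(n)[rng.permutation(n)]        # permutation matrix
out = gnn_layer(S, x, h)
out_perm = gnn_layer(P.T @ S @ P, P.T @ x, h)
print(np.allclose(out_perm, P.T @ out))  # → True
```

Relabeling the nodes (conjugating `S` by `P` and permuting `x`) permutes the output in exactly the same way, because `(P.T @ S @ P)**k = P.T @ S**k @ P` and the ReLU acts entrywise.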

## 101 Citations

Stability of Graph Neural Networks to Relative Perturbations

- Computer Science · ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2020

It is proved that graph convolutions with integral Lipschitz filters lead to GNNs whose output change is bounded by the size of the relative change in the topology.
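The integral Lipschitz condition invoked in these stability results can be stated compactly. The following is a sketch using my own symbol choices ($h$ for the filter's frequency response, $C$ for the integral Lipschitz constant, $\varepsilon$ for the perturbation size), which may differ from the cited paper's notation:

```latex
% A filter with frequency response h(\lambda) is integral Lipschitz if
|\lambda\, h'(\lambda)| \le C \qquad \text{for all } \lambda .
% For a relative perturbation \hat{S} = S + ES + SE with \|E\| \le \varepsilon,
% the GNN output then changes by at most a bound of the form
\|\Phi(\hat{S}; x) - \Phi(S; x)\| \le \mathcal{O}(C\,\varepsilon)\,\|x\| .
```

Intuitively, such filters are allowed to vary freely at low graph frequencies but must flatten at high frequencies, which is what keeps the output change proportional to the size of the relative change in the topology.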

Graphs, Convolutions, and Neural Networks

- Computer Science · ArXiv
- 2020

The role of graph convolutional filters in GNNs is discussed and it is shown that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the graph.

Stability of Graph Convolutional Neural Networks to Stochastic Perturbations

- Computer Science · Signal Process.
- 2021

Graph-Adaptive Activation Functions for Graph Neural Networks

- Computer Science · 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP)
- 2020

This paper proposes activation functions for GNNs that not only incorporate the graph into the nonlinearity but are also distributable, and proves that the subclass of graph-adaptive max activation functions is Lipschitz stable to input perturbations.

From Graph Filters to Graph Neural Networks

- Computer Science
- 2020

The role of graph convolutional filters in GNNs is discussed and it is shown that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.

p-Laplacian Based Graph Neural Networks

- Computer Science · ArXiv
- 2021

A new p-Laplacian based GNN model, termed pGNN, is proposed, whose message passing mechanism is derived from a discrete regularization framework and can be theoretically explained as an approximation of a polynomial graph filter defined on the spectral domain of p-Laplacians.

Discriminability of Single-Layer Graph Neural Networks

- Computer Science · ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2021

A notion of discriminability tied to the stability of the architecture is defined; it is shown that GNNs are at least as discriminative as linear graph filter banks, and the signals that cannot be discriminated by either are characterized.

Convergence of Invariant Graph Networks

- Computer Science, Mathematics · ArXiv
- 2022

This paper first proves the stability of linear layers for general k-IGNs (invariant graph networks of order k) based on a novel interpretation of linear equivariant layers, and then obtains convergence for a subset of IGNs, denoted IGN-small, after edge probability estimation.

Stability of Neural Networks on Manifolds to Relative Perturbations

- Computer Science, Mathematics · ArXiv
- 2021

The stability properties of convolutional neural networks on manifolds are analyzed to understand the stability of GNNs on large graphs and it is observed that manifold neural networks exhibit a trade-off between stability and discriminability.

Robust Graph Neural Networks via Probabilistic Lipschitz Constraints

- Computer Science · L4DC
- 2022

Motivated by controlling the Lipschitz constant of GNN filters with respect to the node attributes, this work proposes to constrain the frequency response of the GNN's filter banks, and extends the formulation to the dynamic graph setting via a continuous frequency response constraint.

## References

Showing 1-10 of 69 references.

From Graph Filters to Graph Neural Networks

- Computer Science
- 2020

The role of graph convolutional filters in GNNs is discussed and it is shown that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.

How Powerful are Graph Neural Networks?

- Computer Science · ICLR
- 2019

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.

Invariance-Preserving Localized Activation Functions for Graph Neural Networks

- Computer Science · IEEE Transactions on Signal Processing
- 2020

This paper considers the design of trainable nonlinear activation functions that account for the structure of the graph by using graph median filters and graph max filters, which mimic linear graph convolutions and are shown to retain the permutation invariance of GNNs.
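The graph max filter idea can be sketched in a few lines of numpy: each node replaces its value with the maximum over its one-hop neighborhood, a localized, permutation-invariant operation. The function name and the self-inclusion convention below are my own assumptions, not the paper's exact definition:

```python
import numpy as np

def graph_max_activation(S, x):
    # Localized max activation: each node takes the max over its
    # one-hop neighbors (including its own value). Because the
    # neighborhood is defined by S alone, the operation commutes
    # with node relabelings, i.e. it is permutation equivariant.
    n = S.shape[0]
    y = np.empty(n)
    for i in range(n):
        nbrs = np.flatnonzero(S[i])
        y[i] = max(x[i], x[nbrs].max()) if nbrs.size else x[i]
    return y

# Path graph on three nodes: 0 - 1 - 2
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
x = np.array([1.0, -2.0, 3.0])
print(graph_max_activation(A, x))  # → [1. 3. 3.]
```

Node 1 sees both endpoints and picks up the 3.0; node 0 only sees node 1, so its own 1.0 survives. A graph median filter would replace `max` with a median over the same neighborhood.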

Convolutional Graph Neural Networks

- Computer Science · 2019 53rd Asilomar Conference on Signals, Systems, and Computers
- 2019

It is shown that the graph convolution can be interpreted as either a diffusion or an aggregation operation, leading to two different generalizations, termed selection and aggregation GNNs.

EdgeNets: Edge Varying Graph Neural Networks

- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2021

A general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNets is put forth, and it is shown that GATs are GCNNs on a graph learned from the features, which opens the door to alternative attention mechanisms with improved discriminatory power.

Network Topology Inference from Spectral Templates

- Computer Science · IEEE Transactions on Signal and Information Processing over Networks
- 2017

The novel idea is to find a graph shift that, while consistent with the provided spectral information, endows the network with desired properties such as sparsity; efficient inference algorithms are developed from provably tight convex relaxations of natural nonconvex criteria.

Universal Invariant and Equivariant Graph Neural Networks

- Mathematics, Computer Science · NeurIPS
- 2019

The results show that a GNN defined by a single set of parameters can approximate uniformly well a function defined on graphs of varying size.

Convolutional Neural Network Architectures for Signals Supported on Graphs

- Computer Science · IEEE Transactions on Signal Processing
- 2019

Two architectures that generalize convolutional neural networks (CNNs) to the processing of signals supported on graphs are introduced; multinode aggregation GNNs are found to be consistently the best-performing GNN architecture on large-scale graphs.

Power up! Robust Graph Convolutional Network against Evasion Attacks based on Graph Powering

- Computer Science · ArXiv
- 2019

A robust learning paradigm is proposed in which the network is trained on a family of "smoothed" graphs spanning a spatial and spectral range for generalizability, and the new operator is used in place of the classical Laplacian to construct an architecture with improved spectral robustness, expressivity, and interpretability.

On the Transferability of Spectral Graph Filters

- Computer Science · 2019 13th International Conference on Sampling Theory and Applications (SampTA)
- 2019

It is proved that graph spectral filters are transferable by combining stability with the known property of equivariance; a space of filters, called the Cayley smoothness space, is introduced that contains the filters of state-of-the-art spectral filtering methods and whose filters can approximate any generic spectral filter.