Stability Properties of Graph Neural Networks

@article{Gama2020StabilityPO,
  title={Stability Properties of Graph Neural Networks},
  author={Fernando Gama and Joan Bruna and Alejandro Ribeiro},
  journal={IEEE Transactions on Signal Processing},
  year={2020},
  volume={68},
  pages={5680-5695}
}
Graph neural networks (GNNs) have emerged as a powerful tool for nonlinear processing of graph signals, exhibiting success in recommender systems, power outage prediction, and motion planning, among others. GNNs consist of a cascade of layers, each of which applies a graph convolution, followed by a pointwise nonlinearity. In this work, we study the impact that changes in the underlying topology have on the output of the GNN. First, we show that GNNs are permutation equivariant, which implies… 
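
To make the architecture described in the abstract concrete, here is a minimal sketch (not the authors' code; all names and filter taps are hypothetical) of a single GNN layer as a polynomial graph filter followed by a pointwise ReLU, together with a numerical check of the permutation equivariance property Phi(P^T S P; P^T x) = P^T Phi(S; x):

```python
import numpy as np

def graph_convolution(S, x, h):
    """Polynomial graph filter: y = h[0]*x + h[1]*(S @ x) + ... + h[K]*(S^K @ x)."""
    y = np.zeros_like(x, dtype=float)
    z = x.astype(float)
    for hk in h:
        y += hk * z
        z = S @ z  # diffuse the signal one more hop
    return y

def gnn_layer(S, x, h):
    """One GNN layer: graph convolution followed by a pointwise ReLU."""
    return np.maximum(graph_convolution(S, x, h), 0.0)

# Numerical check of permutation equivariance.
rng = np.random.default_rng(0)
n = 6
S = rng.random((n, n))
S = (S + S.T) / 2                      # symmetric graph shift operator
x = rng.standard_normal(n)
h = [0.5, 0.3, 0.2]                    # hypothetical filter taps

P = np.eye(n)[rng.permutation(n)]      # permutation matrix
assert np.allclose(gnn_layer(P.T @ S @ P, P.T @ x, h),
                   P.T @ gnn_layer(S, x, h))
```
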

Citations

Stability of Graph Neural Networks to Relative Perturbations
TLDR
It is proved that graph convolutions with integral Lipschitz filters lead to GNNs whose output change is bounded by the size of the relative change in the topology.
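
For reference, a schematic statement of the integral Lipschitz condition and the resulting relative-perturbation bound (notation and constants are sketched, not quoted from the paper):

```latex
% A graph filter and its frequency response,
H(S) = \sum_{k=0}^{K} h_k S^k, \qquad h(\lambda) = \sum_{k=0}^{K} h_k \lambda^k .
% The integral Lipschitz condition on the response,
|\lambda\, h'(\lambda)| \le C \quad \text{for all } \lambda .
% For a relative perturbation \hat{S} = S + (ES + SE) with \|E\| \le \varepsilon,
% the GNN output then satisfies a bound of the form
\|\Phi(\hat{S}; x) - \Phi(S; x)\| \le \mathcal{O}(C\,\varepsilon)\,\|x\| .
```
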
Graphs, Convolutions, and Neural Networks
TLDR
The role of graph convolutional filters in GNNs is discussed and it is shown that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the graph.
Stability of Graph Convolutional Neural Networks to Stochastic Perturbations
Graph-Adaptive Activation Functions for Graph Neural Networks
TLDR
This paper proposes activation functions for GNNs that not only incorporate the graph structure into the nonlinearity but are also distributable, and proves that the subclass of graph-adaptive max activation functions is Lipschitz stable to input perturbations.
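
A minimal sketch of what such a graph-adaptive max activation could look like (the function name and the closed one-hop neighborhood convention are assumptions, not the paper's code):

```python
import numpy as np

def graph_max_activation(A, x):
    """Hypothetical graph-adaptive max activation: each node outputs the
    maximum of the signal over its closed one-hop neighborhood, so the
    nonlinearity depends on the topology and needs only local exchanges
    (hence distributable)."""
    out = np.empty_like(x, dtype=float)
    for i in range(len(x)):
        neigh = np.flatnonzero(A[i])            # one-hop neighbors of node i
        out[i] = max(x[i], x[neigh].max()) if neigh.size else x[i]
    return out
```
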
From Graph Filters to Graph Neural Networks
TLDR
The role of graph convolutional filters in GNNs is discussed and it is shown that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
p-Laplacian Based Graph Neural Networks
TLDR
A new p-Laplacian based GNN model, termed pGNN, is proposed, whose message passing mechanism is derived from a discrete regularization framework and can be theoretically explained as an approximation of a polynomial graph filter defined on the spectral domain of p-Laplacians.
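
For context, one common definition of the graph p-Laplacian acting on a signal f (the paper's normalization may differ; p = 2 recovers the standard graph Laplacian):

```latex
(\Delta_p f)(i) = \sum_{j \sim i} w_{ij}\, |f(i) - f(j)|^{p-2} \, (f(i) - f(j))
```
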
Discriminability of Single-Layer Graph Neural Networks
TLDR
A notion of discriminability tied to the stability of the architecture is defined; it is shown that GNNs are at least as discriminative as linear graph filter banks, and the signals that cannot be discriminated by either are characterized.
Convergence of Invariant Graph Networks
TLDR
This paper first proves the stability of linear layers for general k-IGN (of order k) based on a novel interpretation of linear equivariant layers, and obtains the convergence of a subset of IGNs, denoted as IGN-small, after the edge probability estimation.
Stability of Neural Networks on Manifolds to Relative Perturbations
TLDR
The stability properties of convolutional neural networks on manifolds are analyzed to understand the stability of GNNs on large graphs and it is observed that manifold neural networks exhibit a trade-off between stability and discriminability.
Robust Graph Neural Networks via Probabilistic Lipschitz Constraints
TLDR
Motivated by controlling the Lipschitz constant of GNN filters with respect to the node attributes, this work proposes to constrain the frequency response of the GNN’s filter banks, and extends this formulation to the dynamic graph setting using a continuous frequency response constraint.
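
For reference, the standard graph-signal-processing view of a filter bank's frequency response that such constraints act on (schematic, not the paper's exact formulation):

```latex
% For a diagonalizable shift S = V \Lambda V^{\mathsf{T}}, a filter acts spectrally:
H(S) = \sum_{k=0}^{K} h_k S^k
     = V \Big( \sum_{k=0}^{K} h_k \Lambda^k \Big) V^{\mathsf{T}} ,
% so a constraint on the scalar response \hat{h}(\lambda) = \sum_k h_k \lambda^k
% (e.g., a bound on its slope over the spectrum) controls the filter's
% Lipschitz behavior with respect to the node attributes.
```
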

References

Showing 1-10 of 69 references
From Graph Filters to Graph Neural Networks
TLDR
The role of graph convolutional filters in GNNs is discussed and it is shown that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
How Powerful are Graph Neural Networks?
TLDR
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
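
The "simple architecture" referred to is the Graph Isomorphism Network (GIN); a minimal sketch of its layer update (weight matrices and dimensions are hypothetical):

```python
import numpy as np

def gin_layer(A, X, W1, W2, eps=0.0):
    """GIN-style update: sum-aggregate neighbor features, reweight the
    node's own features by (1 + eps), then apply a small MLP."""
    agg = (1.0 + eps) * X + A @ X        # injective sum aggregation
    return np.maximum(agg @ W1, 0.0) @ W2
```
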
Invariance-Preserving Localized Activation Functions for Graph Neural Networks
TLDR
This paper considers the design of trainable nonlinear activation functions that take into consideration the structure of the graph by using graph median filters and graph max filters, which mimic linear graph convolutions and are shown to retain the permutation invariance of GNNs.
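
A minimal sketch of a graph median filter used as a localized activation, under the assumption that the neighborhood is the closed one-hop neighborhood (not the paper's code):

```python
import numpy as np

def graph_median_activation(A, x):
    """Hypothetical graph median activation: each node outputs the median
    of the signal over its closed neighborhood, which is permutation
    invariant by construction."""
    return np.array([np.median(np.append(x[np.flatnonzero(A[i])], x[i]))
                     for i in range(len(x))])
```
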
Convolutional Graph Neural Networks
TLDR
It is shown that the graph convolution can be interpreted as either a diffusion or an aggregation operation, leading to two different generalizations, termed selection and aggregation GNNs.
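
A short sketch of the two interpretations the summary mentions (hypothetical names; S and x as in the main paper's setup):

```python
import numpy as np

def diffusion_sequence(S, x, K):
    """Diffusion view: the filter combines successively diffused copies of
    the whole signal, x, Sx, ..., S^K x (selection GNNs build on this)."""
    z = np.asarray(x, dtype=float)
    seq = [z]
    for _ in range(K):
        z = S @ z
        seq.append(z)
    return seq

def aggregation_sequence(S, x, K, node):
    """Aggregation view: one node collects [x_i, (Sx)_i, ..., (S^K x)_i],
    a regular time-like sequence that a standard CNN can then process
    (aggregation GNNs build on this)."""
    return np.array([z[node] for z in diffusion_sequence(S, x, K)])
```
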
EdgeNets: Edge Varying Graph Neural Networks
TLDR
A general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of the EdgeNet is put forth, and it is shown that GATs are GCNNs on a graph that is learned from the features, which opens the door to developing alternative attention mechanisms with improved discriminatory power.
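
A schematic sketch of the edge-varying idea as described in the summary (the exact recursion and masking conventions vary across papers; this is not the paper's code):

```python
import numpy as np

def edge_varying_filter(x, Phis):
    """Schematic EdgeNet-style filter: the scalar taps h_k of a graph
    convolution are replaced by learnable matrices Phi_k restricted to the
    sparsity pattern of the graph, and the output accumulates the running
    products x, Phi_1 x, Phi_2 Phi_1 x, ...  (masking to the graph support
    is assumed to be done when the Phi_k are built)."""
    z = np.asarray(x, dtype=float)
    y = z.copy()                         # k = 0 (identity) term
    for Phi in Phis:
        z = Phi @ z                      # apply the k-th edge-varying tap
        y += z
    return y
```
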
Network Topology Inference from Spectral Templates
TLDR
The novel idea is to find a graph shift that, while being consistent with the provided spectral information, endows the network with certain desired properties such as sparsity, and develops efficient inference algorithms stemming from provably tight convex relaxations of natural nonconvex criteria.
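
A schematic version of the spectral-templates formulation suggested by the summary (the exact constraint set is an assumption):

```latex
% Given candidate eigenvectors v_1, ..., v_n of the sought shift operator,
% find a sparse S consistent with them,
\min_{S,\ \lambda_1, \dots, \lambda_n}\ \|S\|_1
\quad \text{s.t.} \quad S = \sum_{k=1}^{n} \lambda_k v_k v_k^{\mathsf{T}},
\quad S \in \mathcal{S},
% where \mathcal{S} encodes the desired structural properties (e.g., zero
% diagonal, nonnegative off-diagonal entries) and the l1 norm is the convex
% surrogate for sparsity.
```
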
Universal Invariant and Equivariant Graph Neural Networks
TLDR
The results show that a GNN defined by a single set of parameters can approximate uniformly well a function defined on graphs of varying size.
Convolutional Neural Network Architectures for Signals Supported on Graphs
TLDR
Two architectures that generalize convolutional neural networks (CNNs) for the processing of signals supported on graphs are introduced, and multinode aggregation GNNs are shown to be consistently the best-performing architecture for operation on large-scale graphs.
Power up! Robust Graph Convolutional Network against Evasion Attacks based on Graph Powering
TLDR
A robust learning paradigm is proposed, where the network is trained on a family of "smoothed" graphs that span a spatial and spectral range for generalizability, and the new graph-powering operator is used in place of the classical Laplacian to construct an architecture with improved spectral robustness, expressivity, and interpretability.
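
A minimal sketch of one reading of graph powering (the precise operator used in the paper may differ):

```python
import numpy as np

def graph_power_adjacency(A, r):
    """Adjacency of the r-th graph power: connect two distinct nodes
    whenever their shortest-path distance in the original graph is at
    most r, densifying the graph to smooth its spectrum."""
    n = A.shape[0]
    B = (A != 0).astype(int) + np.eye(n, dtype=int)   # reach in <= 1 step
    R = np.linalg.matrix_power(B, r)                  # walks of length <= r
    return ((R > 0) & ~np.eye(n, dtype=bool)).astype(float)
```
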
On the Transferability of Spectral Graph Filters
TLDR
It is proved that graph spectral filters are transferable by combining stability with the known property of equivariance, and a space of filters, called the Cayley smoothness space, is introduced that contains the filters of state-of-the-art spectral filtering methods and whose filters can approximate any generic spectral filter.