On The Stability of Graph Convolutional Neural Networks Under Edge Rewiring

@article{Kenlay2021OnTS,
  title={On The Stability of Graph Convolutional Neural Networks Under Edge Rewiring},
  author={Henry Kenlay and Dorina Thanou and Xiaowen Dong},
  journal={ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2021},
  pages={8513-8517}
}
  • Henry Kenlay, Dorina Thanou, Xiaowen Dong
  • Published 26 October 2020
  • Computer Science
  • ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Graph neural networks are experiencing a surge of popularity within the machine learning community due to their ability to adapt to non-Euclidean domains and instil inductive biases. Despite this, their stability, i.e., their robustness to small perturbations in the input, is not yet well understood. Although there exist some results showing the stability of graph neural networks, most take the form of an upper bound on the magnitude of change due to a perturbation in the graph topology…
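
To make the stability question concrete, here is a minimal numerical sketch (my own illustration, not the paper's experimental code; the graph model, filter taps, and choice of rewired edge are all arbitrary assumptions). It rewires a single edge of a random graph and measures how much the output of a fixed polynomial spectral filter changes:

import numpy as np

rng = np.random.default_rng(0)

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2} for an undirected 0/1 adjacency matrix A."""
    d = A.sum(axis=1)
    with np.errstate(divide="ignore"):
        d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def poly_filter(L, x, coeffs):
    """Apply the polynomial filter sum_k coeffs[k] * L^k to the signal x."""
    out, Lkx = np.zeros_like(x), x.copy()
    for c in coeffs:
        out += c * Lkx
        Lkx = L @ Lkx
    return out

# Erdos-Renyi graph on n nodes (assumed example graph; p chosen arbitrarily).
n, p = 50, 0.2
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1); A = A + A.T

# Rewire one edge: delete an existing edge, add a currently absent one.
edges = np.argwhere(np.triu(A, 1) == 1)
non_edges = np.argwhere(np.triu(1 - A, 1) == 1)
(u, v), (s, t) = edges[0], non_edges[0]
A_pert = A.copy()
A_pert[u, v] = A_pert[v, u] = 0.0   # edge deletion
A_pert[s, t] = A_pert[t, s] = 1.0   # edge addition

x = rng.standard_normal(n)
coeffs = [1.0, -0.5, 0.25]          # arbitrary filter taps
y  = poly_filter(normalized_laplacian(A), x, coeffs)
yp = poly_filter(normalized_laplacian(A_pert), x, coeffs)
print("relative output change:", np.linalg.norm(y - yp) / np.linalg.norm(y))

Bounding this relative change in terms of properties of the perturbation is exactly what stability results of the kind listed below aim to do.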

Interpretable Stability Bounds for Spectral Graph Filters

TLDR
This paper studies filter stability and provides a novel and interpretable upper bound on the change of filter output, where the bound is expressed in terms of the endpoint degrees of the deleted and newly added edges, as well as the spatial proximity of those edges.
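
To see why endpoint degrees matter, the hedged sketch below (my own illustration; the cited bound itself is more refined and also accounts for edge proximity) deletes an edge between two low-degree endpoints and an edge between two high-degree endpoints of the same graph, and compares the resulting change in the normalized Laplacian:

import numpy as np

rng = np.random.default_rng(1)

def normalized_laplacian(A):
    d = A.sum(axis=1)
    with np.errstate(divide="ignore"):
        s = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(len(A)) - s[:, None] * A * s[None, :]

def laplacian_change_after_deletion(A, u, v):
    Ap = A.copy()
    Ap[u, v] = Ap[v, u] = 0.0
    return np.linalg.norm(normalized_laplacian(A) - normalized_laplacian(Ap), 2)

# Random graph with heterogeneous degrees (illustrative construction).
n = 60
P = np.outer(np.linspace(0.05, 0.6, n), np.linspace(0.05, 0.6, n))
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T

deg = A.sum(axis=1)
edges = np.argwhere(np.triu(A, 1) == 1)
prod = deg[edges[:, 0]] * deg[edges[:, 1]]
lo_u, lo_v = edges[prod.argmin()]   # edge with low-degree endpoints
hi_u, hi_v = edges[prod.argmax()]   # edge with high-degree endpoints

print("||dL|| deleting low-degree edge :", laplacian_change_after_deletion(A, lo_u, lo_v))
print("||dL|| deleting high-degree edge:", laplacian_change_after_deletion(A, hi_u, hi_v))
# Typically the low-degree deletion perturbs L far more, matching the
# intuition that such bounds scale inversely with endpoint degrees.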

Graph Neural Network Sensitivity Under Probabilistic Error Model

TLDR
It is proved that the adjacency matrix under the error model is bounded by a function of graph size and error probability, and the upper bound for a normalized adjacency matrix with self-loops added is specified analytically.
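
A minimal simulation of a probabilistic error model of this kind (an assumed illustration; the cited paper states its model and bound formally) flips each node pair independently with probability eps and tracks the spectral norm of the adjacency perturbation as the graph grows:

import numpy as np

rng = np.random.default_rng(2)

def perturbation_norm(n, p_edge, eps, trials=20):
    """Average spectral norm of A_err - A when every node pair is
    flipped independently with probability eps (illustrative model)."""
    norms = []
    for _ in range(trials):
        A = np.triu((rng.random((n, n)) < p_edge).astype(float), 1)
        A = A + A.T
        flip = np.triu((rng.random((n, n)) < eps).astype(float), 1)
        flip = flip + flip.T
        A_err = np.abs(A - flip)          # XOR of edges with the error mask
        norms.append(np.linalg.norm(A_err - A, 2))
    return np.mean(norms)

for n in (50, 100, 200):
    print(n, perturbation_norm(n, p_edge=0.1, eps=0.05))
# The perturbation norm grows with both graph size and error probability,
# consistent with a bound expressed as a function of the two.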

Stability of Aggregation Graph Neural Networks

TLDR
Stability bounds for the mapping operator associated with a generic Agg-GNN are derived, conditions under which such operators can be stable to deformations are specified, and it is proved that the stability bounds are determined by the properties of the first layer of the CNN that acts on each node.
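
In an aggregation GNN, each node collects the sequence of successive graph shifts of the signal and processes it with a standard CNN. The sketch below (a simplified, hedged rendition with arbitrary sizes and one particular choice of shift operator) builds that per-node aggregation sequence and applies a shared first convolutional layer, the layer whose properties the bound depends on:

import numpy as np

rng = np.random.default_rng(3)

def aggregation_sequence(S, x, K):
    """Stack [x, Sx, S^2 x, ..., S^{K-1} x]; row i is node i's local
    time series, which a standard 1-D CNN can then process."""
    seq, z = [x], x
    for _ in range(K - 1):
        z = S @ z
        seq.append(z)
    return np.stack(seq, axis=1)          # shape (n_nodes, K)

n, K = 30, 8
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T
S = A / max(A.sum(axis=1).max(), 1.0)     # a normalized shift operator (one choice)
x = rng.standard_normal(n)

Z = aggregation_sequence(S, x, K)
# First CNN layer acting on each node's sequence: a shared 1-D convolution.
w = rng.standard_normal(3)                # filter taps of the first layer
feat = np.stack([np.convolve(Z[i], w, mode="valid") for i in range(n)])
print(Z.shape, feat.shape)                # (30, 8) -> (30, 6)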

Graph-Time Convolutional Neural Networks: Architecture and Theoretical Analysis

TLDR
The stability result shows that GTCNNs are stable to spatial perturbations, but that there is an implicit trade-off between discriminability and robustness; i.e., the more complex the model, the less stable it is.

Learning Stochastic Graph Neural Networks with Constrained Variance

TLDR
This work theoretically analyzes the variance of the SGNN output and identifies a trade-off between stochastic robustness and discrimination power; it further analyzes the duality gap of the variance-constrained optimization problem and the convergence behavior of the primal-dual learning procedure.

On the Stability of Low Pass Graph Filter with a Large Number of Edge Rewires

TLDR
This work departs from the previous analysis and proves a bound on the stability of graph filters that relies on the filter's frequency response, and shows that for stochastic block model graphs, the graph filter distance converges to a small constant as the number of nodes approaches infinity.
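
A graph filter is low pass when its frequency response decays across the Laplacian spectrum. The sketch below (an illustration with an assumed response h(lambda) = 1/(1 + lambda), not the cited paper's construction) applies such a filter through the eigendecomposition, the domain in which a frequency-response-based stability argument operates:

import numpy as np

rng = np.random.default_rng(4)

def normalized_laplacian(A):
    d = A.sum(axis=1)
    with np.errstate(divide="ignore"):
        s = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(len(A)) - s[:, None] * A * s[None, :]

n = 40
A = (rng.random((n, n)) < 0.25).astype(float)
A = np.triu(A, 1); A = A + A.T
L = normalized_laplacian(A)

# Spectral decomposition: L = U diag(lam) U^T, with lam in [0, 2].
lam, U = np.linalg.eigh(L)
h = 1.0 / (1.0 + lam)                 # a low-pass frequency response (assumed)
x = rng.standard_normal(n)
y = U @ (h * (U.T @ x))               # filter output h(L) x
print("input energy :", np.linalg.norm(x))
print("output energy:", np.linalg.norm(y))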

Node-Variant Graph Filters in Graph Neural Networks

Graph neural networks (GNNs) have been successfully employed in a myriad of applications involving graph signals. Theoretical findings establish that GNNs use nonlinear activation functions to create…
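
In the graph signal processing literature, a node-variant graph filter assigns each node its own filter tap per shift order, H = sum_k diag(phi_k) S^k, recovering the classical filter when all nodes share the same taps. The sketch below is a hedged illustration of that definition (shift operator, tap values, and sizes are assumptions):

import numpy as np

rng = np.random.default_rng(5)

def node_variant_filter(S, x, Phi):
    """y = sum_k diag(Phi[:, k]) S^k x, with one tap per node per order."""
    y, Skx = np.zeros_like(x), x.copy()
    for k in range(Phi.shape[1]):
        y += Phi[:, k] * Skx
        Skx = S @ Skx
    return y

n, K = 25, 4
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1); A = A + A.T
S = A / max(np.abs(np.linalg.eigvalsh(A)).max(), 1.0)  # scaled adjacency shift
x = rng.standard_normal(n)

Phi = rng.standard_normal((n, K))        # node-variant taps
y_nv = node_variant_filter(S, x, Phi)
# Node-invariant special case: every node shares the same taps.
y_classic = node_variant_filter(S, x, np.tile(rng.standard_normal(K), (n, 1)))
print(y_nv.shape, y_classic.shape)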

Transferability of Graph Neural Networks: an Extended Graphon Approach

TLDR
A model of transferability based on graphon analysis is considered: transferability is proved for graphs that approximate unbounded graphon shift operators, non-asymptotic approximation results are obtained, and linear stability of GCNNs is established.

References

Showing 1–10 of 28 references

Stability Properties of Graph Neural Networks

TLDR
This work proves that graph convolutions with integral Lipschitz filters, in combination with the frequency mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies.
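
For context, the integral Lipschitz condition referenced here is commonly stated as follows (standard form from the graph filter stability literature; the exact constants and formulation in the cited paper may differ):

|h(\lambda_2) - h(\lambda_1)| \;\le\; C \, \frac{|\lambda_2 - \lambda_1|}{|\lambda_1 + \lambda_2| / 2},
\qquad \text{which for differentiable } h \text{ implies } |\lambda\, h'(\lambda)| \le C .

Such filters may vary quickly near lambda = 0 but must flatten at high frequencies, so it is the frequency mixing of the nonlinearities that recovers discriminability there.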

Stability and Generalization of Graph Convolutional Neural Networks

TLDR
This paper is the first to study stability bounds on graph learning in a semi-supervised setting and to derive generalization bounds for GCNN models; it shows that the algorithmic stability of a GCNN model depends on the largest absolute eigenvalue of its graph convolution filter.

CayleyNets: Graph Convolutional Neural Networks With Complex Rational Spectral Filters

TLDR
A new spectral-domain convolutional architecture for deep learning on graphs is proposed, built on a new class of parametric rational complex functions (Cayley polynomials) that allow spectral filters specializing in frequency bands of interest to be computed efficiently on graphs.
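
A Cayley filter of order r with spectral zoom h has the response g(lambda) = c_0 + 2 Re{ sum_{j=1}^r c_j ((h lambda - i)/(h lambda + i))^j }. The sketch below evaluates that response directly on a set of eigenvalues (a hedged illustration with random coefficients; the CayleyNets implementation avoids explicit eigendecomposition and instead uses iterative approximate inversion):

import numpy as np

rng = np.random.default_rng(6)

def cayley_response(lam, c0, c, h):
    """g(lam) = c0 + 2 Re( sum_j c[j] * ((h*lam - i)/(h*lam + i))**(j+1) )."""
    z = (h * lam - 1j) / (h * lam + 1j)   # Cayley transform of the spectrum
    g = np.full_like(lam, c0, dtype=float)
    zj = np.ones_like(lam, dtype=complex)
    for cj in c:
        zj = zj * z
        g += 2.0 * (cj * zj).real
    return g

lam = np.linspace(0.0, 2.0, 5)            # e.g. normalized-Laplacian eigenvalues
c0 = 0.5                                  # arbitrary coefficients for illustration
c = rng.standard_normal(3) + 1j * rng.standard_normal(3)
print(cayley_response(lam, c0, c, h=2.0))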

On The Stability of Polynomial Spectral Graph Filters

TLDR
This work first proves that polynomial graph filters are stable with respect to the change in the normalised graph Laplacian matrix, and shows empirically that properties of a structural perturbation, specifically the relative locality of the edges removed in a binary graph, affect the change of the normalised graph Laplacian.
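
Polynomial spectral filters have the form f(L) = sum_k theta_k L^k, so their stability reduces to how the polynomial propagates a perturbation of L. A hedged sketch (arbitrary coefficients and an arbitrary perturbation) comparing the change in the filter matrix against the change in the normalised Laplacian after deleting a few edges:

import numpy as np

rng = np.random.default_rng(7)

def normalized_laplacian(A):
    d = A.sum(axis=1)
    with np.errstate(divide="ignore"):
        s = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(len(A)) - s[:, None] * A * s[None, :]

def poly_filter_matrix(L, coeffs):
    """f(L) = sum_k coeffs[k] * L^k."""
    F, Lk = np.zeros_like(L), np.eye(len(L))
    for c in coeffs:
        F += c * Lk
        Lk = Lk @ L
    return F

n = 40
A = (rng.random((n, n)) < 0.25).astype(float)
A = np.triu(A, 1); A = A + A.T

A_pert = A.copy()
for u, v in np.argwhere(np.triu(A, 1) == 1)[:5]:   # remove five edges
    A_pert[u, v] = A_pert[v, u] = 0.0

L, Lp = normalized_laplacian(A), normalized_laplacian(A_pert)
coeffs = [1.0, -0.8, 0.3]
dF = np.linalg.norm(poly_filter_matrix(L, coeffs) - poly_filter_matrix(Lp, coeffs), 2)
print("||L - L'|| =", np.linalg.norm(L - Lp, 2), " ||f(L) - f(L')|| =", dF)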

Attacking Graph Convolutional Networks via Rewiring

TLDR
This paper proposes a graph rewiring operation that affects the graph in a less noticeable way than adding or deleting edges, and analyzes how the perturbation it generates to the graph structure affects the output of the target model.
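
In its simplest form, a rewiring removes an existing edge (u, v) and adds an edge (u, w), which preserves both the total edge count and the degree of u, part of why the perturbation is less noticeable. A minimal sketch of that operation (the attack paper constrains the choice of w, which is not reproduced here):

import numpy as np

def rewire(A, u, v, w):
    """Remove edge (u, v) and add edge (u, w) in an undirected 0/1
    adjacency matrix (illustrative; attack papers constrain w further)."""
    assert A[u, v] == 1 and A[u, w] == 0 and w != v
    A = A.copy()
    A[u, v] = A[v, u] = 0.0
    A[u, w] = A[w, u] = 1.0
    return A

# Toy 4-node path graph 0-1-2-3; rewire (1, 2) into (1, 3).
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
A_new = rewire(A, 1, 2, 3)
print(A_new.astype(int))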

Simplifying Graph Convolutional Networks

TLDR
This paper successively removes nonlinearities and collapses weight matrices between consecutive layers, then theoretically analyzes the resulting linear model and shows that it corresponds to a fixed low-pass filter followed by a linear classifier.
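
The collapsed model, usually called SGC, is commonly written Y = softmax(S^K X W) with S the self-loop-normalized adjacency. A minimal forward pass (random weights standing in for trained ones; the softmax is omitted since it does not change the predicted class):

import numpy as np

rng = np.random.default_rng(8)

def sgc_logits(A, X, W, K):
    """Simplified GCN: precompute S^K X with S = D~^{-1/2} (A+I) D~^{-1/2},
    then apply a single linear map (the collapsed weight matrix) W."""
    A_tilde = A + np.eye(len(A))                  # add self-loops
    s = 1.0 / np.sqrt(A_tilde.sum(axis=1))        # degrees >= 1 due to self-loops
    S = s[:, None] * A_tilde * s[None, :]
    for _ in range(K):                            # fixed low-pass filtering
        X = S @ X
    return X @ W                                  # linear classifier

n, f, classes = 20, 5, 3
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1); A = A + A.T
X = rng.standard_normal((n, f))
W = rng.standard_normal((f, classes))             # stands in for trained weights
print(sgc_logits(A, X, W, K=2).shape)             # (20, 3)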

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

TLDR
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
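
These fast localized filters are Chebyshev polynomial filters, computed with the recursion T_0 x = x, T_1 x = L_hat x, T_k x = 2 L_hat T_{k-1} x - T_{k-2} x on a Laplacian rescaled to [-1, 1]. A hedged sketch with random coefficients and dense matrices (the paper's implementation uses sparse operations):

import numpy as np

rng = np.random.default_rng(9)

def cheb_filter(L, x, theta):
    """y = sum_k theta[k] T_k(L_hat) x via the Chebyshev recursion,
    where L_hat = 2 L / lambda_max - I rescales the spectrum to [-1, 1]."""
    lam_max = np.linalg.eigvalsh(L).max()
    L_hat = 2.0 * L / lam_max - np.eye(len(L))
    Tkm2, Tkm1 = x, L_hat @ x
    y = theta[0] * Tkm2 + theta[1] * Tkm1
    for k in range(2, len(theta)):
        Tk = 2.0 * (L_hat @ Tkm1) - Tkm2
        y += theta[k] * Tk
        Tkm2, Tkm1 = Tkm1, Tk
    return y

n = 30
A = (rng.random((n, n)) < 0.25).astype(float)
A = np.triu(A, 1); A = A + A.T
L = np.diag(A.sum(axis=1)) - A                    # combinatorial Laplacian
x = rng.standard_normal(n)
print(cheb_filter(L, x, theta=rng.standard_normal(4)).shape)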

A Comprehensive Survey on Graph Neural Networks

TLDR
This article provides a comprehensive overview of graph neural networks (GNNs) in data mining and machine learning fields and proposes a new taxonomy to divide the state-of-the-art GNNs into four categories, namely recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.

Graph Neural Networks With Convolutional ARMA Filters

TLDR
A novel graph convolutional layer inspired by the auto-regressive moving average (ARMA) filter is proposed that provides a more flexible frequency response, is more robust to noise, and better captures the global graph structure.
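
The ARMA idea is to realize a rational frequency response, e.g. b / (1 - a*lambda) at first order, by iterating a cheap recursion instead of inverting a matrix. A hedged linear sketch (the actual layer interleaves trained weight matrices and nonlinearities, omitted here):

import numpy as np

rng = np.random.default_rng(10)

def arma1_filter(S, x, a, b, iters=50):
    """Iterate y <- a S y + b x, which converges to (I - a S)^{-1} (b x)
    when |a| * ||S|| < 1, i.e. a first-order rational (ARMA) filter."""
    y = b * x
    for _ in range(iters):
        y = a * (S @ y) + b * x
    return y

n = 30
A = (rng.random((n, n)) < 0.25).astype(float)
A = np.triu(A, 1); A = A + A.T
S = A / np.abs(np.linalg.eigvalsh(A)).max()       # shift scaled to spectral norm 1
x = rng.standard_normal(n)

y_iter  = arma1_filter(S, x, a=0.5, b=1.0)
y_exact = np.linalg.solve(np.eye(n) - 0.5 * S, x) # closed-form reference
print("iterative vs exact:", np.linalg.norm(y_iter - y_exact))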

Semi-Supervised Classification with Graph Convolutional Networks

TLDR
A scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks operating directly on graphs, which outperforms related methods by a significant margin.
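
The model's propagation rule is H^{(l+1)} = sigma(D~^{-1/2} (A + I) D~^{-1/2} H^{(l)} W^{(l)}). A minimal two-layer forward pass with random weights (sizes chosen arbitrarily):

import numpy as np

rng = np.random.default_rng(11)

def renormalized_adjacency(A):
    """A_hat = D~^{-1/2} (A + I) D~^{-1/2} (the 'renormalization trick')."""
    A_tilde = A + np.eye(len(A))
    s = 1.0 / np.sqrt(A_tilde.sum(axis=1))        # degrees >= 1 with self-loops
    return s[:, None] * A_tilde * s[None, :]

def gcn_layer(A_hat, H, W):
    """One GCN layer: ReLU(A_hat @ H @ W), with A_hat precomputed."""
    return np.maximum(A_hat @ H @ W, 0.0)

n, f_in, f_hid, classes = 20, 5, 8, 3
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1); A = A + A.T
A_hat = renormalized_adjacency(A)
X = rng.standard_normal((n, f_in))

H = gcn_layer(A_hat, X, rng.standard_normal((f_in, f_hid)))
logits = A_hat @ H @ rng.standard_normal((f_hid, classes))   # final linear layer
print(logits.shape)                                          # (20, 3)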