On The Stability of Graph Convolutional Neural Networks Under Edge Rewiring

  • Henry Kenlay, Dorina Thanou, Xiaowen Dong
  • Published 26 October 2020
  • Computer Science
  • ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Graph neural networks are experiencing a surge of popularity within the machine learning community due to their ability to adapt to non-Euclidean domains and instil inductive biases. Despite this, their stability, i.e., their robustness to small perturbations in the input, is not yet well understood. Although there exist some results showing the stability of graph neural networks, most take the form of an upper bound on the magnitude of change due to a perturbation in the graph topology…
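The kind of perturbation studied here can be made concrete with a small numerical sketch (the graph, filter coefficients, and signal below are illustrative, not taken from the paper): a polynomial filter of the normalised Laplacian is applied to the same signal before and after a single edge rewire, and the quantity that stability bounds of this kind control is the norm of the output difference.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalised Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def poly_filter(L, coeffs):
    """Polynomial graph filter h(L) = sum_k c_k L^k."""
    H = np.zeros_like(L)
    P = np.eye(len(L))
    for c in coeffs:
        H += c * P
        P = P @ L
    return H

# A 5-node path graph, and a rewired copy: delete edge (3,4), add edge (0,4).
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
A_rw = A.copy()
A_rw[3, 4] = A_rw[4, 3] = 0.0
A_rw[0, 4] = A_rw[4, 0] = 1.0

coeffs = [1.0, -0.5, 0.25]                     # hypothetical filter coefficients
x = np.random.default_rng(0).standard_normal(5)  # graph signal

y = poly_filter(normalized_laplacian(A), coeffs) @ x
y_rw = poly_filter(normalized_laplacian(A_rw), coeffs) @ x

# ||h(L)x - h(L~)x|| is the output perturbation that stability bounds control.
print(np.linalg.norm(y - y_rw) / np.linalg.norm(x))
```

By the definition of the operator norm, the output change is always bounded by the spectral-norm distance between the two filter matrices times the signal norm; stability analyses aim to bound that distance in terms of the perturbation itself.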


Interpretable Stability Bounds for Spectral Graph Filters

This paper studies filter stability and provides a novel and interpretable upper bound on the change of filter output, where the bound is expressed in terms of the endpoint degrees of the deleted and newly added edges, as well as the spatial proximity of those edges.

Graph Neural Network Sensitivity Under Probabilistic Error Model

It is proved that the perturbation of the adjacency matrix under the error model is bounded by a function of graph size and error probability, and the upper bound for a normalized adjacency matrix with self-loops added is specified analytically.

Graph-Time Convolutional Autoencoders

  • Computer Science
  • 2022

Learning Stable Graph Neural Networks via Spectral Regularization

A self-regularized graph neural network (SR-GNN) solution that improves the architecture stability by regularizing the frequency responses in the graph spectral domain and preserves the permutation equivariance.

Space-Time Graph Neural Networks with Stochastic Graph Perturbations

The stability properties of ST-GNNs are revisited and it is proved that they are stable to stochastic graph perturbations, which enables the design of generalized convolutional architectures that jointly process time-varying graphs and time-varying signals.

Tree Mover's Distance: Bridging Graph Metrics and Stability of Graph Neural Networks

A pseudometric for attributed graphs, the Tree Mover's Distance (TMD), is proposed and it is shown that TMD captures properties relevant to graph classification: a simple TMD-SVM performs competitively with standard GNNs.

Stability of Aggregation Graph Neural Networks

Stability bounds for the mapping operator associated with a generic Agg-GNN are derived, conditions under which such operators can be stable to deformations are specified, and it is proved that the stability bounds are determined by the properties of the first layer of the CNN that acts on each node.

Graph-Time Convolutional Neural Networks: Architecture and Theoretical Analysis

The stability result shows GTCNNs are stable to spatial perturbations but there is an implicit trade-off between discriminability and robustness; i.e., the more complex the model, the less stable.

Learning Stochastic Graph Neural Networks with Constrained Variance

This work analyzes theoretically the variance of the SGNN output and identifies a trade-off between the stochastic robustness and the discrimination power, and analyzes the duality gap of the variance-constrained optimization problem and the converging behavior of the primal-dual learning procedure.

On the Stability of Low Pass Graph Filter with a Large Number of Edge Rewires

This work departs from previous analyses and proves a bound on the stability of graph filters that relies on the filter's frequency response, and shows that for stochastic block model graphs the graph filter distance converges to a small constant as the number of nodes approaches infinity.



Stability Properties of Graph Neural Networks

This work proves that graph convolutions with integral Lipschitz filters, in combination with the frequency mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies.

Stability and Generalization of Graph Convolutional Neural Networks

This paper is the first to study stability bounds on graph learning in a semi-supervised setting and to derive generalization bounds for GCNN models, showing that the algorithmic stability of a GCNN model depends on the largest absolute eigenvalue of its graph convolution filter.

CayleyNets: Graph Convolutional Neural Networks With Complex Rational Spectral Filters

A new spectral-domain convolutional architecture for deep learning on graphs, with a new class of parametric rational complex functions (Cayley polynomials) allowing efficient computation of spectral filters on graphs that specialize in frequency bands of interest.

On The Stability of Polynomial Spectral Graph Filters

This work first proves that polynomial graph filters are stable with respect to the change in the normalised graph Laplacian matrix, and shows empirically that properties of a structural perturbation, specifically the relative locality of the edges removed in a binary graph, affect the change of the normalised graph Laplacian.

Attacking Graph Convolutional Networks via Rewiring

This paper proposes a graph rewiring operation which affects the graph in a less noticeable way compared to adding/deleting edges, and analyzes how its generated perturbation to the graph structure affects the output of the target model.

Simplifying Graph Convolutional Networks

This paper successively removes nonlinearities and collapses weight matrices between consecutive layers, theoretically analyzes the resulting linear model, and shows that it corresponds to a fixed low-pass filter followed by a linear classifier.
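As a rough illustration of the simplification described above (function names and shapes are my own, not the paper's code), the collapsed model reduces graph convolution to precomputing powers of the renormalised adjacency matrix applied to the features, after which any linear classifier can be fit:

```python
import numpy as np

def sgc_features(A, X, k=2):
    """Precompute S^k X, where S = D~^{-1/2} (A + I) D~^{-1/2} is the
    renormalised adjacency; the remaining model is just a linear classifier."""
    A_tilde = A + np.eye(len(A))
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    S = d_inv_sqrt[:, None] * A_tilde * d_inv_sqrt[None, :]
    for _ in range(k):
        X = S @ X
    return X
```

Since `S^k X` depends only on the fixed graph and features, it can be computed once before training, which is where the speedup over a full multi-layer GCN comes from.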

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
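The fast localized filters referenced here are commonly realised with the Chebyshev recurrence T_k(x) = 2x T_{k-1}(x) - T_{k-2}(x), which avoids an eigendecomposition. A minimal sketch, assuming the Laplacian has already been rescaled so its spectrum lies in [-1, 1] (here crudely via L - I for a normalised Laplacian with eigenvalues in [0, 2]):

```python
import numpy as np

def cheb_filter(L, x, theta):
    """Apply sum_k theta_k T_k(L~) x using the Chebyshev recurrence
    T_0 = I, T_1 = L~, T_k = 2 L~ T_{k-1} - T_{k-2}.
    Assumes len(theta) >= 2 and a normalised Laplacian L."""
    L_tilde = L - np.eye(len(L))  # crude rescaling, assumes lambda_max ~= 2
    t_prev, t_curr = x, L_tilde @ x
    y = theta[0] * t_prev + theta[1] * t_curr
    for k in range(2, len(theta)):
        t_prev, t_curr = t_curr, 2 * L_tilde @ t_curr - t_prev
        y += theta[k] * t_curr
    return y
```

Each term only requires a sparse matrix-vector product, so a K-term filter costs O(K |E|) and is exactly K-hop localized.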

A Comprehensive Survey on Graph Neural Networks

This article provides a comprehensive overview of graph neural networks (GNNs) in data mining and machine learning fields and proposes a new taxonomy to divide the state-of-the-art GNNs into four categories, namely, recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.

Graph Neural Networks With Convolutional ARMA Filters

A novel graph convolutional layer inspired by the auto-regressive moving average (ARMA) filter is proposed that provides a more flexible frequency response, is more robust to noise, and better captures the global graph structure.

Semi-Supervised Classification with Graph Convolutional Networks

A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operate directly on graphs, which outperforms related methods by a significant margin.
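The efficient variant in question is usually written as the layer-wise rule H' = sigma(D~^{-1/2} (A + I) D~^{-1/2} H W). A minimal NumPy sketch of one such layer (ReLU assumed as the nonlinearity; variable names are illustrative):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: ReLU(D~^{-1/2} (A + I) D~^{-1/2} H W)."""
    A_tilde = A + np.eye(len(A))                       # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))    # D~^{-1/2}
    S = d_inv_sqrt[:, None] * A_tilde * d_inv_sqrt[None, :]
    return np.maximum(S @ H @ W, 0.0)                  # propagate, then ReLU
```

Stacking two such layers, with a softmax on the last, recovers the standard semi-supervised node classification architecture; the self-loop renormalisation keeps the propagation matrix's spectrum well behaved across layers.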