Convergence of Invariant Graph Networks
@article{Cai2022ConvergenceOI,
  title   = {Convergence of Invariant Graph Networks},
  author  = {Chen Cai and Yusu Wang},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2201.10129}
}
Although theoretical properties of graph neural networks (GNNs), such as expressive power and over-smoothing, have been studied extensively in recent years, their convergence is a relatively new direction. In this paper, we investigate the convergence of one powerful GNN, the Invariant Graph Network (IGN), over graphs sampled from graphons. We first prove the stability of linear layers for general k-IGNs (of order k) based on a novel interpretation of linear equivariant layers. Building upon this…
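As background for the linear-equivariant-layer interpretation, here is a minimal NumPy sketch of a 2-IGN-style linear layer built from a few of the 15 permutation-equivariant basis operations of Maron et al. (2019); the chosen subset, normalizations, and coefficients are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def equivariant_layer(A, coeffs):
    """Linear combination of a few of the 15 permutation-equivariant
    basis operations on an n x n input (a full 2-IGN layer uses all 15,
    with learned coefficients)."""
    n = A.shape[0]
    ops = [
        A,                                                   # identity
        A.T,                                                 # transpose
        np.diag(np.diag(A)),                                 # keep diagonal
        np.tile(A.sum(axis=1, keepdims=True), (1, n)) / n,   # broadcast row sums
        np.full((n, n), A.sum() / n**2),                     # broadcast total sum
    ]
    return sum(c * op for c, op in zip(coeffs, ops))

A = np.random.rand(4, 4)
P = np.eye(4)[np.random.permutation(4)]      # random permutation matrix
coeffs = np.random.rand(5)
out1 = equivariant_layer(P @ A @ P.T, coeffs)
out2 = P @ equivariant_layer(A, coeffs) @ P.T
assert np.allclose(out1, out2)               # layer commutes with node relabeling
```

The assertion checks the defining property: relabeling nodes before or after the layer gives the same result.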
One Citation
Sign and Basis Invariant Networks for Spectral Graph Representation Learning
- Computer Science, ArXiv
- 2022
SignNet and BasisNet are introduced, new neural architectures that are invariant to two key symmetries displayed by eigenvectors, and it is proved that under certain conditions these networks are universal, i.e., they can approximate any continuous function of eigenspaces with the desired invariances.
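The core sign-invariance trick is simple to state: apply the same function to v and -v and sum. A tiny sketch, where the encoder phi is a stand-in for SignNet's learned per-eigenvector network:

```python
import numpy as np

def phi(v):
    # Stand-in for SignNet's learned per-eigenvector network.
    return np.tanh(2.0 * v + 0.5)

def sign_invariant(v):
    # Summing over both signs makes the output independent of the
    # arbitrary sign of an eigenvector: f(v) = phi(v) + phi(-v).
    return phi(v) + phi(-v)

v = np.random.randn(5)
assert np.allclose(sign_invariant(v), sign_invariant(-v))
```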
50 References
Expressive Power of Invariant and Equivariant Graph Neural Networks
- Computer Science, ICLR
- 2021
The first approximation guarantees for practical GNNs are proved, paving the way for a better understanding of their generalization.
Convergence and Stability of Graph Convolutional Networks on Large Random Graphs
- Computer Science, Mathematics, NeurIPS
- 2020
This work studies properties of Graph Convolutional Networks by analyzing their behavior on standard models of random graphs, and provides more intuitive deformation-based metrics for understanding stability, which have proven useful for explaining the success of convolutional representations on Euclidean domains.
On the Universality of Graph Neural Networks on Large Random Graphs
- Computer Science, Mathematics, NeurIPS
- 2021
It is shown that c-SGNNs are strictly more powerful than c-GNNs in the continuous limit, and their universality is proved on several random graph models of interest, including most SBMs and a large class of random geometric graphs.
Universal Invariant and Equivariant Graph Neural Networks
- Mathematics, Computer Science, NeurIPS
- 2019
The results show that a GNN defined by a single set of parameters can approximate uniformly well a function defined on graphs of varying size.
Stability Properties of Graph Neural Networks
- Computer Science, IEEE Transactions on Signal Processing
- 2020
This work proves that graph convolutions with integral Lipschitz filters, in combination with the frequency-mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies.
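For reference, the integral Lipschitz condition on a spectral filter h is, paraphrased from memory (constants and exact form may differ in the paper):
\[
|h(\lambda_2) - h(\lambda_1)| \le C\,\frac{|\lambda_2 - \lambda_1|}{|\lambda_1 + \lambda_2|/2},
\qquad\text{hence}\qquad |\lambda\, h'(\lambda)| \le C \ \text{ for differentiable } h,
\]
so the filter's frequency response flattens at large \(|\lambda|\), which is what trades discriminability at high frequencies for stability.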
Provably Powerful Graph Networks
- Computer Science, NeurIPS
- 2019
This paper proposes a simple model that interleaves standard multilayer perceptrons (MLPs) applied to the feature dimension with matrix multiplication, and shows that a reduced second-order network containing just the scaled identity operator, augmented with a single quadratic operation (matrix multiplication), has provable 3-WL expressive power.
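A minimal sketch of one block in the spirit of this architecture; the shapes and the single linear-plus-ReLU "MLPs" are simplifying assumptions, not the paper's exact design:

```python
import numpy as np

def mlp(X, W):
    # Featurewise "MLP" reduced to one linear map + ReLU for brevity.
    return np.maximum(X @ W, 0.0)

def ppgn_block(X, W1, W2, W3):
    """MLPs act on the feature dimension of an n x n x d tensor, and a
    per-channel matrix multiplication mixes the two node dimensions."""
    A = mlp(X, W1)                             # n x n x h
    B = mlp(X, W2)                             # n x n x h
    prod = np.einsum('ikd,kjd->ijd', A, B)     # channelwise matrix product
    return np.concatenate([mlp(X, W3), prod], axis=-1)

n, d, h = 5, 3, 4
X = np.random.rand(n, n, d)
W1, W2, W3 = (np.random.randn(d, h) for _ in range(3))
Y = ppgn_block(X, W1, W2, W3)                  # n x n x 2h
```

The matrix multiplication is the single quadratic operation that lifts the block's expressive power beyond featurewise maps.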
Invariant and Equivariant Graph Networks
- Computer Science, Mathematics, ICLR
- 2019
This paper provides a characterization of all permutation-invariant and permutation-equivariant linear layers for (hyper-)graph data, and shows that their dimensions, in the case of edge-value graph data, are 2 and 15, respectively.
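The counts 2 and 15 are instances of a Bell-number formula: the space of such linear maps from order-\(k\) to order-\(l\) tensors has dimension \(\mathrm{b}(k+l)\), so for edge-value data (order-2 tensors):
\[
\dim(\text{invariant maps}) = \mathrm{b}(2+0) = 2,
\qquad
\dim(\text{equivariant maps}) = \mathrm{b}(2+2) = \mathrm{b}(4) = 15 .
\]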
A Note on Over-Smoothing for Graph Neural Networks
- Computer Science, ArXiv
- 2020
It is shown that when the weight matrices satisfy conditions determined by the spectrum of the augmented normalized Laplacian, the Dirichlet energy of the embeddings converges to zero, resulting in a loss of discriminative power.
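A toy numerical illustration of the mechanism; the weights are omitted, so this shows only the propagation part, and the graph and features are arbitrary choices:

```python
import numpy as np

# Repeatedly applying the augmented normalized adjacency (self-loops added,
# degree-normalized) shrinks the Dirichlet energy of node features toward zero.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_aug = A + np.eye(4)                       # add self-loops
D_inv_sqrt = np.diag(A_aug.sum(1) ** -0.5)
P = D_inv_sqrt @ A_aug @ D_inv_sqrt         # augmented normalized adjacency
L = np.eye(4) - P                           # augmented normalized Laplacian

def dirichlet(X):
    return np.trace(X.T @ L @ X)            # Dirichlet energy w.r.t. L

X = np.random.randn(4, 2)
for _ in range(30):
    X = P @ X                               # linear propagation, no weights
print(dirichlet(X))                         # ~0: embeddings have smoothed out
```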
How Powerful are Graph Neural Networks?
- Computer Science, ICLR
- 2019
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among this class of GNNs.
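The architecture in question is GIN, whose sum-aggregation update is simple to sketch; the single linear-plus-ReLU "MLP" here is a simplification of the paper's multi-layer version:

```python
import numpy as np

def gin_layer(A, H, W, eps=0.0):
    """One GIN update with the paper's sum aggregator:
    h_v <- MLP((1 + eps) * h_v + sum of neighbor features)."""
    agg = (1.0 + eps) * H + A @ H      # injective sum aggregation
    return np.maximum(agg @ W, 0.0)    # "MLP" reduced to linear + ReLU

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)  # path graph on 3 nodes
H = np.random.rand(3, 4)                # node features
W = np.random.randn(4, 8)
H_next = gin_layer(A, H, W)
```

The sum aggregator, unlike mean or max, is injective on multisets of features, which is the source of GIN's maximal expressiveness in this class.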
The expressive power of kth-order invariant graph networks
- Mathematics, ArXiv
- 2020
It is shown that k-IGNs are bounded in expressive power by the k-WL test; combined with the known matching lower bound, this implies that k-IGNs and k-WL are equally powerful in distinguishing graphs.
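Spelled out, the equivalence follows by combining the two directions (the upper bound is this paper's contribution; the lower bound is prior work):
\[
k\text{-IGN} \preceq k\text{-WL}
\quad\text{and}\quad
k\text{-WL} \preceq k\text{-IGN}
\;\Longrightarrow\;
k\text{-IGN} \equiv k\text{-WL}
\]
in graph-distinguishing power.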