Corpus ID: 219260864

Convergence and Stability of Graph Convolutional Networks on Large Random Graphs

@article{Keriven2020ConvergenceAS,
  title={Convergence and Stability of Graph Convolutional Networks on Large Random Graphs},
  author={Nicolas Keriven and Alberto Bietti and Samuel Vaiter},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.01868}
}
We study properties of Graph Convolutional Networks (GCNs) by analyzing their behavior on standard models of random graphs, where nodes are represented by random latent variables and edges are drawn according to a similarity kernel. This allows us to overcome the difficulties of dealing with discrete notions such as isomorphisms on very large graphs, by considering instead more natural geometric aspects. We first study the convergence of GCNs to their continuous counterpart as the number of… 
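As a rough illustration of the setup described in the abstract, the sketch below samples a graph whose nodes carry latent variables and whose edges are drawn according to a similarity kernel, then applies a single GCN layer. It is not the authors' exact construction: the Gaussian kernel, bandwidth, feature dimensions, and the row-normalized propagation operator are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's exact model): latent-variable
# random graph + one GCN layer.
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_out = 200, 4, 8

# Latent variable: one point per node on the unit interval.
x = rng.uniform(size=n)

# Similarity kernel W(x_i, x_j); a Gaussian kernel chosen purely for illustration.
def kernel(a, b, bandwidth=0.2):
    return np.exp(-((a - b) ** 2) / (2 * bandwidth ** 2))

# Edge probabilities and a sampled symmetric adjacency matrix (no self-loops).
P = kernel(x[:, None], x[None, :])
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Degree-normalized adjacency used as the propagation operator.
deg = A.sum(axis=1)
deg[deg == 0] = 1.0
S = A / deg[:, None]

# One GCN layer: propagate features, mix channels, apply a nonlinearity.
H = rng.normal(size=(n, d_in))           # node features (hypothetical)
Theta = rng.normal(size=(d_in, d_out))   # layer weights (hypothetical)
H_next = np.maximum(S @ H @ Theta, 0.0)
print(H_next.shape)  # (200, 8)
```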
The Role of Dimension in Graph Convolutional Networks
TLDR
This work shows how the embedding dimension affects the set of pairs of models that can be distinguished from one another and extends the theory to the setting of graphs with vertex colors that are potentially locally correlated with graph structure.
Convergence of Invariant Graph Networks
TLDR
The convergence of one powerful GNN, the Invariant Graph Network (IGN), over graphs sampled from graphons is investigated, and a subset of IGNs, denoted IGN-small, is obtained after edge-probability estimation; IGN-small is shown to still contain a function class rich enough to approximate spectral GNNs arbitrarily well.
On the Universality of Graph Neural Networks on Large Random Graphs
TLDR
It is shown that c-SGNNs are strictly more powerful than c-GNNs in the continuous limit, and their universality is proved on several random graph models of interest, including most SBMs and a large class of random geometric graphs.
coVariance Neural Networks
TLDR
This work theoretically establishes the stability of VNNs to perturbations in the covariance matrix, implying an advantage over standard PCA-based data analysis approaches that are prone to instability due to principal components associated with close eigenvalues.
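A minimal sketch of the coVariance-filter idea behind VNNs, in which the sample covariance matrix plays the role of the graph shift operator of a polynomial filter. The data, filter taps, and single-filter setup are illustrative assumptions; this is not the paper's implementation.

```python
# Illustrative covariance filter: H(C) x = sum_k h_k C^k x.
import numpy as np

rng = np.random.default_rng(1)
num_samples, num_features = 500, 10
X = rng.normal(size=(num_samples, num_features))

C = np.cov(X, rowvar=False)      # sample covariance acts as the graph shift
h = np.array([0.5, 0.3, 0.1])    # hypothetical filter taps

def covariance_filter(C, x, taps):
    """Apply the polynomial filter sum_k taps[k] * C^k to the signal x."""
    out = np.zeros_like(x)
    Ck_x = x.copy()
    for k, tap in enumerate(taps):
        if k > 0:
            Ck_x = C @ Ck_x      # update C^k x incrementally
        out += tap * Ck_x
    return out

x = rng.normal(size=num_features)  # one data sample treated as a graph signal
print(covariance_filter(C, x, h))
```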
Generalised Implicit Neural Representations
TLDR
This work considers the problem of learning implicit neural representations (INRs) for signals on non-Euclidean domains and proposes a method to train INRs without knowing the underlying continuous domain, which is the case for most graph signals in nature.
OOD Link Prediction Generalization Capabilities of Message-Passing GNNs in Larger Test Graphs
TLDR
This work proves non-asymptotic bounds showing that link predictors based on permutation-equivariant (structural) node embeddings obtained by gMPNNs can converge to a random guess as test graphs get larger, and proposes a theoretically sound gMPNN that outputs structural pairwise (2-node) embeddings.
Not too little, not too much: a theoretical analysis of graph (over)smoothing
TLDR
It is shown that graph smoothing restores some of the lost information, up to a certain point, through two phenomena: graph smoothing shrinks non-principal directions in the data faster than principal ones, which is useful for regression, and it pulls nodes within a community together faster than it collapses the communities onto one another, which improves classification.
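To make the smoothing mechanism concrete, here is a small numerical sketch on a two-block SBM. The block probabilities, feature model, and number of smoothing steps are illustrative assumptions, not the paper's setting: repeatedly applying a degree-normalized adjacency shrinks within-community variation while the gap between community means decays more slowly.

```python
# Illustrative graph-smoothing experiment on a two-block SBM.
import numpy as np

rng = np.random.default_rng(2)
n = 200
labels = np.repeat([0, 1], n // 2)

# Two-block SBM adjacency: dense within communities, sparse across.
p_in, p_out = 0.2, 0.02
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T

deg = A.sum(axis=1)
deg[deg == 0] = 1.0
S = A / deg[:, None]           # degree-normalized smoothing operator

# Scalar node feature: community mean (-1 or +1) plus noise.
f = np.where(labels == 0, -1.0, 1.0) + rng.normal(scale=1.0, size=n)

for k in range(4):
    within = np.mean([f[labels == c].var() for c in (0, 1)])
    gap = abs(f[labels == 0].mean() - f[labels == 1].mean())
    print(f"step {k}: within-community variance {within:.3f}, mean gap {gap:.3f}")
    f = S @ f                  # one smoothing step
```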
Stability and Generalization Capabilities of Message Passing Graph Neural Networks
TLDR
This is proven by showing that an MPNN applied to a graph approximates the MPNN applied to the geometric model that the graph discretizes, with an approximation error that decreases to zero as the graphs become larger.
FedGCN: Convergence and Communication Tradeoffs in Federated Training of Graph Convolutional Networks
TLDR
Federated Graph Convolutional Network (FedGCN) is introduced, which uses federated learning to train GCN models for semi-supervised node classification on large graphs with optimized convergence rate and communication cost.
Entropic Optimal Transport in Random Graphs
TLDR
This paper shows that it is possible to consistently estimate entropic-regularized Optimal Transport (OT) distances between groups of nodes in the latent space, and provides a general stability result for entropic OT with respect to perturbations of the cost matrix.
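For readers unfamiliar with entropic OT, the following is a minimal Sinkhorn sketch computing an entropic-regularized transport plan between two groups of points. The latent positions, regularization strength, and iteration count are illustrative assumptions; this is not the paper's estimator.

```python
# Illustrative Sinkhorn iterations for entropic-regularized OT between two groups.
import numpy as np

rng = np.random.default_rng(3)
xa = rng.normal(size=(30, 2))            # latent positions of group A (hypothetical)
xb = rng.normal(loc=0.5, size=(40, 2))   # latent positions of group B (hypothetical)

# Squared-distance cost matrix, normalized for numerical stability.
C = ((xa[:, None, :] - xb[None, :, :]) ** 2).sum(-1)
C = C / C.max()

a = np.full(len(xa), 1 / len(xa))        # uniform weights on group A
b = np.full(len(xb), 1 / len(xb))        # uniform weights on group B

eps = 0.05                               # entropic regularization strength
K = np.exp(-C / eps)
u = np.ones_like(a)
for _ in range(300):                     # Sinkhorn fixed-point iterations
    v = b / (K.T @ u)
    u = a / (K @ v)

plan = u[:, None] * K * v[None, :]       # entropic OT plan
print("entropic OT cost (normalized):", (plan * C).sum())
```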

References

SHOWING 1-10 OF 56 REFERENCES
The Power of Graph Convolutional Networks to Distinguish Random Graph Models
TLDR
A concrete, infinite class of graphons arising from stochastic block models that are well-separated in terms of cut distance yet indistinguishable by a GCN is exhibited.
On the equivalence between graph isomorphism testing and function approximation with GNNs
TLDR
It is proved that order-2 Graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree, and the analysis is extended to a new architecture, Ring-GNNs, which succeeds in distinguishing these graphs and provides improvements on real-world social network datasets.
Adaptive estimation of nonparametric geometric graphs
TLDR
This paper offers an algorithmically and theoretically efficient procedure to estimate smooth NGGs and shows a non-asymptotic concentration result on the spectrum of integral operators defined by symmetric kernels (not necessarily positive).
How Powerful are Graph Neural Networks?
TLDR
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
Stability Properties of Graph Neural Networks
TLDR
This work proves that graph convolutions with integral Lipschitz filters, in combination with the frequency-mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies.
Diffusion Scattering Transforms on Graphs
TLDR
This work shows that scattering transforms can be generalized to non-Euclidean domains using diffusion wavelets, while preserving a notion of stability with respect to metric changes in the domain, measured with diffusion maps.
Stability of Graph Scattering Transforms
TLDR
This work extends scattering transforms to network data by using multiresolution graph wavelets, whose computation can be obtained by means of graph convolutions, and proves that the resulting graph scattering transforms are stable to metric perturbations of the underlying network.
Improved spectral convergence rates for graph Laplacians on epsilon-graphs and k-NN graphs
TLDR
The results show that the eigenvalues and eigenvectors of the graph Laplacian converge to those of the Laplace-Beltrami operator at a rate of $O(n^{-1/(m+4)})$, up to log factors, where $m$ is the manifold dimension and $n$ is the number of vertices in the graph.
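A quick way to see this kind of spectral convergence empirically (a sketch with illustrative parameters, not the paper's proof technique): on points sampled uniformly from a circle, ratios of the smallest nonzero eigenvalues of an epsilon-graph Laplacian roughly follow the Laplace-Beltrami ratios 1, 1, 4, 4, 9, 9, up to sampling and discretization error.

```python
# Illustrative epsilon-graph Laplacian on the circle; compare eigenvalue ratios
# with the Laplace-Beltrami pattern 1, 1, 4, 4, 9, 9 (approximate).
import numpy as np

rng = np.random.default_rng(4)
n, eps = 1000, 0.2
theta = rng.uniform(0, 2 * np.pi, size=n)
pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# epsilon-graph: connect points at Euclidean distance below eps.
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
A = ((D < eps) & (D > 0)).astype(float)
L = np.diag(A.sum(axis=1)) - A           # unnormalized graph Laplacian

evals = np.sort(np.linalg.eigvalsh(L))[:7]
print(evals[1:] / evals[1])              # expect a pattern roughly like [1, 1, 4, 4, 9, 9]
```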
Transferability of Spectral Graph Convolutional Neural Networks
TLDR
It is shown that if two graphs discretize the same continuous metric space, then a spectral filter or ConvNet has approximately the same effect on both graphs, a notion of transferability that is more permissive than the standard analysis.
Group invariance, stability to deformations, and complexity of deep convolutional representations. J. Mach. Learn. Res., 20:1–49, 2019.