Graph Neural Networks: Architectures, Stability, and Transferability
@article{Ruiz2021GraphNN,
  title   = {Graph Neural Networks: Architectures, Stability, and Transferability},
  author  = {Luana Ruiz and Fernando Gama and Alejandro Ribeiro},
  journal = {Proceedings of the IEEE},
  year    = {2021},
  volume  = {109},
  pages   = {660-682}
}
Graph neural networks (GNNs) are information processing architectures for signals supported on graphs. They are presented here as generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters instead of banks of classical convolutional filters. Otherwise, GNNs operate as CNNs. Filters are composed of pointwise nonlinearities and stacked in layers. It is shown that GNN architectures exhibit equivariance to permutation and…
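The abstract describes GNN layers as banks of graph convolutional filters composed with pointwise nonlinearities, in analogy with CNN layers. A minimal single-feature sketch of that construction (function names and the choice of ReLU are illustrative, not taken from the paper):

```python
import numpy as np

def graph_convolution(S, x, h):
    """Graph convolutional filter y = sum_k h[k] * S^k x, where S is the
    graph shift operator (e.g., adjacency matrix or graph Laplacian)."""
    y = np.zeros_like(np.asarray(x, dtype=float))
    Skx = np.asarray(x, dtype=float)  # S^0 x = x
    for hk in h:
        y = y + hk * Skx
        Skx = S @ Skx  # propagate the signal one more hop
    return y

def gnn_layer(S, x, h):
    """One GNN layer: a graph convolution followed by a pointwise
    nonlinearity (ReLU here), mirroring a classical CNN layer."""
    return np.maximum(graph_convolution(S, x, h), 0.0)
```

The permutation equivariance mentioned in the abstract can be checked directly with this sketch: relabeling the nodes, i.e. replacing S with P S Pᵀ and x with P x for a permutation matrix P, permutes the layer output by the same P.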
34 Citations
Bayesian Estimation of Graph Signals
- Computer Science, IEEE Transactions on Signal Processing
- 2022
This paper proposes a graph signal processing (GSP) framework for random graph signal recovery that utilizes information on the structure behind the data and presents three implementations of the parametric GSP-LMMSE estimator for typical graph filters, which are more robust to outliers and to network topology changes.
Convergence of Invariant Graph Networks
- Computer Science, Mathematics, ArXiv
- 2022
This paper first proves the stability of linear layers for general k-IGN (of order k) based on a novel interpretation of linear equivariant layers, and obtains the convergence of a subset of IGNs, denoted as IGN-small, after the edge probability estimation.
Deep Reinforcement Learning with Graph ConvNets for Distribution Network Voltage Control
- Computer Science, ArXiv
- 2022
This paper proposes a model-free Volt-VAR control (VVC) algorithm via the spatio-temporal graph ConvNet-based deep reinforcement learning (STGCN-DRL) framework, whose goal is to control smart…
Distributed Auto-Learning GNN for Multi-Cell Cluster-Free NOMA Communications
- Computer Science
- 2022
A novel distributed auto-learning graph neural network (AutoGNN) architecture is proposed to alleviate the overwhelming information exchange burdens among base stations (BSs) and it is theoretically proved that the proposed bi-level AutoGNN learning algorithm can converge to a stationary point.
Equivariant and Stable Positional Encoding for More Powerful Graph Neural Networks
- Computer Science, ArXiv
- 2022
This work revisits GNNs that allow using positional features of nodes given by positional encoding (PE) techniques such as Laplacian Eigenmap, DeepWalk, etc.
Graph-based Deep Learning for Communication Networks: A Survey
- Computer Science, Comput. Commun.
- 2022
Interpretable and Effective Reinforcement Learning for Attacking against Graph-based Rumor Detection
- Computer Science, ArXiv
- 2022
A black-box detector is designed with features capturing the dependencies, allowing reinforcement learning to learn an effective and interpretable attack policy based on the detector output; a credit assignment method is also devised that decomposes delayed rewards into individual attacking steps in proportion to their effects.
On Local Distributions in Graph Signal Processing
- Computer Science, ArXiv
- 2022
This work proposes a framework that relies solely on the local distribution of the neighborhoods of a graph, able to describe graphs and graph signals in terms of a measurable space of rooted balls, and yields results on the convergence of spectral densities, transferability of filters across arbitrary graphs, and continuity of graph signal properties with respect to the distribution of local substructures.
Scalable Perception-Action-Communication Loops With Convolutional and Graph Neural Networks
- Computer Science, IEEE Transactions on Signal and Information Processing over Networks
- 2022
Through a multi-agent flocking application, it is demonstrated that VGAI yields performance comparable to or better than other decentralized controllers, using only the visual input modality and without accessing precise location or motion state information.
Stability and Generalization Capabilities of Message Passing Graph Neural Networks
- Computer Science, Mathematics, ArXiv
- 2022
It is proven that an MPNN applied on a graph approximates the MPNN applied on the geometric model that the graph discretizes, with an approximation error that decreases to zero as the graphs become larger.
References
Showing 1-10 of 79 references
Optimal Wireless Resource Allocation With Random Edge Graph Neural Networks
- Computer Science, IEEE Transactions on Signal Processing
- 2020
This work introduces the random edge graph neural network (REGNN), which performs convolutions over random graphs formed by the fading interference patterns in the wireless network, and presents an unsupervised model-free primal-dual learning algorithm to train the weights of the REGNN.
Spectral Networks and Locally Connected Networks on Graphs
- Computer Science, ICLR
- 2014
This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
Group Invariant Scattering
- Mathematics, ArXiv
- 2011
This paper constructs translation-invariant operators on L²(ℝᵈ) which are Lipschitz-continuous to the action of diffeomorphisms. Scattering operators are extended to L²(G), where G is a compact Lie group, and are invariant under the action of G.
Graphon Neural Networks and the Transferability of Graph Neural Networks
- arXiv:2006.03548v1 [cs.LG], 5 June 2020. [Online]. Available: http://arxiv.org/abs/2006.03548
- 2020
How Powerful are Graph Neural Networks?
- Computer Science, ICLR
- 2019
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
Rating Prediction via Graph Signal Processing
- Computer Science, IEEE Transactions on Signal Processing
- 2018
New designs for recommendation systems inspired by recent advances in graph signal processing are developed, and it is demonstrated that linear latent factor models can be viewed as bandlimited interpolation algorithms that operate in a frequency domain given by the spectrum of a joint user and item network.
Optimal Graph-Filter Design and Applications to Distributed Linear Network Operators
- Computer Science, Mathematics, IEEE Transactions on Signal Processing
- 2017
The notion of a node-variant GF, which allows the simultaneous implementation of multiple (regular) GFs in different nodes of the graph, is introduced, which enables the design of more general operators without undermining the locality in implementation.
Semi-Supervised Classification with Graph Convolutional Networks
- Computer Science, ICLR
- 2017
A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operates directly on graphs and outperforms related methods by a significant margin.
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
- Computer Science, NIPS
- 2016
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.