Graphon Signal Processing

@article{Ruiz2020GraphonSP,
  title={Graphon Signal Processing},
  author={Luana Ruiz and Luiz F. O. Chamon and Alejandro Ribeiro},
  journal={IEEE Transactions on Signal Processing},
  year={2020},
  volume={69},
  pages={4961--4976}
}
Graphons are infinite-dimensional objects that represent the limit of convergent sequences of graphs as their number of nodes goes to infinity. This paper derives a theory of graphon signal processing centered on the notions of graphon Fourier transform and linear shift invariant graphon filters, the graphon counterparts of the graph Fourier transform and graph filters. It is shown that for convergent sequences of graphs and associated graph signals: (i) the graph Fourier transform converges to… 
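The convergence claim in the abstract can be illustrated numerically. The sketch below is not from the paper: it assumes a simple example graphon W(x, y) = exp(-|x - y|), builds the template graphs induced by W on a regular grid, and computes the graph Fourier transform as the eigendecomposition of the (normalized) adjacency matrix. As the number of nodes grows, the leading eigenvalues stabilize, which is the finite-dimensional shadow of the GFT-to-WFT convergence the paper proves.

```python
import numpy as np

def graphon(x, y):
    """Example graphon (an assumption, not the paper's): W(x, y) = exp(-|x - y|)."""
    return np.exp(-np.abs(x - y))

def sampled_adjacency(n):
    """Template graph induced by the graphon: nodes at regular grid
    points on [0, 1], edge weights W(x_i, x_j), normalized by n so the
    eigenvalues approximate those of the graphon's integral operator."""
    x = (np.arange(n) + 0.5) / n
    A = graphon(x[:, None], x[None, :])
    np.fill_diagonal(A, 0.0)  # no self-loops
    return A / n

def graph_fourier_transform(A, signal):
    """GFT: project the signal onto the eigenvectors of the shift operator A."""
    eigvals, eigvecs = np.linalg.eigh(A)  # A is symmetric, so eigh applies
    return eigvals, eigvecs.T @ signal

# The largest eigenvalue (and hence the GFT basis) stabilizes as n grows.
for n in (50, 200, 800):
    A = sampled_adjacency(n)
    eigvals, _ = graph_fourier_transform(A, np.ones(n) / np.sqrt(n))
    print(n, eigvals[-1])
```

The 1/n normalization is what makes the spectra of graphs of different sizes comparable; without it the eigenvalues of the adjacency matrices would grow with n instead of converging.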

Transferability of coVariance Neural Networks and Application to Interpretable Brain Age Prediction using Anatomical Features

VNNs inherit the scale-free data processing architecture from GCNs, and it is shown here that VNNs exhibit transferability of performance over datasets whose covariance matrices converge to a limit object.

Graph Neural Tangent Kernel: Convergence on Large Graphs

It is proved that, on a sequence of growing graphs, the GNTKs converge to the graphon NTK, which implies that in the large-graph limit, the GNTK fitted on a graph of moderate size can be used to solve the same task on the large graph and to infer the learning dynamics of the large-graph GNN.

Graph Neural Networks: Architectures, Stability, and Transferability

It is shown that GNN architectures exhibit equivariance to permutation and stability to graph deformations, which justifies the transferability of GNNs across networks with different numbers of nodes.

Transferability Properties of Graph Neural Networks

The results show that graph filters have a transferability-discriminability tradeoff that in GNNs is alleviated by the scattering behavior of the nonlinearity, and this tradeoff is demonstrated empirically in a recommendation problem and in a decentralized control task.

Graphon Neural Networks and the Transferability of Graph Neural Networks

This paper introduces graphon NNs as limit objects of GNNs and proves a bound on the difference between the output of a GNN and its limit graphon-NN if the graph convolutional filters are bandlimited in the graph spectral domain.

A graphon-signal analysis of graph neural networks

  • Ron Levie
  • 2023
It is proved that MPNNs are Lipschitz continuous functions over the graphon-signal metric space, and the results apply to any regular enough MPNN on any distribution of graph-signals, making the analysis rather universal.

Signal processing on large networks with group symmetries

This manuscript summarizes some work on graph signal processing on large networks, in particular samples of Cayley graphons, which can be formalized via the theory of graph limits, where graphs are considered as random samples from a distribution represented by a graphon.

Frames for signal processing on Cayley graphs

This paper provides an explicit and detailed representation-theoretic account for the spectral decomposition of the adjacency matrix of a Cayley graph, which results in a preferred basis and uses such bases to build frames that are suitable for developing signal processing on Cayley graphs.

On distributional graph signals

This work proposes a unified framework that also encompasses existing theories regarding graph uncertainty that can be applied by using real datasets, and develops signal processing tools to study the new notion of distributional graph signals.

Convolutional Learning on Multigraphs

A multigraph learning architecture is developed, including a generalization of selection sampling to reduce computational complexity, and the introduced architecture is applied towards optimal wireless resource allocation and a hate speech localization task, offering improved performance over traditional graph neural networks.

Large Networks and Graph Limits

The book Large Networks and Graph Limits, xiv + 475 pp., published in late 2012, comprises five parts, the first an illuminating introduction and the last a tantalizing taste of how the scope of the

A User Guide to Low-Pass Graph Signal Processing and Its Applications: Tools and Applications

This user guide illustrates how to leverage properties of low-pass graph filters to learn the graph topology and identify its community structure; efficiently represent graph data through sampling; recover missing measurements; and denoise graph data.

The Graphon Fourier Transform

This work defines graphon signals and introduces the Graphon Fourier Transform (WFT), to which the GFT is shown to converge, hinting at the possibility of centralizing analysis and design on graphons to leverage transferability.

Optimal Wireless Resource Allocation With Random Edge Graph Neural Networks

This work introduces the random edge graph neural network (REGNN), which performs convolutions over random graphs formed by the fading interference patterns in the wireless network, and presents an unsupervised model-free primal-dual learning algorithm to train the weights of the REGNN.

Graph Policy Gradients for Large Scale Robot Control

This paper proposes a new algorithm, Graph Policy Gradients (GPG), that exploits the underlying graph symmetry among the robots and scales better than existing reinforcement learning methods that employ fully connected networks.

Stability Properties of Graph Neural Networks

This work proves that graph convolutions with integral Lipschitz filters, in combination with the frequency mixing effect of the corresponding nonlinearities, yields an architecture that is both stable to small changes in the underlying topology, and discriminative of information located at high frequencies.

Spectral Partitioning of Time-varying Networks with Unobserved Edges

A variant of ‘blind’ community detection, in which a network is partitioned from the observation of a (dynamical) graph signal defined on the network, is discussed, and a simple spectral algorithm is proposed for inferring the partition of the latent SBM.

Minimax Adaptive Estimation of Nonparametric Geometric Graphs

The recovery of graphons when they are convolution kernels on compact (symmetric) metric spaces is studied, offering an algorithmically and theoretically efficient procedure to estimate smooth nonparametric geometric graphs (NGG).