# Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

```bibtex
@inproceedings{Defferrard2016ConvolutionalNN,
  title     = {Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering},
  author    = {Micha{\"e}l Defferrard and Xavier Bresson and Pierre Vandergheynst},
  booktitle = {NIPS},
  year      = {2016}
}
```

In this work, we are interested in generalizing convolutional neural networks (CNNs) from low-dimensional regular grids, on which images, video and speech are represented, to high-dimensional irregular domains, such as social networks, brain connectomes or word embeddings, represented by graphs. Importantly, the proposed technique offers the same linear computational complexity and constant learning complexity as classical CNNs, while being universal to any graph structure. Experiments on MNIST…
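The core mechanism behind the linear complexity claimed above is filtering with a K-term Chebyshev polynomial of the rescaled graph Laplacian, so that each filter is K-hop localized and avoids any eigendecomposition at filtering time. A minimal NumPy sketch is below; the function name and the dense `eigvalsh`-based rescaling are illustrative assumptions for a small example, not the authors' sparse implementation:

```python
import numpy as np

def chebyshev_conv(L, x, theta):
    """Apply a K-localized spectral filter via the Chebyshev recursion (sketch).

    L     : graph Laplacian, symmetric (n x n)
    x     : graph signal (n,)
    theta : filter coefficients (K,), assumed K >= 2
    """
    n = L.shape[0]
    # Rescale the Laplacian so its spectrum lies in [-1, 1]
    # (dense eigvalsh used here only for clarity; in practice an
    #  estimate of the largest eigenvalue suffices).
    lmax = np.linalg.eigvalsh(L).max()
    L_tilde = (2.0 / lmax) * L - np.eye(n)
    # Chebyshev recursion: T_0 x = x, T_1 x = L_tilde x,
    # T_k x = 2 L_tilde (T_{k-1} x) - T_{k-2} x
    t_prev, t_curr = x, L_tilde @ x
    out = theta[0] * t_prev + theta[1] * t_curr
    for k in range(2, len(theta)):
        t_prev, t_curr = t_curr, 2.0 * (L_tilde @ t_curr) - t_prev
        out += theta[k] * t_curr
    return out
```

Because each step only multiplies by `L_tilde`, a degree-K filter touches only a node's K-hop neighborhood and costs O(K|E|) with a sparse Laplacian.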

## 4,906 Citations

### Dynamic Edge-Conditioned Filters in Convolutional Neural Networks on Graphs

- Computer Science · 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2017

This work generalizes the convolution operator from regular grids to arbitrary graphs while avoiding the spectral domain, which allows us to handle graphs of varying size and connectivity.

### Motif-based Convolutional Neural Network on Graphs

- Computer Science · ArXiv
- 2017

A novel deep architecture Motif-CNN is developed that employs an attention model to combine the features extracted from multiple patterns, thus effectively capturing high-order structural and feature information.

### A Generalization of Convolutional Neural Networks to Graph-Structured Data

- Computer Science · ArXiv
- 2017

A novel spatial convolution is proposed that utilizes a random walk to uncover relations within the input, analogous to the way standard convolution uses the spatial neighborhood of a pixel on the grid.

### Dynamic Filters in Graph Convolutional Networks

- Computer Science · ArXiv
- 2017

This work proposes a novel graph-convolutional network architecture that builds on a generic formulation that relaxes the 1-to-1 correspondence between filter weights and data elements around the center of the convolution.

### Graph Classification with 2D Convolutional Neural Networks

- Computer Science
- 2018

This paper introduces a novel way to represent graphs as multi-channel image-like structures that allows them to be handled by vanilla 2D CNNs, and reveals that this method is more accurate than state-of-the-art graph kernels and graph CNNs on 4 out of 6 real-world datasets.

### Learning Graph While Training: An Evolving Graph Convolutional Neural Network

- Computer Science · ArXiv
- 2017

A more general and flexible graph convolutional network (EGCN) is proposed, fed by batches of arbitrarily shaped data together with their evolving graph Laplacians and trained in a supervised fashion.

### Robust Spatial Filtering With Graph Convolutional Neural Networks

- Computer Science · IEEE Journal of Selected Topics in Signal Processing
- 2017

A novel neural learning framework, termed Graph-CNN, is proposed that can handle both homogeneous and heterogeneous graph data while retaining the benefits of traditional CNN successes.

### Graph Neural Networks With Convolutional ARMA Filters

- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2022

A novel graph convolutional layer inspired by the auto-regressive moving average (ARMA) filter is proposed that provides a more flexible frequency response, is more robust to noise, and better captures the global graph structure.
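The ARMA idea summarized above can be illustrated with a first-order recursion: instead of a fixed-degree polynomial of the Laplacian, the filter is computed iteratively and converges to a rational frequency response. This is a hedged sketch under simplified assumptions; the function name, scalar coefficients, and fixed iteration count are illustrative, not the trainable layer from the paper:

```python
import numpy as np

def arma1_graph_filter(M, x, a=0.5, b=0.5, iters=50):
    """Sketch of a first-order ARMA graph filter.

    Iterates x_{t+1} = a * (M @ x_t) + b * x. When the spectral
    radius of a*M is below 1, the iteration converges to the
    rational response b / (1 - a*mu) at each eigenvalue mu of M,
    which a finite polynomial (e.g. Chebyshev) filter cannot match.
    """
    x_t = x.copy()
    for _ in range(iters):
        x_t = a * (M @ x_t) + b * x
    return x_t
```

The rational form is what gives the more flexible frequency response the summary mentions: poles as well as zeros, at the cost of an iterative (or recurrent) computation.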

### Classifying Graphs as Images with Convolutional Neural Networks

- Computer Science · ArXiv
- 2017

This paper shows that a classical 2D architecture designed for images can also be used for graph processing in a completely off-the-shelf manner; the only prerequisite being to encode graphs as stacks of two-dimensional histograms of their node embeddings.

### CayleyNets: Spectral Graph CNNs with Complex Rational Filters

- Computer Science
- 2018

A new spectral-domain convolutional architecture for deep learning on graphs is proposed, based on a new class of parametric rational complex functions (Cayley polynomials) that allow efficient computation of spectral filters specialized on frequency bands of interest.

## References

Showing 1-10 of 45 references

### Spectral Networks and Locally Connected Networks on Graphs

- Computer Science · ICLR
- 2014

This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
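The spectral construction referenced here filters a graph signal in the eigenbasis of the graph Laplacian. An illustrative NumPy fragment follows (names are mine, assuming a symmetric Laplacian and a non-parametric filter `g`); note that this requires a full eigendecomposition, which is exactly the cost that fast localized filters later avoid:

```python
import numpy as np

def spectral_graph_filter(L, x, g):
    """Filter a graph signal in the Laplacian eigenbasis (sketch).

    With L = U diag(lam) U^T (symmetric), compute y = U g(lam) U^T x.
    The eigendecomposition costs O(n^3) up front and each filtering
    costs O(n^2), and the resulting filters are not localized,
    motivating polynomial approximations.
    """
    lam, U = np.linalg.eigh(L)      # eigenvalues and eigenvectors of L
    x_hat = U.T @ x                 # graph Fourier transform of the signal
    return U @ (g(lam) * x_hat)     # filter spectrally, transform back
```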

### Deep Convolutional Networks on Graph-Structured Data

- Computer Science · ArXiv
- 2015

This paper develops an extension of Spectral Networks that incorporates a Graph Estimation procedure, which is tested on large-scale classification problems, matching or improving over Dropout Networks with far fewer parameters to estimate.

### Geodesic Convolutional Neural Networks on Riemannian Manifolds

- Computer Science, Mathematics · 2015 IEEE International Conference on Computer Vision Workshop (ICCVW)
- 2015

Geodesic Convolutional Neural Networks (GCNN), a generalization of the convolutional neural network (CNN) paradigm to non-Euclidean manifolds, are introduced, achieving state-of-the-art performance in problems such as shape description, retrieval, and correspondence.

### Wavelets on Graphs via Deep Learning

- Computer Science · NIPS
- 2013

A machine learning framework is introduced for constructing graph wavelets that can sparsely represent a given class of signals, together with a linear wavelet transform that can be applied to any graph signal in time and memory linear in the size of the graph.

### ShapeNet: Convolutional Neural Networks on Non-Euclidean Manifolds

- Computer Science · ArXiv
- 2015

This paper uses ShapeNet to learn invariant shape feature descriptors that significantly outperform recent state-of-the-art methods, and shows that previous approaches such as heat and wave kernel signatures, optimal spectral descriptors, and intrinsic shape contexts can be obtained as particular configurations of ShapeNet.

### Gated Graph Sequence Neural Networks

- Computer Science · ICLR
- 2016

This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.

### Spatial Transformer Networks

- Computer Science · NIPS
- 2015

This work introduces a new learnable module, the Spatial Transformer, which explicitly allows the spatial manipulation of data within the network, and can be inserted into existing convolutional architectures, giving neural networks the ability to actively spatially transform feature maps.

### The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains

- Computer Science · IEEE Signal Processing Magazine
- 2013

This tutorial overview outlines the main challenges of the emerging field of signal processing on graphs, discusses different ways to define graph spectral domains, which are the analogs to the classical frequency domain, and highlights the importance of incorporating the irregular structures of graph data domains when processing signals on graphs.

### Selecting Receptive Fields in Deep Networks

- Computer Science · NIPS
- 2011

This paper proposes a fast method to choose local receptive fields that group together those low-level features that are most similar to each other according to a pairwise similarity metric, and produces results showing how this method allows even simple unsupervised training algorithms to train successful multi-layered networks that achieve state-of-the-art results on CIFAR and STL datasets.

### Multiscale Wavelets on Trees, Graphs and High Dimensional Data: Theory and Applications to Semi Supervised Learning

- Computer Science · ICML
- 2010

It is proved that, in analogy to the Euclidean case, function smoothness with respect to a specific metric induced by the tree is equivalent to an exponential rate of coefficient decay, that is, to approximate sparsity; these results readily translate into simple practical algorithms for various learning tasks.