# Graph Neural Networks with Low-rank Learnable Local Filters

    @article{Cheng2020GraphNN,
      title={Graph Neural Networks with Low-rank Learnable Local Filters},
      author={Xiuyuan Cheng and Zichen Miao and Qiang Qiu},
      journal={ArXiv},
      year={2020},
      volume={abs/2008.01818}
    }

For the classification of graph data consisting of features sampled on an irregular coarse mesh, such as landmark points on faces and human bodies, graph neural network (GNN) models based on global graph Laplacians may lack the expressiveness to capture local features on the graph. This paper introduces a new GNN layer type with learnable low-rank local graph filters, which significantly reduces the complexity of traditional locally connected GNNs. The architecture provides a unified framework for both…
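A minimal sketch of the low-rank idea may help. A fully locally connected graph layer stores one free weight per edge; factorizing those weights into a small number of shared basis filters mixed by per-node coefficients cuts the parameter count. This is a hypothetical illustration under assumed shapes, not the authors' implementation; all names (`B`, `coef`, `W_lowrank`) are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, c, r = 6, 4, 2          # nodes, feature channels, rank (number of shared bases)

# Adjacency mask of a small irregular mesh (symmetric, with self-loops)
A = np.array([[1, 1, 0, 0, 1, 0],
              [1, 1, 1, 0, 0, 0],
              [0, 1, 1, 1, 0, 0],
              [0, 0, 1, 1, 1, 1],
              [1, 0, 0, 1, 1, 0],
              [0, 0, 0, 1, 0, 1]], dtype=float)

# Fully locally connected layer: one free weight per edge -> O(|E|) parameters
W_full = rng.standard_normal((n, n)) * A

# Low-rank alternative: r shared basis filters plus per-node mixing coefficients,
# so parameters scale like O(r * |E| / shared) + O(n * r) instead of one per edge
B = rng.standard_normal((r, n, n)) * A        # basis local filters, masked to edges
coef = rng.standard_normal((n, r))            # node-wise combination coefficients
W_lowrank = np.einsum('ir,rij->ij', coef, B)  # still supported only on edges of A

x = rng.standard_normal((n, c))               # node features
y = W_lowrank @ x                             # one propagation step
print(y.shape)                                # (6, 4)
```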


## References

SHOWING 1-10 OF 60 REFERENCES

CayleyNets: Graph Convolutional Neural Networks With Complex Rational Spectral Filters

- Computer Science · IEEE Transactions on Signal Processing
- 2019

A new spectral-domain convolutional architecture for deep learning on graphs, based on a new class of parametric rational complex functions (Cayley polynomials) that allow efficient computation of spectral filters specializing in frequency bands of interest.

How Powerful are Graph Neural Networks?

- Computer Science · ICLR
- 2019

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among this class of GNNs.
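The provably most expressive architecture in this reference is widely known as the Graph Isomorphism Network (GIN), whose key design choice is sum aggregation over neighbors followed by an MLP. A hedged one-layer NumPy sketch (function name and shapes are illustrative, not the paper's code):

```python
import numpy as np

def gin_layer(A, X, W1, W2, eps=0.0):
    """One GIN-style layer: h_v = MLP((1 + eps) * x_v + sum_{u in N(v)} x_u).

    Sum aggregation (unlike mean or max) preserves the multiset of neighbor
    features, which is what gives the architecture its WL-level
    discriminative power.  A: adjacency without self-loops, X: node features.
    """
    agg = (1.0 + eps) * X + A @ X            # injective multiset aggregation
    return np.maximum(agg @ W1, 0.0) @ W2    # two-layer MLP with ReLU
```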

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

- Computer Science · NIPS
- 2016

This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
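The fast localized filters in this reference are Chebyshev polynomials of the rescaled graph Laplacian, which avoid an explicit eigendecomposition and are K-hop localized. A minimal NumPy sketch of that idea (the function name `cheb_filter` is illustrative; a practical implementation would use sparse matrices and estimate the largest eigenvalue rather than computing the full spectrum):

```python
import numpy as np

def cheb_filter(L, x, theta):
    """Apply a K-term Chebyshev spectral filter: sum_k theta_k T_k(L_s) x.

    L: graph Laplacian (n x n), x: node signal (n,) or (n, c),
    theta: filter coefficients of length K >= 2.
    """
    # Rescale eigenvalues into [-1, 1], where Chebyshev polynomials live
    lmax = np.linalg.eigvalsh(L).max()
    L_s = (2.0 / lmax) * L - np.eye(L.shape[0])

    T_prev, T_curr = x, L_s @ x          # T_0(L_s) x = x,  T_1(L_s) x = L_s x
    out = theta[0] * T_prev + theta[1] * T_curr
    for k in range(2, len(theta)):
        # Chebyshev recurrence: T_k = 2 L_s T_{k-1} - T_{k-2}
        T_prev, T_curr = T_curr, 2.0 * (L_s @ T_curr) - T_prev
        out = out + theta[k] * T_curr
    return out
```

Each added term only multiplies by the (sparse) Laplacian once, so a K-term filter costs K sparse matrix-vector products and is localized within K hops.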

Geometric Deep Learning on Graphs and Manifolds Using Mixture Model CNNs

- Computer Science · 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2017

This paper proposes a unified framework that generalizes CNN architectures to non-Euclidean domains (graphs and manifolds) and learns local, stationary, and compositional task-specific features; the method is tested on standard tasks from image, graph, and 3D shape analysis and consistently outperforms previous approaches.

Adaptive Graph Convolutional Neural Networks

- Computer Science · AAAI
- 2018

A generalized and flexible graph CNN that takes data of arbitrary graph structure as input is proposed; in this way, a task-driven adaptive graph is learned for each graph sample during training.

A Comprehensive Survey on Graph Neural Networks

- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
- 2019

This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial–temporal GNNs.

Spectral Networks and Locally Connected Networks on Graphs

- Computer Science · ICLR
- 2014

This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.

The Graph Neural Network Model

- Computer Science · IEEE Transactions on Neural Networks
- 2009

A new neural network model, called the graph neural network (GNN) model, extends existing neural network methods to data represented in graph domains and implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space.

Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks

- Computer Science · AAAI
- 2019

It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs, so-called $k$-dimensional GNNs ($k$-GNNs), is proposed, which can take higher-order graph structures at multiple scales into account.

Provably Powerful Graph Networks

- Computer Science · NeurIPS
- 2019

This paper proposes a simple model that interleaves standard multilayer perceptrons (MLPs) applied to the feature dimension with matrix multiplication, and shows that a reduced 2-order network containing just a scaled identity operator, augmented with a single quadratic operation (matrix multiplication), has provable 3-WL expressive power.