Simplifying Graph Convolutional Networks
@article{Wu2019SimplifyingGC,
  title   = {Simplifying Graph Convolutional Networks},
  author  = {Felix Wu and Tianyi Zhang and Amauri H. de Souza and Christopher Fifty and Tao Yu and Kilian Q. Weinberger},
  journal = {ArXiv},
  volume  = {abs/1902.07153},
  year    = {2019}
}
Graph Convolutional Networks (GCNs) and their variants have received significant attention and have become the de facto methods for learning graph representations. Key Method: We theoretically analyze the resulting linear model and show that it corresponds to a fixed low-pass filter followed by a linear classifier. Notably, our experimental evaluation demonstrates that these simplifications do not negatively impact accuracy in many downstream applications. Moreover, the resulting model scales to larger…
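The simplification described above is compact enough to sketch directly. Below is a minimal NumPy illustration (function name and toy graph are ours, not the authors' code): apply the self-loop-normalized adjacency k times as a fixed low-pass filter, then hand the smoothed features to any plain linear classifier.

```python
import numpy as np

def sgc_features(adj, feats, k=2):
    """Precompute SGC-style features: apply the normalized adjacency
    with self-loops k times -- a fixed low-pass filter S^k X."""
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                      # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    # S = D^{-1/2} (A + I) D^{-1/2}
    s = (a_hat * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]
    for _ in range(k):
        feats = s @ feats                        # one smoothing step
    return feats

# Toy 3-node path graph; after two hops, neighbor features mix.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
z = sgc_features(adj, x, k=2)
# z would normally feed a logistic-regression classifier
```

Because the propagation has no learned parameters, `z` can be computed once as a preprocessing step, which is what makes the model cheap to train.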
1,473 Citations
Simple and Deep Graph Convolutional Networks
- Computer ScienceICML
- 2020
GCNII is proposed, an extension of the vanilla GCN model with two simple yet effective techniques, initial residual and identity mapping, which effectively relieve the problem of over-smoothing.
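As a rough illustration of the two techniques (a sketch with our own names, not the authors' implementation): each layer mixes the propagated representation with the initial features (initial residual) and shrinks the learned weight matrix toward the identity (identity mapping).

```python
import numpy as np

def gcnii_layer(p, h, h0, w, alpha=0.1, beta=0.5):
    """One GCNII-style layer (sketch): p is the normalized adjacency,
    h the current representation, h0 the input features."""
    support = (1 - alpha) * (p @ h) + alpha * h0        # initial residual
    mix = (1 - beta) * np.eye(w.shape[0]) + beta * w    # identity mapping
    return np.maximum(support @ mix, 0.0)               # ReLU
```

With `alpha = 1` and `beta = 0` the layer simply passes the (non-negative) input features through, which is what makes deep stacks resistant to over-smoothing.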
Simplifying Graph Attention Networks with Source-Target Separation
- Computer ScienceECAI
- 2020
We present a novel Graph Neural Network (GNN) architecture as a simplification of the Graph Attention Network (GAT) model, with implicit computation of edge attention coefficients and shared…
A Graph Convolutional Network Composition Framework for Semi-supervised Classification
- Computer ScienceArXiv
- 2020
The empirical experimental results suggest that several newly composed variants of graph convolutional networks are useful alternatives to consider because they are as competitive as, or better than, the original GCN.
Simple Spectral Graph Convolution
- Computer ScienceICLR
- 2021
This paper uses a modified Markov Diffusion Kernel to derive a variant of GCN called Simple Spectral Graph Convolution (S2GC); spectral analysis shows that the simple spectral graph convolution used in S2GC is a trade-off between low- and high-pass filter bands that capture the global and local contexts of each node.
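The filter can be sketched as follows (a minimal NumPy illustration with hypothetical names; `s` is assumed to be the self-loop-normalized adjacency): instead of keeping only the k-th power S^k as in SGC, S2GC averages all k propagation steps and mixes the raw features back in with a small weight alpha at each step.

```python
import numpy as np

def s2gc_features(s, x, k=4, alpha=0.05):
    """Average k propagation steps instead of taking only the k-th power,
    mixing the raw features x back in with weight alpha at every step."""
    h = x.copy()
    acc = np.zeros_like(x)
    for _ in range(k):
        h = s @ h                               # one more smoothing step
        acc += (1 - alpha) * h + alpha * x      # accumulate hop-k features
    return acc / k
```

Averaging over hops is what balances local (low-k) and global (high-k) context, rather than committing to a single smoothing depth.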
Non-Recursive Graph Convolutional Networks
- Computer ScienceICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2021
This paper proposes a novel architecture named Non-Recursive Graph Convolutional Network (NRGCN) to improve both the training efficiency and the learning performance of GCNs in the context of node classification, and proposes to represent different hops of neighbors for each node based on inner-layer aggregation and layer-independent sampling.
Graph-Revised Convolutional Network
- Computer ScienceECML/PKDD
- 2020
A GCN-based graph revision module is introduced for predicting missing edges and revising edge weights w.r.t. downstream tasks via joint optimization, which shows that GRCN consistently outperforms strong baseline methods by a large margin.
Analysis of Graph Convolutional Networks using Neural Tangent Kernels
- Computer Science
- 2022
This paper derives NTKs corresponding to infinitely wide GCNs with and without skip connections, allowing a non-linear output layer, and shows empirically that a linear output layer yields a similar approximation.
Keep It Simple: Graph Autoencoders Without Graph Convolutional Networks
- Computer ScienceArXiv
- 2019
This paper proposes to replace the GCN encoder by a simple linear model w.r.t. the adjacency matrix of the graph, and empirically shows that this approach consistently reaches competitive performance on challenging tasks such as link prediction and node clustering.
Addressing Over-Smoothing in Graph Neural Networks via Deep Supervision
- Computer ScienceArXiv
- 2022
It is shown empirically that DSGNNs are resilient to over-smoothing and can outperform competitive benchmarks on node and graph property prediction problems.
Simplified multilayer graph convolutional networks with dropout
- Computer ScienceApplied Intelligence
- 2021
This paper presents simplified multilayer graph convolutional networks with dropout (DGCs), novel neural network architectures that successively perform nonlinearity removal and weight-matrix merging between graph convolutional layers, leveraging a dropout layer to achieve feature augmentation and effectively reduce overfitting.
References
Showing 1–10 of 61 references
FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling
- Computer ScienceICLR
- 2018
Enhanced with importance sampling, FastGCN is not only efficient for training but also generalizes well for inference, being orders of magnitude more efficient while predictions remain comparably accurate.
Graph Attention Networks
- Computer ScienceICLR
- 2018
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning
- Computer ScienceAAAI
- 2018
It is shown that the graph convolution of the GCN model is actually a special form of Laplacian smoothing, which is the key reason why GCNs work, but it also brings potential concerns of over-smoothing with many convolutional layers.
Deep Graph Infomax
- Computer ScienceICLR
- 2019
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
Spectral Networks and Locally Connected Networks on Graphs
- Computer ScienceICLR
- 2014
This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
Rethinking Knowledge Graph Propagation for Zero-Shot Learning
- Computer Science2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2019
This work proposes a Dense Graph Propagation module with carefully designed direct links among distant nodes, exploiting the hierarchical structure of the knowledge graph through additional connections, and outperforms state-of-the-art zero-shot learning approaches.
An End-to-End Deep Learning Architecture for Graph Classification
- Computer ScienceAAAI
- 2018
This paper designs a localized graph convolution model and shows its connection with two graph kernels, and designs a novel SortPooling layer which sorts graph vertices in a consistent order so that traditional neural networks can be trained on the graphs.
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
- Computer ScienceNIPS
- 2016
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
GaAN: Gated Attention Networks for Learning on Large and Spatiotemporal Graphs
- Computer ScienceUAI
- 2018
The effectiveness of GaAN on the inductive node classification problem is demonstrated, and the Graph Gated Recurrent Unit (GGRU) is constructed with GaAN as a building block to address the traffic speed forecasting problem.
Diffusion-Convolutional Neural Networks
- Computer ScienceNIPS
- 2016
Through the introduction of a diffusion-convolution operation, it is shown how diffusion-based representations can be learned from graph-structured data and used as an effective basis for node classification.