• Corpus ID: 67752026

# Simplifying Graph Convolutional Networks

@article{Wu2019SimplifyingGC,
title={Simplifying Graph Convolutional Networks},
author={Felix Wu and Tianyi Zhang and Amauri H. de Souza and Christopher Fifty and Tao Yu and Kilian Q. Weinberger},
journal={ArXiv},
year={2019},
volume={abs/1902.07153}
}
• Published 19 February 2019
• Computer Science
• ArXiv
Graph Convolutional Networks (GCNs) and their variants have experienced significant attention and have become the de facto methods for learning graph representations. GCNs derive inspiration primarily from recent deep learning approaches, and as a result, may inherit unnecessary complexity and redundant computation. In this paper, we reduce this excess complexity through successively removing nonlinearities and collapsing weight matrices between consecutive layers. We theoretically analyze the resulting linear model and show that it corresponds to a fixed low-pass filter followed by a linear classifier. Notably, our experimental evaluation demonstrates that these simplifications do not negatively impact accuracy in many downstream applications. Moreover, the resulting model scales to larger datasets, is naturally interpretable, and yields up to two orders of magnitude speedup over FastGCN.
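The simplification the abstract describes — dropping nonlinearities and collapsing weights so that only a fixed K-step propagation plus a linear classifier remains — can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' reference implementation; `adj` and the function name are assumptions for the example:

```python
import numpy as np

def sgc_features(adj, X, K=2):
    """Precompute SGC features by applying the fixed low-pass filter S^K to X.

    S = D^{-1/2} (A + I) D^{-1/2} is the symmetrically normalized
    adjacency with self-loops. After this precomputation, the only
    learned component is a linear (logistic-regression) classifier
    trained on S^K X.
    """
    A = adj + np.eye(adj.shape[0])           # add self-loops
    d = A.sum(axis=1)                        # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # D^{-1/2}
    S = D_inv_sqrt @ A @ D_inv_sqrt          # normalized adjacency
    for _ in range(K):                       # K propagation steps
        X = S @ X
    return X

# toy graph: 3 nodes in a path, 2 features per node
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
X = np.random.rand(3, 2)
Z = sgc_features(adj, X, K=2)   # features fed to a linear classifier
```

Because `S` is fixed, `S^K X` can be computed once before training, which is where the reported speedup over end-to-end GCN training comes from.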
1,473 Citations

## Citations

• Computer Science
ICML
• 2020
The GCNII is proposed, an extension of the vanilla GCN model with two simple yet effective techniques: {\em Initial residual} and {\em Identity mapping} that effectively relieves the problem of over-smoothing.
• Computer Science
ECAI
• 2020
We present a novel Graph Neural Network (GNN) architecture as a simplification of the Graph Attention Network (GAT) model, with implicit computation of edge attention coefficients and shared …
• Computer Science
ArXiv
• 2020
The empirical experimental results suggest that several newly composed variants of graph convolutional networks are useful alternatives to consider because they are as competitive as, or better than, the original GCN.
• Computer Science
ICLR
• 2021
This paper uses a modified Markov Diffusion Kernel to derive a variant of GCN called Simple Spectral Graph Convolution (S2GC); spectral analysis shows that the simple spectral graph convolution used in S2GC is a trade-off of low- and high-pass filter bands, which capture the global and local contexts of each node.
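The trade-off the S2GC summary describes comes from averaging the propagation over hops 1..K rather than taking only the K-th power as SGC does. A minimal sketch under that reading (hypothetical names; the toy `S` below is simply a row-stochastic propagation matrix, not the paper's exact normalization):

```python
import numpy as np

def s2gc_features(S, X, K=4, alpha=0.05):
    """Simple Spectral Graph Convolution propagation (sketch).

    Instead of SGC's single K-hop filter S^K X, average the filters
    S^1 X ... S^K X, each blended with the raw features X via alpha;
    the average of powers acts as a band of filters rather than a
    single fixed low-pass filter.
    """
    out = np.zeros_like(X)
    H = X.copy()
    for _ in range(K):
        H = S @ H                           # propagate one more hop
        out += (1 - alpha) * H + alpha * X  # blend with raw features
    return out / K

# toy row-stochastic propagation matrix for a 3-node graph
S = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])
X = np.ones((3, 2))
Z = s2gc_features(S, X, K=4)   # constant features stay constant
```

Like SGC, this propagation is parameter-free, so it can be precomputed once before fitting a linear classifier.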
• Computer Science
ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
• 2021
This paper proposes a novel architecture named Non-Recursive Graph Convolutional Network (NRGCN) to improve both the training efficiency and the learning performance of GCNs in the context of node classification, and proposes to represent different hops of neighbors for each node based on inner-layer aggregation and layer-independent sampling.
• Computer Science
ECML/PKDD
• 2020
A GCN-based graph revision module is introduced for predicting missing edges and revising edge weights w.r.t. downstream tasks via joint optimization; experiments show that GRCN consistently outperforms strong baseline methods by a large margin.
• Computer Science
• 2022
This paper derives NTKs corresponding to infinitely wide GCNs, with and without skip connections and with a non-linear output layer, and shows empirically that the resulting kernel is close to that of a linear output layer.
• Computer Science
ArXiv
• 2019
This paper proposes to replace the GCN encoder by a simple linear model w.r.t. the adjacency matrix of the graph, and empirically shows that this approach consistently reaches competitive performances on challenging tasks such as link prediction and node clustering.
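The linear-encoder idea in this summary — replacing the GCN encoder of a graph autoencoder with a single linear map w.r.t. the adjacency matrix — can be sketched as follows. This is an illustrative sketch only; the function name is an assumption, and `W` stands in for weights that would normally be learned via a reconstruction loss:

```python
import numpy as np

def linear_gae_edge_probs(S, X, W):
    """Linear graph autoencoder sketch.

    Encoder: Z = S X W, a single linear map of the normalized
    adjacency times the node features (no nonlinearity, no hidden
    GCN layers). Decoder: sigmoid(Z Z^T), an inner-product score
    interpreted as a probability for every candidate edge.
    """
    Z = S @ X @ W                 # linear encoder
    logits = Z @ Z.T              # inner-product decoder
    return 1.0 / (1.0 + np.exp(-logits))

# toy example with fixed random weights
rng = np.random.default_rng(0)
S = np.eye(3)                     # trivial normalized adjacency
X = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 2))       # hypothetical learned weights
P = linear_gae_edge_probs(S, X, W)
```

The symmetric inner-product decoder yields a symmetric score matrix, matching the link-prediction use case the summary mentions.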
• Computer Science
ArXiv
• 2022
It is shown empirically that DSGNNs are resilient to over-smoothing and can outperform competitive benchmarks on node and graph property prediction problems.
• Computer Science
Applied Intelligence
• 2021
This paper presents simplified multilayer graph convolutional networks with dropout (DGCs), novel neural network architectures that successively perform nonlinearity removal and weight-matrix merging between graph convolutional layers, leveraging a dropout layer to achieve feature augmentation and effectively reduce overfitting.

## References

Showing 1–10 of 61 references

• Jie Chen
• Computer Science
ICLR
• 2018
Enhanced with importance sampling, FastGCN not only is efficient for training but also generalizes well for inference, and is orders of magnitude more efficient while predictions remain comparably accurate.
• Computer Science
ICLR
• 2018
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
• Computer Science
AAAI
• 2018
It is shown that the graph convolution of the GCN model is actually a special form of Laplacian smoothing, which is the key reason why GCNs work, but it also brings potential concerns of over-smoothing with many convolutional layers.
• Computer Science
ICLR
• 2019
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
• Computer Science
ICLR
• 2014
This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
• Computer Science
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
• 2019
This work proposes a Dense Graph Propagation module with carefully designed direct links among distant nodes to exploit the hierarchical graph structure of the knowledge graph through additional connections and outperforms state-of-the-art zero-shot learning approaches.
• Computer Science
AAAI
• 2018
This paper designs a localized graph convolution model and shows its connection with two graph kernels, and designs a novel SortPooling layer which sorts graph vertices in a consistent order so that traditional neural networks can be trained on the graphs.
• Computer Science
NIPS
• 2016
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
• Computer Science
UAI
• 2018
The effectiveness of GaAN on the inductive node classification problem is demonstrated, and the Graph Gated Recurrent Unit (GGRU) is constructed with GaAN as a building block to address the traffic speed forecasting problem.
• Computer Science
NIPS
• 2016
Through the introduction of a diffusion-convolution operation, it is shown how diffusion-based representations can be learned from graph-structured data and used as an effective basis for node classification.