Corpus ID: 67752026

Simplifying Graph Convolutional Networks

@article{Wu2019SimplifyingGC,
  title={Simplifying Graph Convolutional Networks},
  author={Felix Wu and Tianyi Zhang and Amauri H. de Souza and Christopher Fifty and Tao Yu and Kilian Q. Weinberger},
  journal={ArXiv},
  year={2019},
  volume={abs/1902.07153}
}
Graph Convolutional Networks (GCNs) and their variants have experienced significant attention and have become the de facto methods for learning graph representations. GCNs derive inspiration primarily from recent deep learning approaches, and as a result, may inherit unnecessary complexity and redundant computation. In this paper, we reduce this excess complexity through successively removing nonlinearities and collapsing weight matrices between consecutive layers. We theoretically analyze the resulting linear model and show that it corresponds to a fixed low-pass filter followed by a linear classifier. Notably, our experimental evaluation demonstrates that these simplifications do not negatively impact accuracy in many downstream applications. Moreover, the resulting model scales to larger datasets, is naturally interpretable, and yields up to two orders of magnitude speedup over FastGCN.
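
Concretely, the simplification collapses a K-layer GCN into a linear model: all feature propagation reduces to precomputing S^K X with the fixed low-pass filter S, after which a plain logistic regression is trained. Below is a minimal numpy sketch of that precomputation; the dense formulation and variable names are illustrative assumptions, not the authors' reference implementation.

# Minimal sketch of SGC feature precomputation (dense numpy, illustrative).
import numpy as np

def sgc_features(A, X, k=2):
    """Compute S^k X, where S is the symmetrically normalized
    adjacency matrix with self-loops (SGC's fixed low-pass filter)."""
    A_tilde = A + np.eye(A.shape[0])        # add self-loops: A~ = A + I
    d = A_tilde.sum(axis=1)                 # degree vector of A~
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D~^(-1/2)
    S = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # S = D~^(-1/2) A~ D~^(-1/2)
    for _ in range(k):                      # k rounds of feature propagation
        X = S @ X
    return X  # train any linear classifier (e.g. logistic regression) on this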

Simple and Deep Graph Convolutional Networks

The GCNII is proposed, an extension of the vanilla GCN model with two simple yet effective techniques, initial residual and identity mapping, which effectively relieve the problem of over-smoothing.
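
For reference, the GCNII layer combines both techniques in a single update (in the paper's notation, with \tilde{P} = \tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2} and H^{(0)} the initial representation):

H^{(\ell+1)} = \sigma\big( \big((1-\alpha_\ell)\,\tilde{P}H^{(\ell)} + \alpha_\ell H^{(0)}\big)\big((1-\beta_\ell)I_n + \beta_\ell W^{(\ell)}\big) \big)

The initial-residual term \alpha_\ell H^{(0)} re-injects the input at every layer, while the identity-mapping term (1-\beta_\ell)I_n shrinks each weight matrix toward the identity, so deep stacks degrade gracefully instead of over-smoothing.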

Simplifying Graph Attention Networks with Source-Target Separation

We present a novel Graph Neural Network (GNN) architecture as a simplification of the Graph Attention Network (GAT) model, with implicit computation of edge attention coefficients and shared…

A Graph Convolutional Network Composition Framework for Semi-supervised Classification

The empirical results suggest that several newly composed variants of graph convolutional networks are useful alternatives to consider because they are as competitive as, or better than, the original GCN.

Simple Spectral Graph Convolution

This paper uses a modified Markov Diffusion Kernel to derive a variant of GCN called Simple Spectral Graph Convolution (S2GC), and spectral analysis shows that the simple spectral graph convolution used in S2GC is a trade-off of low- and high-pass filter bands which capture the global and local contexts of each node.
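
As a rough sketch of the idea, hedging on the exact constants in the paper: where SGC uses only the K-th power S^K of the normalized adjacency, S2GC averages over all powers up to K and mixes in an \alpha-weighted self-term,

\hat{Y} = \operatorname{softmax}\Big( \tfrac{1}{K} \sum_{k=1}^{K} \big( (1-\alpha)\,S^{k}X + \alpha X \big)\,\Theta \Big)

so the averaged diffusion retains global (low-frequency) structure while the self-term preserves local (higher-frequency) detail.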

Non-Recursive Graph Convolutional Networks

This paper proposes a novel architecture named Non-Recursive Graph Convolutional Network (NRGCN) to improve both the training efficiency and the learning performance of GCNs in the context of node classification, and proposes to represent different hops of neighbors for each node based on inner-layer aggregation and layer-independent sampling.

Graph-Revised Convolutional Network

A GCN-based graph revision module is introduced for predicting missing edges and revising edge weights w.r.t. downstream tasks via joint optimization; experiments show that GRCN consistently outperforms strong baseline methods by a large margin.

Analysis of Graph Convolutional Networks using Neural Tangent Kernels

This paper derives NTKs corresponding to infinitely wide GCNs, with and without skip connections, allowing a non-linear output layer, and shows empirically that the resulting approximation is similar to that of a linear output layer.

Keep It Simple: Graph Autoencoders Without Graph Convolutional Networks

This paper proposes to replace the GCN encoder by a simple linear model w.r.t. the adjacency matrix of the graph, and empirically shows that this approach consistently reaches competitive performance on challenging tasks such as link prediction and node clustering.
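
A minimal sketch of that linear encoder paired with the standard inner-product decoder, assuming a precomputed normalized adjacency A_norm; the dense numpy formulation and function names are illustrative, not the authors' code:

import numpy as np

def linear_gae_encode(A_norm, X, W):
    """Linear encoder: Z = A_norm @ X @ W (one propagation step, no nonlinearity)."""
    return A_norm @ X @ W

def inner_product_decode(Z):
    """Reconstruct edge probabilities as sigmoid(Z Z^T)."""
    return 1.0 / (1.0 + np.exp(-Z @ Z.T))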

Addressing Over-Smoothing in Graph Neural Networks via Deep Supervision

It is shown empirically that DSGNNs are resilient to over-smoothing and can outperform competitive benchmarks on node and graph property prediction problems.

Simplified multilayer graph convolutional networks with dropout

This paper presents simplified multilayer graph convolutional networks with dropout (DGCs), novel neural network architectures that successively perform nonlinearity removal and weight-matrix merging between graph convolutional layers, leveraging a dropout layer to achieve feature augmentation and effectively reduce overfitting.
...

References

Showing 1-10 of 61 references.

FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling

Enhanced with importance sampling, FastGCN is not only efficient for training but also generalizes well at inference; it is orders of magnitude more efficient while predictions remain comparably accurate.
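
A minimal sketch of the layer-wise importance sampling, assuming a dense normalized adjacency A_hat; the distribution q(u) proportional to ||A_hat[:, u]||^2 follows the paper, while the surrounding code is an illustrative assumption:

import numpy as np

def fastgcn_layer(A_hat, H, W, num_samples, rng):
    """Unbiased Monte Carlo estimate of relu(A_hat @ H @ W) from a
    small sample of nodes per layer."""
    col_norms = np.square(A_hat).sum(axis=0)    # ||A_hat[:, u]||^2 per node u
    q = col_norms / col_norms.sum()             # importance distribution q(u)
    idx = rng.choice(A_hat.shape[1], size=num_samples, p=q)
    # Divide each sampled column by (num_samples * q) to keep the estimator unbiased.
    est = (A_hat[:, idx] / (num_samples * q[idx])) @ H[idx] @ W
    return np.maximum(est, 0.0)                 # ReLU activation

Here rng is a numpy Generator, e.g. np.random.default_rng(0).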

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.

Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning

It is shown that the graph convolution of the GCN model is actually a special form of Laplacian smoothing, which is the key reason why GCNs work, but it also brings potential concerns of over-smoothing with many convolutional layers.
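
Concretely, the paper writes one step of Laplacian smoothing as

Y = (I - \gamma\,\tilde{D}^{-1}\tilde{L})X

and observes that choosing \gamma = 1 with the symmetrically normalized variant yields \tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2}X, i.e. exactly the GCN propagation rule; repeating the smoothing many times drives the features of nodes within the same connected component toward indistinguishable values, which is the over-smoothing risk.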

Deep Graph Infomax

Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.

Spectral Networks and Locally Connected Networks on Graphs

This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.

Rethinking Knowledge Graph Propagation for Zero-Shot Learning

This work proposes a Dense Graph Propagation module with carefully designed direct links among distant nodes to exploit the hierarchical graph structure of the knowledge graph through additional connections and outperforms state-of-the-art zero-shot learning approaches.

An End-to-End Deep Learning Architecture for Graph Classification

This paper designs a localized graph convolution model, shows its connection with two graph kernels, and proposes a novel SortPooling layer which sorts graph vertices in a consistent order so that traditional neural networks can be trained on graphs.

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
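
The central device is a K-th order Chebyshev approximation of the spectral filter on the rescaled Laplacian, which avoids any eigendecomposition:

g_\theta(L)\,x \approx \sum_{k=0}^{K-1} \theta_k\, T_k(\tilde{L})\,x, \qquad \tilde{L} = \tfrac{2}{\lambda_{\max}}L - I_n

with the Chebyshev recurrence T_k(x) = 2x\,T_{k-1}(x) - T_{k-2}(x), T_0 = 1, T_1(x) = x. Each filter application therefore costs only K sparse matrix-vector products and is exactly localized within K hops.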

GaAN: Gated Attention Networks for Learning on Large and Spatiotemporal Graphs

The effectiveness of GaAN on the inductive node classification problem is demonstrated, and the Graph Gated Recurrent Unit (GGRU) is constructed with GaAN as a building block to address the traffic speed forecasting problem.

Diffusion-Convolutional Neural Networks

Through the introduction of a diffusion-convolution operation, it is shown how diffusion-based representations can be learned from graph-structured data and used as an effective basis for node classification.
...