Corpus ID: 221186968

Training Matters: Unlocking Potentials of Deeper Graph Convolutional Neural Networks

@article{Luan2020TrainingMU,
  title={Training Matters: Unlocking Potentials of Deeper Graph Convolutional Neural Networks},
  author={Sitao Luan and Mingde Zhao and Xiao-Wen Chang and Doina Precup},
  journal={ArXiv},
  year={2020},
  volume={abs/2008.08838}
}
The performance limit of Graph Convolutional Networks (GCNs), and the fact that we cannot stack more of them to increase performance as we usually do in other deep learning paradigms, are pervasively thought to be caused by limitations of the GCN layers, such as insufficient expressive power. However, if that were so, then for a fixed architecture it would be unlikely that changing only the training procedure could lower the training difficulty and improve performance, which we show in this…
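Since the abstract's claim is that a fixed architecture can be trained better by changing only the training procedure, a minimal PyTorch sketch of that experimental setup may help. Everything here is an illustrative assumption, not the authors' code: `DeepGCN`, `normalized_adjacency`, and the initialization/optimizer knobs are hypothetical names for a generic deep-GCN training harness.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize A + I, i.e. D^{-1/2} (A + I) D^{-1/2}."""
    a_tilde = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a_tilde.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a_tilde * d_inv_sqrt.unsqueeze(0)

class DeepGCN(nn.Module):
    """A plain stack of GCN layers; the architecture stays fixed across runs."""
    def __init__(self, in_dim: int, hid_dim: int, out_dim: int, depth: int):
        super().__init__()
        dims = [in_dim] + [hid_dim] * (depth - 1) + [out_dim]
        self.weights = nn.ParameterList(
            [nn.Parameter(torch.empty(a, b)) for a, b in zip(dims, dims[1:])]
        )

    def forward(self, a_hat: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        h = x
        for i, w in enumerate(self.weights):
            h = a_hat @ h @ w  # one propagation step
            if i < len(self.weights) - 1:
                h = F.relu(h)
        return h

def train(model, a_hat, x, y, mask, init="xavier", lr=1e-2, epochs=200):
    # Only the training procedure varies between runs: the initialization
    # scheme, the optimizer, and the learning rate; the model is untouched.
    for w in model.weights:
        if init == "xavier":
            nn.init.xavier_uniform_(w)
        else:
            nn.init.normal_(w, std=0.1)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(a_hat, x)[mask], y[mask])
        loss.backward()
        opt.step()
    return loss.item()
```

Comparing `train(..., init="xavier")` against other settings on the same `DeepGCN` instance is the kind of controlled comparison the abstract alludes to: if accuracy moves, the layers were not the bottleneck.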

References

Showing 1-10 of 33 references
Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks
This paper generalizes spectral graph convolution and deep GCNs in block Krylov subspace forms and devises two architectures, both with the potential to be scaled deeper but each making use of the multi-scale information in a different way.
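For orientation, the block Krylov subspace underlying that construction is, in standard notation (with a graph filter matrix $\hat{A}$ and the node-feature block $X$; this is the textbook definition, not the paper's exact block-vector-space formulation):

$$\mathcal{K}_m(\hat{A}, X) \;=\; \operatorname{span}\bigl\{X,\ \hat{A}X,\ \hat{A}^{2}X,\ \dots,\ \hat{A}^{m-1}X\bigr\},$$

so a depth-$m$ network built from these powers sees multi-scale, 1-hop through $m$-hop, neighborhood information at once.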
FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling
Enhanced with importance sampling, FastGCN is not only efficient for training but also generalizes well for inference; it is orders of magnitude more efficient while its predictions remain comparably accurate.
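A NumPy sketch of the layer-wise importance-sampling estimator, as I read FastGCN's construction; the function names are illustrative, and the proposal distribution q(u) proportional to the squared column norm of the normalized adjacency follows the paper's variance-reduction argument.

```python
import numpy as np

def importance_distribution(a_hat: np.ndarray) -> np.ndarray:
    # Sample node u with probability proportional to its squared column norm.
    q = np.linalg.norm(a_hat, axis=0) ** 2
    return q / q.sum()

def sampled_propagation(a_hat: np.ndarray, h: np.ndarray, t: int, rng) -> np.ndarray:
    """Unbiased Monte Carlo estimate of a_hat @ h from t sampled nodes."""
    q = importance_distribution(a_hat)
    idx = rng.choice(a_hat.shape[1], size=t, replace=True, p=q)
    # E[(1/t) * sum_j a_hat[:, u_j] h[u_j] / q(u_j)] = a_hat @ h
    return (a_hat[:, idx] / q[idx]) @ h[idx] / t
```

FastGCN applies such an estimator independently at each layer, so the per-batch cost no longer grows with the recursively expanding neighborhood.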
DeepGCNs: Can GCNs Go As Deep As CNNs?
This work presents new ways to successfully train very deep GCNs by borrowing concepts from CNNs, specifically residual/dense connections and dilated convolutions, and adapting them to GCN architectures, building a very deep 56-layer GCN.
Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning
It is shown that the graph convolution of the GCN model is actually a special form of Laplacian smoothing, which is the key reason why GCNs work, but it also brings potential concerns of over-smoothing with many convolutional layers.
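Concretely, the smoothing view comes from rewriting the GCN propagation rule (with $\tilde{A} = A + I$ and $\tilde{D}$ its degree matrix):

$$H^{(l+1)} = \sigma\!\bigl(\tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2}\, H^{(l)} W^{(l)}\bigr), \qquad \tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2} = I - \tilde{L}_{\mathrm{sym}},$$

so each layer applies one step of symmetric Laplacian smoothing, averaging every node's representation with its neighbors'; stacking many such steps drives representations within a connected component toward indistinguishability, i.e. over-smoothing.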
Stochastic Training of Graph Convolutional Networks with Variance Reduction
Control-variate-based algorithms that allow sampling an arbitrarily small neighbor size are developed, and a new theoretical guarantee that these algorithms converge to a local optimum of GCN is proved.
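The control-variate idea, as I understand it from this line of work: keep stale historical activations $\bar{H}$ and sample only the small correction term, so the estimator's variance scales with $H - \bar{H}$ (which shrinks as training converges) rather than with $H$ itself:

$$\hat{A}H \;\approx\; \hat{A}^{\mathcal{S}}\bigl(H - \bar{H}\bigr) + \hat{A}\bar{H},$$

where $\hat{A}^{\mathcal{S}}$ is the sparse, neighbor-sampled adjacency and $\hat{A}\bar{H}$ is maintained incrementally rather than recomputed.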
Stochastic Training of Graph Convolutional Networks
A preprocessing strategy and two control-variate-based algorithms are proposed to further reduce the receptive field size of graph convolutional networks; they are guaranteed to converge to GCN's local optimum regardless of the neighbor sampling size.
Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks
A reparameterization of the weight vectors in a neural network that decouples the length of those weight vectors from their direction is presented, improving the conditioning of the optimization problem and speeding up convergence of stochastic gradient descent.
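The reparameterization itself is a one-liner: each weight vector $\mathbf{w}$ is rewritten in terms of a scalar length $g$ and a direction vector $\mathbf{v}$, both trained by gradient descent,

$$\mathbf{w} = \frac{g}{\lVert \mathbf{v} \rVert}\,\mathbf{v},$$

so the gradient with respect to $\mathbf{v}$ has no component along the current weight direction, which is what improves the conditioning of the problem.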
Understanding the difficulty of training deep feedforward neural networks
The objective is to understand why standard gradient descent from random initialization does so poorly with deep neural networks, in order to explain recent relative successes and help design better algorithms in the future.
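The constructive outcome of that analysis is the normalized ("Xavier") initialization, which keeps activation and back-propagated gradient variances roughly constant across layers; for a layer with fan-in $n_j$ and fan-out $n_{j+1}$:

$$W \sim U\!\left[-\frac{\sqrt{6}}{\sqrt{n_j + n_{j+1}}},\ \frac{\sqrt{6}}{\sqrt{n_j + n_{j+1}}}\right].$$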
Deep Residual Learning for Image Recognition
This work presents a residual learning framework to ease the training of networks that are substantially deeper than those used previously, and provides comprehensive empirical evidence showing that these residual networks are easier to optimize, and can gain accuracy from considerably increased depth.
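The central reformulation: instead of asking a block of layers to fit a desired mapping $\mathcal{H}(\mathbf{x})$ directly, let it fit the residual $\mathcal{F}(\mathbf{x}) := \mathcal{H}(\mathbf{x}) - \mathbf{x}$ and add the input back,

$$\mathbf{y} = \mathcal{F}(\mathbf{x}, \{W_i\}) + \mathbf{x},$$

so an identity mapping is recovered by simply driving $\mathcal{F}$ to zero, which makes very deep networks easier to optimize.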