Corpus ID: 221516279

GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training

@article{GraphNorm2020,
  title={GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training},
  author={Tianle Cai and Sheng-Jie Luo and Keyulu Xu and Di He and T. Liu and Liwei Wang},
  journal={ArXiv},
  year={2020}
}
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • Normalization plays an important role in the optimization of deep neural networks. While there are standard normalization methods in computer vision and natural language processing, there is limited understanding of how to effectively normalize neural networks for graph representation learning. In this paper, we propose a principled normalization method, Graph Normalization (GraphNorm), where the key idea is to normalize the feature values across all nodes for each individual graph with a…
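The abstract describes normalizing feature values across all nodes of each individual graph. A minimal NumPy sketch of that idea follows; the per-feature scale (`gamma`) and shift (`beta`) parameters are assumptions by analogy with standard normalization layers, since the abstract is truncated before the paper's exact parameterization.

```python
import numpy as np

def graph_norm_sketch(x, eps=1e-6, gamma=None, beta=None):
    """Normalize node features across all nodes of a single graph.

    x: (num_nodes, feature_dim) feature matrix for one graph.
    gamma, beta: optional per-feature scale/shift (hypothetical here,
    modeled on standard normalization layers, not the paper's exact form).
    """
    mean = x.mean(axis=0, keepdims=True)      # per-feature mean over nodes
    var = x.var(axis=0, keepdims=True)        # per-feature variance over nodes
    x_hat = (x - mean) / np.sqrt(var + eps)   # standardize each feature column
    if gamma is not None:
        x_hat = gamma * x_hat
    if beta is not None:
        x_hat = x_hat + beta
    return x_hat

# Example: one graph with 4 nodes and 3 features.
x = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0],
              [10.0, 11.0, 12.0]])
out = graph_norm_sketch(x)
# Each feature column now has approximately zero mean and unit variance
# over the nodes of this graph.
```

In a batched GNN setting, this operation would be applied separately to each graph's node set rather than across the whole batch, which is the distinction the abstract draws against standard normalization methods.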
2 Citations

  • Improving Graph Property Prediction with Generalized Readout Functions
  • Normalization Techniques in Training DNNs: Methodology, Analysis and Application

