GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training

@article{Qiu2020GCCGC,
  title={GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training},
  author={Jiezhong Qiu and Qibin Chen and Yuxiao Dong and Jing Zhang and Hongxia Yang and Ming Ding and Kuansan Wang and Jie Tang},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.09963}
}
  • Jiezhong Qiu, Qibin Chen, Yuxiao Dong, Jing Zhang, Hongxia Yang, Ming Ding, Kuansan Wang, Jie Tang
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • Graph representation learning has emerged as a powerful technique for real-world problems. Various downstream graph learning tasks have benefited from its recent developments, such as node classification, similarity search, graph classification, and link prediction. However, prior work on graph representation learning focuses on domain-specific problems and trains a dedicated model for each graph, which is usually not transferable to out-of-domain data. Inspired by recent advances in pre-training…
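
The abstract above is cut off before the method itself, but the title and framing point to contrastive pre-training of a graph neural network encoder, where subgraph instances sampled around the same node serve as positive pairs and other instances in the batch serve as negatives. The sketch below shows a minimal InfoNCE-style contrastive loss over graph-level embeddings in PyTorch; the function name, tensor shapes, and temperature value are illustrative assumptions rather than the paper's exact formulation.

import torch
import torch.nn.functional as F

def info_nce_loss(z_q, z_k, temperature=0.07):
    """InfoNCE loss over a batch of query/key graph embeddings.

    z_q, z_k: [batch, dim] embeddings of two sampled subgraph views
    (e.g., subgraphs drawn from the same node's neighborhood). The
    i-th query's positive is the i-th key; every other key in the
    batch acts as a negative.
    """
    z_q = F.normalize(z_q, dim=1)
    z_k = F.normalize(z_k, dim=1)
    logits = z_q @ z_k.t() / temperature          # [batch, batch] similarity matrix
    labels = torch.arange(z_q.size(0), device=z_q.device)  # positives on the diagonal
    return F.cross_entropy(logits, labels)

# Toy usage: in practice z_q and z_k would come from a shared GNN
# encoder applied to two sampled subgraph views of each node.
if __name__ == "__main__":
    batch, dim = 32, 64
    z_q, z_k = torch.randn(batch, dim), torch.randn(batch, dim)
    print(info_nce_loss(z_q, z_k).item())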

    Citations

    Publications citing this paper.

    Self-supervised Learning: Generative or Contrastive
