Corpus ID: 236447721

CCGL: Contrastive Cascade Graph Learning

  • Xovee Xu, Fan Zhou, Kunpeng Zhang, Siyuan Liu
  • Published 2021
  • Computer Science
  • ArXiv
Supervised learning, while prevalent for information cascade modeling, often requires abundant labeled training data, and the trained model does not easily generalize across tasks and datasets. Semi-supervised learning leverages unlabeled data for cascade understanding in pre-training. It often learns fine-grained feature-level representations, which can easily result in overfitting for downstream tasks. Recently, contrastive self-supervised learning has been designed to alleviate these two…


Strategies for Pre-training Graph Neural Networks
A new strategy and self-supervised methods for pre-training Graph Neural Networks (GNNs) that avoid negative transfer and significantly improve generalization across downstream tasks, yielding up to 9.4% absolute improvement in ROC-AUC over non-pre-trained models and achieving state-of-the-art performance for molecular property prediction and protein function prediction.
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework, is designed to capture universal network topological properties across multiple networks and leverages contrastive learning to empower graph neural networks to learn intrinsic and transferable structural representations.
GPT-GNN: Generative Pre-Training of Graph Neural Networks
The GPT-GNN framework initializes GNNs by generative pre-training, introducing a self-supervised attributed graph generation task so that the pre-trained GNN captures the structural and semantic properties of the graph.
DropEdge: Towards Deep Graph Convolutional Networks on Node Classification
DropEdge is a general technique that can be combined with many backbone models (e.g., GCN, ResGCN, GraphSAGE, and JKNet) and consistently improves performance on a variety of both shallow and deep GCNs.
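The core DropEdge operation is simple: at each training epoch, a random fraction of edges is removed from the graph before message passing, acting as both a data augmenter and a regularizer against over-smoothing. A minimal sketch of the idea (function name and edge-list layout are illustrative, not the paper's reference implementation):

```python
import numpy as np

def drop_edge(edge_index: np.ndarray, drop_rate: float, seed=None):
    """Randomly remove a fraction of edges from an edge list of shape (2, E).

    Each edge is kept independently with probability 1 - drop_rate,
    producing a sparser graph for one training epoch.
    """
    rng = np.random.default_rng(seed)
    num_edges = edge_index.shape[1]
    keep_mask = rng.random(num_edges) >= drop_rate
    return edge_index[:, keep_mask]

# Toy graph with 5 directed edges; roughly half survive at drop_rate=0.5.
edges = np.array([[0, 1, 2, 3, 4],
                  [1, 2, 3, 4, 0]])
sparser = drop_edge(edges, drop_rate=0.5, seed=0)
print(sparser.shape)
```

In practice the sampling is redone every epoch, so the network sees a different random subgraph each time.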
Unsupervised Feature Learning via Non-parametric Instance Discrimination
This work formulates this intuition as a non-parametric classification problem at the instance level, and uses noise-contrastive estimation to tackle the computational challenges imposed by the large number of instance classes.
Deep Graph Infomax
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
Unsupervised Domain Adaptive Graph Convolutional Networks
A novel approach, unsupervised domain adaptive graph convolutional networks (UDA-GCN), for domain adaptation learning for graphs, which jointly exploits local and global consistency for feature aggregation and facilitates knowledge transfer between graphs.
node2vec: Scalable Feature Learning for Networks
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
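The biased walk in node2vec interpolates between breadth-first and depth-first exploration via two parameters: the return parameter p (weight 1/p for stepping back to the previous node) and the in-out parameter q (weight 1/q for moving to nodes farther from it). A sketch of one such walk, assuming an adjacency-dict graph representation (names and structure here are illustrative, not the authors' implementation):

```python
import random

def node2vec_walk(adj, start, length, p=1.0, q=1.0, rng=random):
    """Generate one biased random walk in the spirit of node2vec.

    adj maps node -> set of neighbors. Transition weights depend on the
    candidate's distance to the *previous* node in the walk:
    distance 0 (return) -> 1/p, distance 1 -> 1, distance 2 -> 1/q.
    """
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = sorted(adj[cur])
        if not nbrs:
            break  # dead end
        if len(walk) == 1:
            walk.append(rng.choice(nbrs))  # first step: uniform
            continue
        prev = walk[-2]
        weights = []
        for nxt in nbrs:
            if nxt == prev:             # distance 0: return to previous node
                weights.append(1.0 / p)
            elif nxt in adj[prev]:      # distance 1: stays close
                weights.append(1.0)
            else:                       # distance 2: moves outward
                weights.append(1.0 / q)
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk

# Small p encourages backtracking (BFS-like); small q encourages
# exploration away from the start (DFS-like).
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(node2vec_walk(adj, 0, 6, p=0.5, q=2.0, rng=random.Random(0)))
```

In the full algorithm, many such walks per node are fed to a skip-gram model to learn the embeddings.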
Representation Learning with Contrastive Predictive Coding
This work proposes a universal unsupervised learning approach to extract useful representations from high-dimensional data, which it calls Contrastive Predictive Coding, and demonstrates that the approach is able to learn useful representations achieving strong performance on four distinct domains: speech, images, text and reinforcement learning in 3D environments.
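At the heart of Contrastive Predictive Coding (and of the other contrastive methods listed here) is the InfoNCE loss: score a positive sample against a set of negatives and minimize the cross-entropy of identifying the positive. A didactic single-anchor sketch with cosine-similarity scoring (the CPC paper itself scores future latents with a learned bilinear map, and the symbols below are assumptions for illustration):

```python
import numpy as np

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE loss for one anchor: -log softmax score of the positive.

    All vectors are L2-normalized so the dot product is cosine similarity;
    lower temperature sharpens the distinction between candidates.
    """
    def norm(v):
        return v / np.linalg.norm(v)

    # Candidate 0 is the positive; the rest are negatives.
    candidates = np.stack([norm(positive)] + [norm(n) for n in negatives])
    logits = candidates @ norm(anchor) / temperature
    log_probs = logits - np.log(np.exp(logits).sum())
    return -log_probs[0]

anchor = np.array([1.0, 0.0])
loss = info_nce(anchor, positive=np.array([1.0, 0.0]),
                negatives=[np.array([0.0, 1.0]), np.array([-1.0, 0.0])])
print(loss)  # near zero: the positive aligns with the anchor
```

When the positive is similar to the anchor and the negatives are not, the loss approaches zero; a misaligned positive drives it up, which is exactly the gradient signal contrastive pre-training exploits.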
Contrastive Multi-View Representation Learning on Graphs
We introduce a self-supervised approach for learning node and graph level representations by contrasting structural views of graphs. We show that unlike visual representation learning, increasing the…