Iterative Graph Self-Distillation
@article{Zhang2020IterativeGS, title={Iterative Graph Self-Distillation}, author={Hanlin Zhang and Shuai Lin and Weiyang Liu and Pan Zhou and Jian Tang and Xiaodan Liang and Eric P. Xing}, journal={ArXiv}, year={2020}, volume={abs/2010.12609} }
How to discriminatively vectorize graphs is a fundamental challenge that has attracted increasing attention in recent years. Inspired by the recent success of unsupervised contrastive learning, we aim to learn graph-level representations in an unsupervised manner. Specifically, we propose a novel unsupervised graph learning paradigm called Iterative Graph Self-Distillation (IGSD) which iteratively performs teacher-student distillation with graph augmentations. Different from conventional…
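A minimal sketch of the iterative teacher-student scheme the abstract describes, assuming a BYOL-style setup in which the teacher is an exponential moving average (EMA) of the student and the two networks see different augmented views of the same graph; the predictor and the cosine-based consistency loss here are illustrative placeholders, not the authors' exact design.

```python
import copy
import torch
import torch.nn.functional as F

def make_teacher(student):
    # The teacher starts as a frozen copy of the student and is only
    # updated via the EMA rule below, never by gradients.
    teacher = copy.deepcopy(student)
    for p in teacher.parameters():
        p.requires_grad_(False)
    return teacher

def ema_update(teacher, student, momentum=0.99):
    # Teacher weights slowly track an exponential moving average of the student.
    with torch.no_grad():
        for t, s in zip(teacher.parameters(), student.parameters()):
            t.mul_(momentum).add_(s, alpha=1.0 - momentum)

def distillation_step(student, teacher, predictor, view_a, view_b, optimizer):
    # Student encodes one augmented view; the teacher encodes the other.
    z_s = predictor(student(view_a))
    with torch.no_grad():
        z_t = teacher(view_b)
    # Consistency loss: cosine distance between the two embeddings.
    loss = 2.0 - 2.0 * F.cosine_similarity(z_s, z_t, dim=-1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    ema_update(teacher, student)
    return loss.item()
```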
22 Citations
On Representation Knowledge Distillation for Graph Neural Networks
- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
- 2022
Experiments show that G-CRD consistently boosts the performance and robustness of lightweight GNNs, outperforming LSP (and a global structure preserving (GSP) variant of LSP) as well as baselines from 2-D computer vision.
Graph Contrastive Learning Automated
- Computer Science · ICML
- 2021
A unified bilevel optimization framework is proposed to automatically, adaptively, and dynamically select data augmentations when performing GraphCL on specific graph data, along with a new augmentation-aware projection head mechanism that routes output features through different projection heads corresponding to the augmentations chosen at each training step.
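The augmentation-aware projection head is simple to picture in code. A sketch assuming the encoder output is routed by the name of the augmentation applied to each view; the two-layer MLP heads and the augmentation names are placeholders, not the paper's exact configuration.

```python
import torch.nn as nn

class AugAwareProjection(nn.Module):
    """Route encoder outputs through a projection head chosen per augmentation."""
    def __init__(self, dim, aug_types=("node_drop", "edge_perturb", "subgraph", "attr_mask")):
        super().__init__()
        self.heads = nn.ModuleDict({
            name: nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
            for name in aug_types
        })

    def forward(self, h, aug_name):
        # h: encoder output for a view produced by augmentation `aug_name`.
        return self.heads[aug_name](h)
```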
From Canonical Correlation Analysis to Self-supervised Graph Neural Networks
- Computer Science · NeurIPS
- 2021
A conceptually simple yet effective model for self-supervised representation learning on graph data is introduced, which discards augmentation-variant information by learning invariant representations and prevents degenerate solutions by decorrelating features in different dimensions.
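The described objective has a compact two-term form: an invariance term pulling the two views together and a decorrelation term that whitens each view's feature dimensions. A sketch assuming standardized embeddings and a trade-off weight `lambd`; both specifics are illustrative.

```python
import torch

def cca_ssg_loss(z1, z2, lambd=1e-3):
    # Standardize each embedding dimension across the batch of N nodes/graphs.
    n = z1.size(0)
    z1 = (z1 - z1.mean(0)) / z1.std(0)
    z2 = (z2 - z2.mean(0)) / z2.std(0)
    # Invariance term: the two augmented views should agree.
    inv = (z1 - z2).pow(2).sum() / n
    # Decorrelation term: per-view correlation matrix pushed toward the identity,
    # which prevents the degenerate solution where all dimensions collapse.
    c1 = (z1.T @ z1) / n
    c2 = (z2.T @ z2) / n
    eye = torch.eye(c1.size(0), device=z1.device)
    dec = (c1 - eye).pow(2).sum() + (c2 - eye).pow(2).sum()
    return inv + lambd * dec
```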
Simple Unsupervised Graph Representation Learning
- Computer Science · AAAI
- 2022
The proposed multiplet loss explores the complementary information between structural information and neighbor information to enlarge the inter-class variation, and adds an upper-bound loss that keeps positive embeddings within a finite distance of anchor embeddings to reduce the intra-class variation.
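A rough sketch of how a multiplet-style loss with an upper bound might look, assuming squared Euclidean distances, one set of structural positives, one set of neighbor positives, a triplet margin, and a distance cap; none of these specifics are taken from the paper itself.

```python
import torch
import torch.nn.functional as F

def multiplet_loss(anchor, pos_struct, pos_neigh, neg, margin=1.0, upper=2.0):
    # Squared Euclidean distances from anchors to each embedding set.
    d_ps = (anchor - pos_struct).pow(2).sum(-1)   # structural positives
    d_pn = (anchor - pos_neigh).pow(2).sum(-1)    # neighbor positives
    d_n = (anchor - neg).pow(2).sum(-1)           # negatives
    # Triplet terms enlarge inter-class variation using both kinds of positives.
    tri = F.relu(d_ps - d_n + margin).mean() + F.relu(d_pn - d_n + margin).mean()
    # Upper-bound terms keep positives within a finite distance of the anchor,
    # reducing intra-class variation.
    ub = F.relu(d_ps - upper).mean() + F.relu(d_pn - upper).mean()
    return tri + ub
```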
Graph Self-Supervised Learning: A Survey
- Computer Science · IEEE Transactions on Knowledge and Data Engineering
- 2022
A timely and comprehensive review of existing approaches that employ SSL techniques for graph data, together with a unified framework that mathematically formalizes the paradigm of graph SSL.
Features Based Adaptive Augmentation for Graph Contrastive Learning
- Computer Science · ArXiv
- 2022
A feature-based adaptive augmentation approach is proposed, which identifies and preserves potentially influential features while corrupting the remaining ones, improving the accuracy of GRACE and BGRL on eight graph representation learning benchmark datasets.
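A sketch of the preserve-influential/corrupt-the-rest idea, assuming a precomputed per-feature importance score and Gaussian noise as the corruption; both the scoring and the corruption operator here are stand-ins for whatever the paper actually uses.

```python
import torch

def adaptive_feature_mask(x, importance, keep_ratio=0.5, noise_std=1.0):
    """Preserve the most influential feature columns of x (N, F) and corrupt
    the rest. `importance` is a per-feature score of shape (F,); how it is
    computed is the method's contribution and is not shown here."""
    k = int(keep_ratio * x.size(1))
    keep = torch.topk(importance, k).indices
    mask = torch.zeros(x.size(1), dtype=torch.bool, device=x.device)
    mask[keep] = True
    corrupted = x + noise_std * torch.randn_like(x)  # illustrative corruption
    # Broadcast the (F,) mask over the batch dimension.
    return torch.where(mask, x, corrupted)
```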
Graph Representation Learning Through Recoverability
- Computer Science
- 2022
This work proposes a self-supervised graph representation learning algorithm, Graph Information Representation Learning (GIRL), based on an alternative information metric, recoverability, which is closely related to mutual information but simpler to estimate.
JGCL: Joint Self-Supervised and Supervised Graph Contrastive Learning
- Computer Science · WWW
- 2022
This work proposes joint self-supervised and supervised graph contrastive learning (JGCL) to capture the mutual benefits of both learning strategies, and demonstrates that JGCL and its variants are among the best performers across various proportions of labeled data when compared with state-of-the-art self-supervised, unsupervised, and semi-supervised methods on various benchmark graphs.
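A sketch of a joint objective combining a self-supervised InfoNCE term over all nodes with a supervised cross-entropy term over the labeled subset; the specific loss forms and the fixed mixing weight `alpha` are assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def joint_loss(z1, z2, logits, labels, labeled_mask, alpha=0.5, temp=0.5):
    # Self-supervised part: contrast two augmented views of every node.
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.T / temp
    targets = torch.arange(z1.size(0), device=z1.device)
    ssl = F.cross_entropy(sim, targets)
    # Supervised part: standard cross-entropy on the labeled nodes only.
    sup = F.cross_entropy(logits[labeled_mask], labels[labeled_mask])
    return alpha * ssl + (1 - alpha) * sup
```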
Automated Graph Self-supervised Learning via Multi-teacher Knowledge Distillation
- Computer Science · ArXiv
- 2022
This paper proposes a novel multi-teacher knowledge distillation framework for Automated Graph Self-Supervised Learning (AGSSL), which consists of two main branches: Knowledge Extraction and Knowledge Integration.
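Knowledge Integration can be pictured as distilling a combination of teacher predictions into the student. A sketch with fixed integration weights and a temperature-scaled KL term; learning those weights is the paper's actual contribution and is not shown here.

```python
import torch
import torch.nn.functional as F

def multi_teacher_kd(student_logits, teacher_logits_list, weights, tau=2.0):
    # Integrate the teachers as a weighted average of their softened outputs.
    with torch.no_grad():
        probs = [F.softmax(t / tau, dim=-1) for t in teacher_logits_list]
        target = sum(w * p for w, p in zip(weights, probs))
    # Distill the integrated distribution into the student.
    log_student = F.log_softmax(student_logits / tau, dim=-1)
    return F.kl_div(log_student, target, reduction="batchmean") * tau * tau
```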
Graph Representation Learning via Aggregation Enhancement
- Computer Science
- 2022
This work uses KR loss as the primary loss in self-supervised settings or as a regularization term in supervised settings, highlighting the potential of KR to advance the field of graph representation learning and enhance the performance of GNNs.
References
Showing 1-10 of 56 references
Graph Barlow Twins: A self-supervised representation learning framework for graphs
- Computer Science · Knowl. Based Syst.
- 2022
Graph Contrastive Learning with Augmentations
- Computer Science · NeurIPS
- 2020
The results show that, even without tuning augmentation extents or using sophisticated GNN architectures, the GraphCL framework can produce graph representations of similar or better generalizability, transferability, and robustness compared to state-of-the-art methods.
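For reference, the contrastive objective GraphCL optimizes between two augmented views of each graph is the NT-Xent loss; a minimal batched sketch (the temperature value is illustrative):

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temp=0.5):
    """NT-Xent between graph embeddings of two augmented views: matching views
    are positives, all other graphs in the batch serve as negatives."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=-1)
    sim = z @ z.T / temp
    sim.fill_diagonal_(float("-inf"))  # exclude self-similarity
    # Row i's positive is row i+n (and vice versa).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```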
Deep Graph Infomax
- Computer Science · ICLR
- 2019
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
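DGI's objective is small enough to sketch: a bilinear discriminator contrasts node (patch) embeddings against a readout summary of the real graph, scoring real nodes high and nodes from a corrupted graph low. The mean readout and sigmoid follow the common formulation; treat the details as a sketch.

```python
import torch
import torch.nn.functional as F

def dgi_loss(node_emb, corrupt_emb, W):
    """node_emb: (N, d) embeddings of the real graph; corrupt_emb: (N, d)
    embeddings of a corrupted graph; W: (d, d) bilinear discriminator weight."""
    summary = torch.sigmoid(node_emb.mean(0))   # graph-level readout
    pos = node_emb @ W @ summary                # scores for real nodes
    neg = corrupt_emb @ W @ summary             # scores for corrupted nodes
    labels = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)])
    return F.binary_cross_entropy_with_logits(torch.cat([pos, neg]), labels)
```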
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
- Computer Science · KDD
- 2020
Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework, is designed to capture universal network topological properties across multiple networks, leveraging contrastive learning to empower graph neural networks to learn intrinsic and transferable structural representations.
Contrastive Multi-View Representation Learning on Graphs
- Computer Science · ICML
- 2020
We introduce a self-supervised approach for learning node and graph level representations by contrasting structural views of graphs. We show that unlike visual representation learning, increasing the…
graph2vec: Learning Distributed Representations of Graphs
- Computer Science · ArXiv
- 2017
This work proposes a neural embedding framework named graph2vec to learn data-driven distributed representations of arbitrarily sized graphs; it achieves significant improvements in classification and clustering accuracies over substructure representation learning approaches and is competitive with state-of-the-art graph kernels.
When Does Self-Supervision Help Graph Convolutional Networks?
- Computer Science · ICML
- 2020
The first systematic exploration and assessment of incorporating self-supervision into graph convolutional networks (GCNs); results show that, with properly designed task forms and incorporation mechanisms, self-supervision helps GCNs gain greater generalizability and robustness.
Strategies for Pre-training Graph Neural Networks
- Computer Science · ICLR
- 2020
A new strategy and self-supervised methods for pre-training Graph Neural Networks (GNNs) that avoid negative transfer and improve generalization significantly across downstream tasks, yielding up to 9.4% absolute improvement in ROC-AUC over non-pre-trained models and achieving state-of-the-art performance for molecular property prediction and protein function prediction.
How Powerful are Graph Neural Networks?
- Computer Science · ICLR
- 2019
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
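The provably most expressive architecture referenced here is GIN, whose update rule is compact enough to sketch (dense adjacency is used for brevity; the learnable epsilon variant is shown):

```python
import torch
import torch.nn as nn

class GINLayer(nn.Module):
    """One GIN update: h_v <- MLP((1 + eps) * h_v + sum of neighbor features).
    The injective sum aggregator is what gives the layer WL-test expressiveness."""
    def __init__(self, dim):
        super().__init__()
        self.eps = nn.Parameter(torch.zeros(1))
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, h, adj):
        # adj: dense (N, N) adjacency matrix; adj @ h sums neighbor features.
        return self.mlp((1 + self.eps) * h + adj @ h)
```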
Data Augmentation for Graph Neural Networks
- Computer Science · AAAI
- 2021
This work shows that neural edge predictors can effectively encode class-homophilic structure to promote intra-class edges and demote inter-class edges in a given graph structure, and introduces the GAug graph data augmentation framework, which leverages these insights to improve performance in GNN-based node classification via edge prediction.
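A sketch of the edge-prediction-based augmentation described above, assuming a dense symmetric adjacency matrix without self-loops and a precomputed matrix of edge probabilities from some trained edge predictor; the add/drop ratios and the top-k selection are illustrative choices, not the paper's exact procedure.

```python
import torch

def augment_edges(edge_probs, adj, add_ratio=0.1, drop_ratio=0.1):
    """Promote likely (presumably intra-class) missing edges and demote
    unlikely existing ones, given (N, N) edge probabilities `edge_probs`."""
    no_edge = (adj == 0).triu(1)
    has_edge = (adj > 0).triu(1)
    n_edges = int(has_edge.sum())
    # Add the highest-probability missing edges.
    cand_add = torch.where(no_edge, edge_probs, torch.zeros_like(edge_probs))
    add_idx = torch.topk(cand_add.flatten(), int(add_ratio * n_edges)).indices
    # Drop the lowest-probability existing edges.
    cand_drop = torch.where(has_edge, edge_probs, torch.ones_like(edge_probs))
    drop_idx = torch.topk(-cand_drop.flatten(), int(drop_ratio * n_edges)).indices
    # Edit the upper triangle only, then symmetrize to keep the graph undirected.
    upper = adj.triu(1).clone()
    upper.view(-1)[add_idx] = 1.0
    upper.view(-1)[drop_idx] = 0.0
    return upper + upper.T
```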