Corpus ID: 219531264

Deep Graph Contrastive Representation Learning

@article{Zhu2020DeepGC,
  title={Deep Graph Contrastive Representation Learning},
  author={Yanqiao Zhu and Yichen Xu and Feng Yu and Q. Liu and Shu Wu and Liang Wang},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.04131}
}
Graph representation learning has become fundamental to analyzing graph-structured data. Inspired by the recent success of contrastive methods, in this paper we propose a novel framework for unsupervised graph representation learning that leverages a contrastive objective at the node level. Specifically, we generate two graph views by corruption and learn node representations by maximizing the agreement of node representations across these two views. To provide diverse node contexts for the… 
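The node-level contrastive objective described in the abstract can be sketched roughly as follows. This is an illustrative stand-in, not the authors' implementation: random vectors take the place of embeddings produced by a shared GNN encoder on the two corrupted views, and the loss shape (positives are cross-view counterparts, negatives are all other nodes in both views) follows the description above.

```python
# Minimal sketch of a node-level contrastive objective between two views.
# z1, z2 stand in for encoder outputs on two corrupted views of the graph.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, tau = 4, 8, 0.5   # tau is a temperature hyperparameter

z1 = rng.normal(size=(n_nodes, dim))  # view-1 node embeddings (stand-in)
z2 = rng.normal(size=(n_nodes, dim))  # view-2 node embeddings (stand-in)

def normalize(z):
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def info_nce(za, zb, tau):
    """Each node in view a treats its counterpart in view b as the
    positive; all other nodes in both views act as negatives."""
    za, zb = normalize(za), normalize(zb)
    inter = np.exp(za @ zb.T / tau)   # cross-view cosine similarities
    intra = np.exp(za @ za.T / tau)   # same-view cosine similarities
    pos = np.diag(inter)
    # denominator: all cross-view pairs plus same-view pairs except self
    denom = inter.sum(axis=1) + intra.sum(axis=1) - np.diag(intra)
    return -np.log(pos / denom).mean()

# symmetrize over the two views
loss = 0.5 * (info_nce(z1, z2, tau) + info_nce(z2, z1, tau))
print(float(loss))
```

With untrained (random) embeddings the loss is simply some positive number; training the encoder to minimize it pulls cross-view counterparts together and pushes other nodes apart.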

Figures and Tables from this paper

Citations

Dual Contrastive Attributed Graph Clustering Network
TLDR
Extensive experimental results on four attributed graph datasets show the superiority of DCAGC compared with 16 state-of-the-art clustering methods.
Molecular Graph Contrastive Learning with Parameterized Explainable Augmentations
TLDR
Deep neural networks are applied to parameterize the augmentation process for both the molecular graph topology and atom attributes, to highlight contributive molecular substructures and recognize underlying chemical semantemes.
Self-supervised Consensus Representation Learning for Attributed Graph
TLDR
This work proposes a novel Self-supervised Consensus Representation Learning (SCRL) framework, which treats graph from two perspectives: topology graph and feature graph, and argues that their embeddings should share some common information, which could serve as a supervisory signal.
Simple Unsupervised Graph Representation Learning
TLDR
The proposed multiplet loss explores the complementary information between the structural information and neighbor information to enlarge the inter-class variation, and adds an upper-bound loss to achieve a finite distance between positive embeddings and anchor embeddings, reducing the intra-class variation.
NCAGC: A Neighborhood Contrast Framework for Attributed Graph Clustering
TLDR
A Neighborhood Contrast Framework for Attributed Graph Clustering, namely NCAGC, which seeks to overcome the aforementioned limitations of existing methods.
COSTA: Covariance-Preserving Feature Augmentation for Graph Contrastive Learning
TLDR
It is shown that node embeddings obtained via graph augmentations are highly biased, somewhat limiting contrastive models from learning discriminative features for downstream tasks, and COSTA, a novel COvariance-preServing feaTure space Augmentation framework, is proposed, which generates augmented features by maintaining a “good sketch” of the original features.
I'm Me, We're Us, and I'm Us: Tri-directional Contrastive Learning on Hypergraphs
TLDR
This paper proposes TriCon (Tri-directional Contrastive learning), a general framework for contrastive learning on hypergraphs that consistently outperforms not just unsupervised competitors but also (semi-) supervised competitors mostly by significant margins for node classification.
GraphAdaMix: Enhancing Node Representations with Graph Adaptive Mixtures
TLDR
Graph Adaptive Mixtures is presented, a novel approach for learning node representations in a graph by introducing multiple independent GNN models and a trainable mixture distribution for each node, which is demonstrated to consistently boost state-of-the-art GNN variants in semi-supervised and unsupervised node classification tasks.
Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning
TLDR
This work develops for the first time a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views guided respectively by node attributes and graph topologies, and integrates and enhances them via a reciprocally contrastive mechanism to better model heterogeneous graphs.
GraFN: Semi-Supervised Node Classification on Graph with Few Labels via Non-Parametric Distribution Assignment
TLDR
A novel semi-supervised method for graphs, GraFN, that leverages few labeled nodes to ensure that nodes belonging to the same class are grouped together, thereby achieving the best of both worlds of semi-supervised and self-supervised methods.
...

References

SHOWING 1-10 OF 53 REFERENCES
On Mutual Information Maximization for Representation Learning
TLDR
This paper argues, and provides empirical evidence, that the success of these methods cannot be attributed to the properties of MI alone, and that they strongly depend on the inductive bias in both the choice of feature extractor architectures and the parametrization of the employed MI estimators.
Deep Graph Infomax
TLDR
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
Representation Learning with Contrastive Predictive Coding
TLDR
This work proposes a universal unsupervised learning approach to extract useful representations from high-dimensional data, which it calls Contrastive Predictive Coding, and demonstrates that the approach is able to learn useful representations achieving strong performance on four distinct domains: speech, images, text and reinforcement learning in 3D environments.
Representation Learning on Graphs: Methods and Applications
TLDR
A conceptual review of key advancements in this area of representation learning on graphs is provided, covering matrix factorization-based methods, random-walk based algorithms, and graph neural networks.
Semi-Supervised Classification with Graph Convolutional Networks
TLDR
A scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks which operate directly on graphs, and which outperforms related methods by a significant margin.
node2vec: Scalable Feature Learning for Networks
TLDR
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
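The biased random walk summarized above can be illustrated with a toy sketch (not the node2vec reference implementation): the return parameter p and in-out parameter q reweight each candidate step according to its distance from the previously visited node.

```python
# Toy sketch of node2vec's biased second-order random walk.
# p penalizes returning to the previous node; q trades off local
# (BFS-like) versus outward (DFS-like) exploration.
import random

graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}  # toy adjacency list

def biased_walk(graph, start, length, p=1.0, q=0.5, seed=0):
    rng = random.Random(seed)
    walk = [start, rng.choice(graph[start])]
    while len(walk) < length:
        prev, cur = walk[-2], walk[-1]
        weights = []
        for nxt in graph[cur]:
            if nxt == prev:              # distance 0 from prev: return step
                weights.append(1.0 / p)
            elif nxt in graph[prev]:     # distance 1: stay in neighborhood
                weights.append(1.0)
            else:                        # distance 2: move outward
                weights.append(1.0 / q)
        walk.append(rng.choices(graph[cur], weights=weights)[0])
    return walk

print(biased_walk(graph, 0, 6))
```

Walks sampled this way are then fed to a skip-gram-style objective to learn the node embeddings.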
A Simple Framework for Contrastive Learning of Visual Representations
TLDR
It is shown that the composition of data augmentations plays a critical role in defining effective predictive tasks, that introducing a learnable nonlinear transformation between the representation and the contrastive loss substantially improves the quality of the learned representations, and that contrastive learning benefits from larger batch sizes and more training steps compared to supervised learning.
GaAN: Gated Attention Networks for Learning on Large and Spatiotemporal Graphs
TLDR
The effectiveness of GaAN on the inductive node classification problem is demonstrated, and the Graph Gated Recurrent Unit (GGRU) is constructed with GaAN as a building block to address the traffic speed forecasting problem.
Collective Classification in Network Data
TLDR
This article introduces four of the most widely used inference algorithms for classifying networked data and empirically compare them on both synthetic and real-world data.
Deep Gaussian Embedding of Graphs: Unsupervised Inductive Learning via Ranking
TLDR
Graph2Gauss is proposed - an approach that can efficiently learn versatile node embeddings on large scale (attributed) graphs that show strong performance on tasks such as link prediction and node classification and the benefits of modeling uncertainty are demonstrated.
...