Geometric Graph Representation Learning via Maximizing Rate Reduction

@article{han2022geometric,
  title={Geometric Graph Representation Learning via Maximizing Rate Reduction},
  author={Xiaotian Han and Zhimeng Jiang and Ninghao Liu and Qingquan Song and Jundong Li and Xia Hu},
  journal={Proceedings of the ACM Web Conference 2022},
  year={2022}
}
Learning discriminative node representations benefits various downstream tasks in graph analysis, such as community detection and node classification. Existing graph representation learning methods (e.g., those based on random walks or contrastive learning) are limited to maximizing the local similarity of connected nodes. Such pair-wise learning schemes can fail to capture the global distribution of representations, since they place no explicit constraints on the global geometric properties of…


SCGG: A deep structure-conditioned graph generative model

This paper presents a conditional deep graph generation method, SCGG, that handles a particular type of structural condition and can thereby address graph completion: the pervasive and inherently difficult problem of recovering missing nodes and their associated edges in partially observed graphs.

Tutorial on Deep Learning Interpretation: A Data Perspective

This tutorial introduces recent explanation methods from a data perspective, targeting models that process image, text, and graph data. The goal is to reduce a model's opacity by explaining its behavior, its predictions, or both, thereby building trust between humans and complex deep learning models.


This work proposes Generalized Demographic Parity (GDP), a group fairness metric for both continuous and discrete attributes, develops an understanding of GDP from a probabilistic perspective, and theoretically reveals the connection between the GDP regularizer and adversarial debiasing.

G-Mixup: Graph Data Augmentation for Graph Classification

This work proposes G-Mixup to augment graphs for graph classification by interpolating the generators (i.e., graphons) of different classes of graphs: it first uses graphs within the same class to estimate a graphon, then interpolates the graphons of different classes in Euclidean space to obtain mixed graphons.
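The interpolation step can be sketched as follows, representing each estimated graphon as a step-function matrix and sampling a synthetic graph from the mixture. The function name `g_mixup` and the sampling details here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def g_mixup(graphon_a, graphon_b, lam, n_nodes, rng=None):
    """Sketch of the G-Mixup idea: mix two class-level graphons
    (given as K x K step-function matrices) and sample a graph."""
    rng = np.random.default_rng(rng)
    # Interpolate the two graphons in Euclidean space.
    W = lam * graphon_a + (1 - lam) * graphon_b
    # Draw latent node positions in [0, 1] and map them to graphon blocks.
    K = W.shape[0]
    u = rng.random(n_nodes)
    idx = np.minimum((u * K).astype(int), K - 1)
    # Edge probabilities for every node pair, then Bernoulli sampling.
    P = W[np.ix_(idx, idx)]
    A = (rng.random((n_nodes, n_nodes)) < P).astype(int)
    A = np.triu(A, 1)
    return A + A.T  # symmetric adjacency, no self-loops
```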



Deep Gaussian Embedding of Graphs: Unsupervised Inductive Learning via Ranking

Graph2Gauss is proposed, an approach that can efficiently learn versatile node embeddings on large-scale (attributed) graphs. The embeddings show strong performance on tasks such as link prediction and node classification, and the benefits of modeling uncertainty are demonstrated.

Graph Representation Learning via Graphical Mutual Information Maximization

An unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder is developed; it outperforms state-of-the-art unsupervised counterparts and sometimes even exceeds the performance of supervised ones.

Deep Graph Infomax

Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.

Deep Graph Contrastive Representation Learning

This paper proposes a novel framework for unsupervised graph representation learning by leveraging a contrastive objective at the node level, and generates two graph views by corruption and learns node representations by maximizing the agreement of node representations in these two views.

Graph U-Nets

This work considers the problem of representation learning for graph data, proposes attention-based pooling and unpooling layers that better capture graph topology information, and uses them to develop an encoder-decoder model known as the graph U-Net.

Inductive Representation Learning on Large Graphs

GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
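The neighborhood-aggregation idea behind GraphSAGE can be sketched as a single layer with a mean aggregator. This is a minimal NumPy sketch under assumed shapes; the function and weight names (`graphsage_mean_layer`, `W_self`, `W_neigh`) and the L2 normalization step are illustrative, not the reference implementation:

```python
import numpy as np

def graphsage_mean_layer(H, adj, W_self, W_neigh):
    """One GraphSAGE-style layer with a mean aggregator (sketch).
    H: (n, d) node features; adj: dict node -> list of neighbor ids;
    W_self, W_neigh: (d, d_out) weight matrices."""
    n, d = H.shape
    out = np.zeros((n, W_self.shape[1]))
    for v in range(n):
        neigh = adj.get(v, [])
        # Mean of neighbor features (zeros if the node is isolated).
        h_neigh = H[neigh].mean(axis=0) if neigh else np.zeros(d)
        # Combine self and aggregated neighbor information.
        h = H[v] @ W_self + h_neigh @ W_neigh
        h = np.maximum(h, 0.0)  # ReLU nonlinearity
        norm = np.linalg.norm(h)
        out[v] = h / norm if norm > 0 else h  # L2-normalize embeddings
    return out
```

Because the layer only needs a node's sampled neighborhood, it generalizes to nodes unseen during training, which is what makes the framework inductive.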

Learning Graph Representations with Embedding Propagation

Embedding Propagation is an unsupervised learning framework for graph-structured data with significantly fewer parameters and hyperparameters; it is competitive with, and often outperforms, state-of-the-art unsupervised and semi-supervised learning methods on a range of benchmark data sets.

node2vec: Scalable Feature Learning for Networks

In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
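The biased second-order walk at the core of node2vec can be sketched as follows: the return parameter p and in-out parameter q reweight the step back to the previous node, steps to common neighbors, and steps further away. This is a minimal sketch for unweighted graphs with illustrative function names:

```python
import random

def biased_next_step(graph, prev, cur, p=1.0, q=1.0):
    """One step of a node2vec-style 2nd-order biased walk.
    graph: dict node -> set of neighbors (unweighted sketch)."""
    neighbors = list(graph[cur])
    weights = []
    for x in neighbors:
        if x == prev:
            weights.append(1.0 / p)  # return to the previous node
        elif x in graph[prev]:
            weights.append(1.0)      # distance 1 from prev (BFS-like, local)
        else:
            weights.append(1.0 / q)  # distance 2 from prev (DFS-like, outward)
    return random.choices(neighbors, weights=weights, k=1)[0]

def node2vec_walk(graph, start, length, p=1.0, q=1.0):
    """Generate one walk of the given length from `start`."""
    walk = [start]
    if not graph[start]:
        return walk
    walk.append(random.choice(list(graph[start])))  # first step is uniform
    while len(walk) < length:
        walk.append(biased_next_step(graph, walk[-2], walk[-1], p, q))
    return walk
```

Small q biases walks outward (capturing structural roles), while small p keeps walks local (capturing communities); the walks are then fed to a skip-gram objective to learn embeddings.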

Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction

Empirically, the representations learned using this principle alone are significantly more robust to label corruptions in classification than those using cross-entropy, and can lead to state-of-the-art results in clustering mixed data from self-learned invariant features.
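The maximal coding rate reduction (MCR^2) objective maximizes the difference between the coding rate of all representations and the sum of per-class coding rates, ΔR = R(Z, ε) − R_c(Z, ε | Π). A minimal NumPy sketch of these quantities, with illustrative function names and a toy distortion ε, is:

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z, eps): rate to encode the n columns of Z (d x n)
    up to distortion eps, via log det(I + d/(n eps^2) Z Z^T)."""
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + d / (n * eps**2) * Z @ Z.T)[1]

def coding_rate_per_class(Z, labels, eps=0.5):
    """R_c: rate when each class's columns are encoded separately."""
    d, n = Z.shape
    rate = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]
        nc = Zc.shape[1]
        rate += (nc / (2 * n)) * np.linalg.slogdet(
            np.eye(d) + d / (nc * eps**2) * Zc @ Zc.T)[1]
    return rate

def rate_reduction(Z, labels, eps=0.5):
    """Delta R = R(Z) - R_c(Z): the quantity MCR^2 maximizes."""
    return coding_rate(Z, eps) - coding_rate_per_class(Z, labels, eps)
```

Maximizing ΔR expands the representations as a whole (large R) while compressing each class into a low-dimensional subspace (small R_c), which is what yields diverse yet discriminative features.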

A Comprehensive Survey on Graph Neural Networks

This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.