Geometric Graph Representation Learning via Maximizing Rate Reduction
@article{Han2022GeometricGR,
  title   = {Geometric Graph Representation Learning via Maximizing Rate Reduction},
  author  = {Xiaotian Han and Zhimeng Jiang and Ninghao Liu and Qingquan Song and Jundong Li and Xia Hu},
  journal = {Proceedings of the ACM Web Conference 2022},
  year    = {2022}
}
Learning discriminative node representations benefits various downstream tasks in graph analysis, such as community detection and node classification. Existing graph representation learning methods (e.g., those based on random walks and contrastive learning) are limited to maximizing the local similarity of connected nodes. Such pair-wise learning schemes can fail to capture the global distribution of representations, since they impose no explicit constraints on the global geometric properties of…
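The abstract above is truncated; the objective the title refers to builds on the principle of maximal coding rate reduction (Yu et al., 2020, listed under References below). As a hedged sketch of that principle, for node representations Z (a d × m matrix, one column per node) partitioned into k groups by membership matrices Π_j, whose graph-based construction is not recoverable from the excerpt above:

```latex
% Coding rate of the whole representation set (a lossy-coding length estimate)
R(\mathbf{Z}, \epsilon) = \frac{1}{2}\log\det\Big(\mathbf{I} + \frac{d}{m\epsilon^{2}}\,\mathbf{Z}\mathbf{Z}^{\top}\Big)

% Rate reduction: expand the whole set while compressing each group
\Delta R(\mathbf{Z}, \boldsymbol{\Pi}, \epsilon)
  = R(\mathbf{Z}, \epsilon)
  - \sum_{j=1}^{k} \frac{\operatorname{tr}(\boldsymbol{\Pi}_{j})}{2m}
    \log\det\Big(\mathbf{I} + \frac{d}{\operatorname{tr}(\boldsymbol{\Pi}_{j})\,\epsilon^{2}}\,\mathbf{Z}\boldsymbol{\Pi}_{j}\mathbf{Z}^{\top}\Big)
```

Maximizing ΔR encourages representations of different groups to span (near-)orthogonal subspaces while keeping each group compact, which is the kind of global geometric constraint the abstract contrasts with purely local pair-wise objectives.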
4 Citations
SCGG: A deep structure-conditioned graph generative model
- Computer Science · PLoS ONE
- 2022
This paper presents a conditional deep graph generation method called SCGG that considers a particular type of structural conditions, and can address graph completion, a rampant and inherently difficult problem of recovering missing nodes and their associated edges of partially observed graphs.
Tutorial on Deep Learning Interpretation: A Data Perspective
- Computer Science · CIKM
- 2022
This tutorial introduces recent explanation methods from a data perspective, targeting models that process image data, text data, and graph data, respectively. The goal is to reduce the opacity of a model by explaining its behavior, its predictions, or both, thereby building trust between humans and complex deep learning models.
Generalized Demographic Parity for Group Fairness
- Computer Science
- 2022
This work proposes Generalized Demographic Parity (GDP), a group fairness metric for continuous and discrete attributes, provides an understanding of GDP from a probability perspective, and theoretically reveals the connection between the GDP regularizer and adversarial debiasing.
G-Mixup: Graph Data Augmentation for Graph Classification
- Computer Science, Mathematics · ICML
- 2022
This work proposes G-Mixup to augment graphs for graph classification by interpolating the generators (i.e., graphons) of different classes of graphs: it first uses graphs within the same class to estimate a graphon, and then interpolates graphons of different classes in Euclidean space to obtain mixed graphons.
References
SHOWING 1-10 OF 53 REFERENCES
Deep Gaussian Embedding of Graphs: Unsupervised Inductive Learning via Ranking
- Computer Science · ICLR
- 2018
Graph2Gauss is proposed, an approach that can efficiently learn versatile node embeddings on large-scale (attributed) graphs; it shows strong performance on tasks such as link prediction and node classification and demonstrates the benefits of modeling uncertainty.
Graph Representation Learning via Graphical Mutual Information Maximization
- Computer Science · WWW
- 2020
An unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder is developed, which outperforms state-of-the-art unsupervised counterparts and sometimes even exceeds the performance of supervised ones.
Deep Graph Infomax
- Computer Science · ICLR
- 2019
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
Deep Graph Contrastive Representation Learning
- Computer Science · arXiv
- 2020
This paper proposes a novel framework for unsupervised graph representation learning by leveraging a contrastive objective at the node level, and generates two graph views by corruption and learns node representations by maximizing the agreement of node representations in these two views.
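For context, the node-level contrastive objective described here can be illustrated with a simplified cross-view InfoNCE loss. This is a minimal sketch rather than the paper's exact loss (which also uses intra-view negative pairs); the function name and shapes are assumptions.

```python
import torch
import torch.nn.functional as F

def cross_view_node_contrastive_loss(z1, z2, temperature=0.5):
    """Simplified node-level contrastive loss between two corrupted graph views.

    z1, z2: [num_nodes, dim] embeddings of the same nodes under two views;
    the same node across views is the positive pair, all other nodes are negatives.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                  # cross-view similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # symmetrize so both views serve as anchors
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))
```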
Graph U-Nets
- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2022
This work considers the problem of representation learning for graph data and proposes attention-based pooling and unpooling layers, which can better capture graph topology information, and develops an encoder-decoder model known as the graph U-Nets.
Inductive Representation Learning on Large Graphs
- Computer Science · NIPS
- 2017
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
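As an illustration of the neighborhood aggregation this summary refers to, here is a minimal NumPy sketch of one mean-aggregator layer. It is a toy full-graph version (no neighbor sampling or minibatching), and the names and shapes are assumptions rather than GraphSAGE's reference implementation.

```python
import numpy as np

def mean_aggregator_layer(features, adj_list, w_self, w_neigh):
    """One mean-aggregator layer in the GraphSAGE spirit (toy, full-graph sketch).

    features : [n, d_in] node feature matrix
    adj_list : dict mapping node id -> list of neighbor ids
    w_self   : [d_in, d_out] transform of a node's own features
    w_neigh  : [d_in, d_out] transform of the aggregated neighborhood features
    """
    n, d_in = features.shape
    out = np.zeros((n, w_self.shape[1]))
    for v in range(n):
        neigh = adj_list.get(v, [])
        h_neigh = features[neigh].mean(axis=0) if neigh else np.zeros(d_in)
        out[v] = np.maximum(features[v] @ w_self + h_neigh @ w_neigh, 0.0)  # ReLU
    # normalize embeddings to unit length, as GraphSAGE does after each layer
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.clip(norms, 1e-12, None)
```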
Learning Graph Representations with Embedding Propagation
- Computer Science · NIPS
- 2017
Embedding Propagation is an unsupervised learning framework for graph-structured data with significantly fewer parameters and hyperparameters; it is competitive with, and often outperforms, state-of-the-art unsupervised and semi-supervised learning methods on a range of benchmark data sets.
node2vec: Scalable Feature Learning for Networks
- Computer Science · KDD
- 2016
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
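The biased random walk mentioned here is controlled by a return parameter p and an in-out parameter q. The following is a minimal Python sketch of one unweighted second-order walk (it omits the alias-table sampling the actual node2vec implementation uses for efficiency), with names chosen for illustration.

```python
import random

def biased_walk(graph, start, walk_length, p=1.0, q=1.0):
    """One node2vec-style second-order random walk on an unweighted graph.

    graph : dict mapping node -> set of neighbor nodes
    p     : return parameter (larger p makes revisiting the previous node less likely)
    q     : in-out parameter (q > 1 keeps the walk local, q < 1 pushes it outward)
    """
    walk = [start]
    while len(walk) < walk_length:
        cur = walk[-1]
        neighbors = list(graph[cur])
        if not neighbors:
            break
        if len(walk) == 1:
            walk.append(random.choice(neighbors))
            continue
        prev = walk[-2]
        weights = []
        for nxt in neighbors:
            if nxt == prev:               # step back to the previous node
                weights.append(1.0 / p)
            elif nxt in graph[prev]:      # stays close: common neighbor of prev
                weights.append(1.0)
            else:                         # moves outward, away from prev
                weights.append(1.0 / q)
        walk.append(random.choices(neighbors, weights=weights, k=1)[0])
    return walk

# e.g. biased_walk({0: {1, 2}, 1: {0, 2}, 2: {0, 1}}, start=0, walk_length=5, p=2.0, q=0.5)
```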
Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction
- Computer Science · NeurIPS
- 2020
Empirically, the representations learned using this principle alone are significantly more robust to label corruptions in classification than those using cross-entropy, and can lead to state-of-the-art results in clustering mixed data from self-learned invariant features.
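Since this is the principle the surveyed paper's title refers to, a minimal PyTorch sketch of the rate reduction objective (matching the equations given after the abstract above) may be useful. The group memberships are passed in as index lists here, which is an assumption for illustration rather than the graph-specific construction used in the paper.

```python
import torch

def coding_rate(Z, eps=0.5):
    """R(Z) = 1/2 * logdet(I + d/(m * eps^2) * Z Z^T) for column-wise representations Z in R^{d x m}."""
    d, m = Z.shape
    I = torch.eye(d, dtype=Z.dtype, device=Z.device)
    return 0.5 * torch.logdet(I + (d / (m * eps ** 2)) * (Z @ Z.T))

def rate_reduction(Z, groups, eps=0.5):
    """Delta R = R(Z) - sum_j (m_j / m) * R(Z_j): expand the whole set, compress each group.

    Z      : [d, m] representations (assumed unit-norm columns)
    groups : list of index lists/tensors, one per group, partitioning the m columns
    """
    m = Z.shape[1]
    compressed = sum((len(idx) / m) * coding_rate(Z[:, idx], eps) for idx in groups)
    return coding_rate(Z, eps) - compressed
```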
A Comprehensive Survey on Graph Neural Networks
- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
- 2019
This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides the state-of-the-art GNNs into four categories, namely recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.