Corpus ID: 246210114

Enhancing Hyperbolic Graph Embeddings via Contrastive Learning

@article{Liu2022EnhancingHG,
  title={Enhancing Hyperbolic Graph Embeddings via Contrastive Learning},
  author={Jiahong Liu and Menglin Yang and Min Zhou and Shanshan Feng and Philippe Fournier-Viger},
  journal={ArXiv},
  year={2022},
  volume={abs/2201.08554}
}
Recently, hyperbolic space has risen as a promising alternative for semi-supervised graph representation learning. Many efforts have been made to design hyperbolic versions of neural network operations. However, the inspiring geometric properties of this unique geometry have not been fully explored yet. The potency of graph models powered by the hyperbolic space is still largely underestimated. Besides, the rich information carried by abundant unlabelled samples is also not well utilized… 
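The title combines the two ingredients the snippets below keep returning to: graph embeddings in hyperbolic space and a contrastive objective over unlabelled nodes. As a rough illustration only (not the authors' exact method), the sketch below scores node pairs from two views with the Poincaré distance inside an InfoNCE-style loss; all function names and hyperparameters here are assumptions.

```python
import torch

def poincare_distance(x, y, eps=1e-5):
    """Pairwise geodesic distance in the Poincare ball (curvature -1).
    Assumes all points lie strictly inside the unit ball."""
    x2 = (x * x).sum(-1, keepdim=True)                  # (N, 1)
    y2 = (y * y).sum(-1, keepdim=True).T                # (1, M)
    sq_dist = (x2 - 2 * x @ y.T + y2).clamp_min(0)      # ||x - y||^2, shape (N, M)
    denom = (1 - x2).clamp_min(eps) * (1 - y2).clamp_min(eps)
    return torch.acosh(1 + 2 * sq_dist / denom + eps)

def hyperbolic_info_nce(z1, z2, temperature=0.5):
    """InfoNCE over two views: the same node in z1 and z2 is the positive pair,
    every other node in z2 is a negative. Similarity = negative distance."""
    logits = -poincare_distance(z1, z2) / temperature   # (N, N)
    targets = torch.arange(z1.size(0))
    return torch.nn.functional.cross_entropy(logits, targets)
```

The only hyperbolic ingredient here is the distance used as a similarity; the paper's actual objective, sampling scheme, and augmentations are described in the full text.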

Citations

Hyperbolic Curvature Graph Neural Network

A Hyperbolic Curvature-aware Graph Neural Network (HCGNN) is proposed, which uses discrete curvature to guide message passing over node neighborhoods while adaptively adjusting the continuous curvature, and it outperforms various competitive models by a large margin.

Hyperbolic Graph Neural Networks: A Review of Methods and Applications

This survey comprehensively revisits the technical details of current hyperbolic graph neural networks, unifying them into a general framework, summarizing the variants of each component, and reviewing a series of related applications in a variety of fields.

HICF: Hyperbolic Informative Collaborative Filtering

A novel learning method, Hyperbolic Informative Collaborative Filtering (HICF), is designed to compensate for the recommendation effectiveness on head items while simultaneously improving performance on tail items.

BSAL: A Framework of Bi-component Structure and Attribute Learning for Link Prediction

A Bi-component Structure and Attribute Learning framework (BSAL), designed to adaptively leverage information from topology and feature spaces, is proposed and significantly outperforms baselines on diverse research benchmarks.

Improving Fine-tuning of Self-supervised Models with Contrastive Initialization

This work proposes a Contrastive Initialization (COIN) method, which exploits a supervised contrastive loss to increase inter-class discrepancy and intra-class compactness of features on the target dataset, so that the model can readily discriminate instances of different classes during the final fine-tuning stage.

Hyperbolic Graph Representation Learning: A Tutorial

This tutorial gives an introduction to this emerging form of graph representation learning by comprehensively revisiting hyperbolic embedding techniques, including hyperbolic shallow models and hyperbolic neural networks, as well as the technical details of current hyperbolic graph neural networks.

HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization

This work proposes Hyperbolic Regularization powered Collaborative Filtering (HRCF) and designs a geometric-aware hyperbolic regularizer that boosts the optimization procedure via root alignment and an origin-aware penalty; the approach is simple yet impressively effective.

References

Showing 1-10 of 21 references

Hyperbolic Graph Convolutional Neural Networks

This work proposes Hyperbolic Graph Convolutional Neural Network (HGCN), the first inductive hyperbolic GCN that leverages both the expressiveness of GCNs and hyperbolic geometry to learn inductive node representations for hierarchical and scale-free graphs.
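For orientation, hyperbolic GCN-style layers typically follow a log-map / Euclidean-operation / exp-map pattern at the origin. The sketch below shows that pattern in the Poincaré ball with curvature -1; HGCN itself is formulated in the hyperboloid (Lorentz) model, so this is an illustrative simplification rather than the paper's exact operator.

```python
import torch

def expmap0(v, eps=1e-5):
    """Exponential map at the origin of the Poincare ball (curvature -1):
    tangent vector -> point in the ball."""
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(norm) * v / norm

def logmap0(y, eps=1e-5):
    """Logarithmic map at the origin: point in the ball -> tangent vector."""
    norm = y.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.atanh(norm.clamp_max(1 - eps)) * y / norm

def hyperbolic_linear(x_hyp, weight):
    """Mobius matrix-vector product: map to the tangent space at the origin,
    apply a Euclidean linear map, then map back to the ball."""
    return expmap0(logmap0(x_hyp) @ weight)
```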

Deep Graph Contrastive Representation Learning

This paper proposes a novel framework for unsupervised graph representation learning by leveraging a contrastive objective at the node level, and generates two graph views by corruption and learns node representations by maximizing the agreement of node representations in these two views.
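The two-view recipe this reference describes can be sketched as follows; edge dropping and feature masking are typical corruption choices, and `encoder` stands in for any shared GNN, so everything here is an assumed placeholder rather than the paper's exact pipeline.

```python
import torch

def drop_edges(edge_index, p=0.2):
    """Randomly keep each edge with probability 1 - p; edge_index is (2, E)."""
    keep = torch.rand(edge_index.size(1)) >= p
    return edge_index[:, keep]

def mask_features(x, p=0.2):
    """Zero out a random subset of feature dimensions, shared across all nodes."""
    mask = (torch.rand(x.size(1)) >= p).to(x.dtype)
    return x * mask

def two_corrupted_views(x, edge_index, encoder):
    """Encode two independently corrupted copies of the same graph with a
    shared encoder; agreement between z1[i] and z2[i] is then maximized
    by a node-level contrastive loss such as InfoNCE."""
    z1 = encoder(mask_features(x), drop_edges(edge_index))
    z2 = encoder(mask_features(x), drop_edges(edge_index))
    return z1, z2
```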

Constant Curvature Graph Convolutional Networks

Interest has been rising lately towards methods representing data in non-Euclidean spaces, e.g. hyperbolic or spherical, that provide specific inductive biases useful for certain real-world data.

Hyperbolic Graph Attention Network

This article exploits the graph attention network to learn robust node representations of graphs in hyperbolic space, adopting the gyrovector space framework, which provides an elegant algebraic formalism for hyperbolic geometry, to learn the graph representations.

Poincaré Embeddings for Learning Hierarchical Representations

This work introduces a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space -- or more precisely into an n-dimensional Poincaré ball -- and introduces an efficient algorithm to learn the embeddings based on Riemannian optimization.
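For reference, the n-dimensional Poincaré ball is the open unit ball, and these embeddings are trained under its geodesic distance:

```latex
d(u, v) \;=\; \operatorname{arcosh}\!\left( 1 \;+\; 2\,\frac{\lVert u - v\rVert^{2}}
{\bigl(1 - \lVert u\rVert^{2}\bigr)\bigl(1 - \lVert v\rVert^{2}\bigr)} \right),
\qquad u, v \in \mathbb{B}^{n} = \{\, x \in \mathbb{R}^{n} : \lVert x\rVert < 1 \,\}.
```

Distances grow rapidly near the boundary of the ball, which is what allows tree-like hierarchies to embed with low distortion in few dimensions.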

Discrete-time Temporal Network Embedding via Implicit Hierarchical Learning in Hyperbolic Space

Experimental results on multiple real-world datasets demonstrate the superiority of HTGN for temporal graph embedding, as it consistently outperforms competing methods by significant margins in various temporal link prediction tasks.

Hyperbolic Graph Neural Networks

A novel GNN architecture for learning representations on Riemannian manifolds with differentiable exponential and logarithmic maps is proposed and a scalable algorithm for modeling the structural properties of graphs is developed, comparing Euclidean and hyperbolic geometry.

Graph Contrastive Learning with Augmentations

The results show that, even without tuning augmentation extents or using sophisticated GNN architectures, the GraphCL framework can produce graph representations of similar or better generalizability, transferability, and robustness compared to state-of-the-art methods.

Contrastive and Generative Graph Convolutional Networks for Graph-based Semi-Supervised Learning

This paper presents a novel GCN-based SSL algorithm which aims to enrich the supervision signals by utilizing both data similarities and graph structure, and designs a semi-supervised contrastive loss related to input features.

Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry

It is shown that an embedding in hyperbolic space can reveal important aspects of a company's organizational structure as well as reveal historical relationships between language families.