Hyperbolic Graph Convolutional Neural Networks
@article{Chami2019HyperbolicGC, title={Hyperbolic Graph Convolutional Neural Networks}, author={Ines Chami and Rex Ying and Christopher R{\'e} and Jure Leskovec}, journal={Advances in Neural Information Processing Systems}, year={2019}, volume={32}, pages={4869--4880}}
Graph convolutional neural networks (GCNs) embed nodes in a graph into Euclidean space, which has been shown to incur a large distortion when embedding real-world graphs with scale-free or hierarchical structure. Hyperbolic geometry offers an exciting alternative, as it enables embeddings with much smaller distortion. However, extending GCNs to hyperbolic geometry presents several unique challenges because it is not clear how to define neural network operations, such as feature transformation…
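The central difficulty the abstract points to, defining operations such as feature transformation on a curved space, is commonly handled by moving points to the tangent space at the origin, applying the Euclidean operation there, and mapping back. Below is a minimal sketch of this recipe in the Poincaré ball of curvature -1 (HGCN itself works primarily in the hyperboloid model); the function names are my own, not the paper's.

```python
import numpy as np

def log0(x, eps=1e-9):
    """Logarithmic map at the origin of the Poincare ball (curvature -1):
    sends a point x with ||x|| < 1 to the tangent space at the origin."""
    norm = np.linalg.norm(x)
    return np.arctanh(np.clip(norm, 0.0, 1.0 - eps)) * x / max(norm, eps)

def exp0(v, eps=1e-9):
    """Exponential map at the origin: sends a tangent vector back to the ball."""
    norm = np.linalg.norm(v)
    return np.tanh(norm) * v / max(norm, eps)

def hyperbolic_linear(W, x):
    """Hyperbolic feature transformation: log-map to the tangent space,
    apply a Euclidean linear map, exp-map back onto the manifold."""
    return exp0(W @ log0(x))

# Toy usage: transform a 2-D point inside the unit ball.
x = np.array([0.3, -0.4])                # ||x|| = 0.5 < 1, a valid ball point
W = np.array([[0.5, 0.1], [0.0, 0.9]])
y = hyperbolic_linear(W, x)
assert np.linalg.norm(y) < 1.0           # the output stays inside the ball
```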
253 Citations
A Hyperbolic-to-Hyperbolic Graph Convolutional Network
- Computer Science, Mathematics · 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2021
A manifold-preserving graph convolution that consists of a hyperbolic feature transformation and a hyperbolic neighborhood aggregation, achieving substantial improvements on link prediction, node classification, and graph classification tasks.
Laplacian Features for Learning with Hyperbolic Space
- Computer Science
- 2022
HyLa is proposed, a simple and minimal approach to using hyperbolic space in networks: HyLa maps once from a hyperbolic-space embedding to Euclidean space via the eigenfunctions of the Laplacian operator in the hyperbolic space.
Lorentzian Graph Convolutional Networks
- Computer Science, Mathematics · WWW
- 2021
This paper rebuilds the graph operations of hyperbolic GCNs, e.g., the feature transformation and non-linear activation, in a Lorentzian version, and proves that some of the proposed graph operations are equivalent across different models of hyperbolic geometry, which fundamentally indicates their correctness.
HyLa: Hyperbolic Laplacian Features For Graph Learning
- Computer Science · ArXiv
- 2022
This paper proposes HyLa, a completely different approach to using hyperbolic space in graph learning: HyLa maps once from a learned hyperbolic-space embedding to Euclidean space via the eigenfunctions of the Laplacian operator in the hyperbolic space.
Enhancing Hyperbolic Graph Embeddings via Contrastive Learning
- Computer Science · ArXiv
- 2022
A novel Hyperbolic Graph Contrastive Learning (HGCL) framework is put forward that learns node representations through multiple hyperbolic spaces to implicitly capture the hierarchical structure shared between different views; experiments demonstrate the superiority of the proposed HGCL, which consistently outperforms competing methods by considerable margins on the node classification task.
HEAT: Hyperbolic Embedding of Attributed Networks
- Computer Science · IDEAL
- 2020
HEAT (Hyperbolic Embedding of ATtributed networks), the first method for embedding attributed networks into a hyperbolic space, is introduced; it shows that by leveraging node attributes, HEAT can outperform a state-of-the-art hyperbolic embedding algorithm on several downstream tasks.
Trivial bundle embeddings for learning graph representations
- Computer Science · ArXiv
- 2021
An inductive model is proposed that leverages both the expressiveness of GCNs and trivial bundles to learn inductive node representations for networks with or without node features, reducing errors in link prediction and node classification compared to Euclidean and hyperbolic GCNs.
Hyperbolic Graph Attention Network
- Computer Science, Mathematics · IEEE Transactions on Big Data
- 2021
Comprehensive experimental results on four real-world datasets demonstrate the effectiveness of the proposed hyperbolic graph attention network model in comparison with other state-of-the-art baseline methods.
Model-independent methods for embedding directed networks into Euclidean and hyperbolic spaces
- Computer Science
- 2022
A framework is proposed based on the dimension reduction of proximity matrices reflecting the network topology, coupled with a general conversion method that transforms Euclidean node coordinates into hyperbolic ones, even for directed networks.
Constant Curvature Graph Convolutional Networks
- Computer Science, Mathematics · ICML
- 2020
Interest has been rising lately in methods that represent data in non-Euclidean spaces, e.g. hyperbolic or spherical, which provide specific inductive biases useful for certain real-world data…
References
Showing 1-10 of 52 references
Neural Embeddings of Graphs in Hyperbolic Space
- Computer Science · ArXiv
- 2017
A new concept is presented that exploits recent insights and proposes learning neural embeddings of graphs in hyperbolic space, with experimental evidence that embedding graphs in their natural geometry significantly improves performance on downstream tasks for several real-world public datasets.
Hyperbolic Graph Neural Networks
- Computer Science · NeurIPS
- 2019
A novel GNN architecture for learning representations on Riemannian manifolds with differentiable exponential and logarithmic maps is proposed, and a scalable algorithm for modeling the structural properties of graphs is developed, comparing Euclidean and hyperbolic geometry.
Hyperbolic Attention Networks
- Computer Science · ICLR
- 2019
This work introduces hyperbolic attention networks to endow neural networks with enough capacity to match the complexity of data with hierarchical and power-law structure, re-expressing the ubiquitous mechanism of soft attention in terms of operations defined on the hyperboloid and Klein models.
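The Klein-model aggregation referred to here is built on the Einstein midpoint, the hyperbolic analogue of a weighted average; as a reference sketch (not the paper's exact notation), with attention weights $\alpha_i$ and Klein-coordinate points $x_i$:

$$
\mathrm{mid}(x_1, \dots, x_n) \;=\; \sum_{i=1}^{n} \frac{\alpha_i \, \gamma(x_i)}{\sum_{j=1}^{n} \alpha_j \, \gamma(x_j)} \, x_i,
\qquad
\gamma(x) \;=\; \frac{1}{\sqrt{1 - \lVert x \rVert^2}},
$$

where $\gamma$ is the Lorentz factor, which upweights points far from the origin.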
Poincaré Embeddings for Learning Hierarchical Representations
- Computer Science · NIPS
- 2017
This work introduces a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space -- or more precisely, into an n-dimensional Poincaré ball -- and introduces an efficient algorithm to learn the embeddings based on Riemannian optimization.
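For reference, the Poincaré-ball distance that these embeddings are trained to respect has the standard closed form, for points $u, v$ with $\lVert u \rVert, \lVert v \rVert < 1$:

$$
d(u, v) \;=\; \operatorname{arcosh}\!\left(1 \;+\; 2\,\frac{\lVert u - v \rVert^2}{\left(1 - \lVert u \rVert^2\right)\left(1 - \lVert v \rVert^2\right)}\right),
$$

which grows rapidly as either point approaches the boundary of the ball, giving the space its tree-like capacity.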
Hyperbolic Entailment Cones for Learning Hierarchical Embeddings
- Computer Science · ICML
- 2018
This work presents a novel method to embed directed acyclic graphs by modeling hierarchical relations as partial orders defined using a family of nested geodesically convex cones, and proves that these entailment cones admit an optimal shape with a closed-form expression in both Euclidean and hyperbolic space.
Representation Tradeoffs for Hyperbolic Embeddings
- Computer Science · ICML
- 2018
A hyperbolic generalization of multidimensional scaling (h-MDS) is proposed, which offers consistently low distortion even with few dimensions across several datasets, along with a scalable PyTorch-based implementation that can handle incomplete information.
Poincaré GloVe: Hyperbolic Word Embeddings
- Computer Science · ICLR
- 2019
Based on extensive experiments, it is shown empirically that the embeddings, trained without supervision, are the first to simultaneously outperform strong and popular baselines on the tasks of similarity, analogy, and hypernymy detection.
How Powerful are Graph Neural Networks?
- Computer Science · ICLR
- 2019
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among this class of GNNs.
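The "provably most expressive" architecture this entry refers to is the Graph Isomorphism Network (GIN); its layer update, reproduced here for reference, uses a sum (rather than a mean or max) so that aggregation is injective on multisets of neighbor features:

$$
h_v^{(k)} \;=\; \mathrm{MLP}^{(k)}\!\left( \left(1 + \epsilon^{(k)}\right) h_v^{(k-1)} \;+\; \sum_{u \in \mathcal{N}(v)} h_u^{(k-1)} \right).
$$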
Hyperbolic Geometry of Complex Networks
- Computer Science, Mathematics · Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
- 2010
It is shown that targeted transport processes without global topology knowledge are maximally efficient, according to all efficiency measures, in networks with the strongest heterogeneity and clustering, and that this efficiency is remarkably robust with respect to even catastrophic disturbances and damage to the network structure.
Inductive Representation Learning on Large Graphs
- Computer Science · NIPS
- 2017
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
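To make the "aggregate then transform" idea concrete, here is a minimal sketch of GraphSAGE's mean-aggregator layer; this simplified version averages over all neighbors, whereas the paper samples a fixed-size neighborhood, and the helper names are mine:

```python
import numpy as np

def graphsage_mean_layer(H, neighbors, W):
    """One GraphSAGE layer with the mean aggregator (simplified sketch).
    H: (n, d) node feature matrix.
    neighbors[v]: list of neighbor indices of node v.
    W: (2d, d_out) weights applied to [h_v ; mean of neighbor features]."""
    n, d = H.shape
    out = np.empty((n, W.shape[1]))
    for v in range(n):
        # Mean-aggregate the neighborhood (the paper samples a fixed-size set).
        agg = H[neighbors[v]].mean(axis=0) if neighbors[v] else np.zeros(d)
        # Concatenate self and aggregated features, transform, apply ReLU.
        out[v] = np.maximum(np.concatenate([H[v], agg]) @ W, 0.0)
    # L2-normalize the embeddings, as the paper does after each layer.
    return out / np.maximum(np.linalg.norm(out, axis=1, keepdims=True), 1e-9)

# Toy usage on a 3-node path graph 0 - 1 - 2.
H = np.random.randn(3, 4)
neighbors = [[1], [0, 2], [1]]
W = np.random.randn(8, 5)
Z = graphsage_mean_layer(H, neighbors, W)   # shape (3, 5)
```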