Corpus ID: 235377110

Symmetric Spaces for Graph Embeddings: A Finsler-Riemannian Approach

@inproceedings{Lopez2021SymmetricSF,
  title={Symmetric Spaces for Graph Embeddings: A Finsler-Riemannian Approach},
  author={F. Javier L{\'o}pez and B{\'e}atrice Pozzetti and Steve J. Trettel and Michael Strube and Anna Wienhard},
  booktitle={International Conference on Machine Learning},
  year={2021}
}
Learning faithful graph representations as sets of vertex embeddings has become a fundamental intermediary step in a wide range of machine learning applications. We propose the systematic use of symmetric spaces in representation learning, a class encompassing many of the previously used embedding targets. This enables us to introduce a new method, the use of Finsler metrics integrated in a Riemannian optimization scheme, that better adapts to dissimilar structures in the graph. We develop a… 
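
To make the Finsler-Riemannian combination concrete (a minimal numpy sketch under the paper's framing, not the authors' code): on the symmetric space of SPD matrices, the distance between two points reduces to a vector-valued distance, the sorted log-eigenvalues of A⁻¹B. Applying the ℓ2 norm to that vector yields the Riemannian (affine-invariant) distance, while the ℓ1 norm yields a Finsler F1 distance; all function names here are mine.

```python
import numpy as np
from scipy.linalg import eigvalsh

def vector_valued_distance(A, B):
    """Sorted log-eigenvalues of A^{-1} B: the vector-valued distance
    between SPD matrices A and B (descending, i.e. in the Weyl chamber)."""
    lam = eigvalsh(B, A)               # generalized eigenvalues B v = lam A v
    return np.sort(np.log(lam))[::-1]

def riemannian_distance(A, B):
    # l2 norm of the vector-valued distance: the affine-invariant metric
    return np.linalg.norm(vector_valued_distance(A, B), ord=2)

def finsler_f1_distance(A, B):
    # l1 norm instead: the Finsler F1 metric (assumed form for SPD)
    return np.linalg.norm(vector_valued_distance(A, B), ord=1)

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.eye(2)
print(riemannian_distance(A, B), finsler_f1_distance(A, B))
```

The two distances agree on which pairs are close but weight the eigenvalue spectrum differently, which is what lets the Finsler choice adapt to dissimilar graph structures.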

Small Transformers Compute Universal Metric Embeddings

It is proved that a probabilistic transformer of depth about n log(n) and width about n² can bi-Hölder embed any n-point dataset from X with low metric distortion, thus avoiding the curse of dimensionality.

Pseudo-Riemannian Embedding Models for Multi-Relational Graph Representations

This paper generalizes single-relation pseudo-Riemannian graph embedding models to multi-relational networks, and demonstrates their use in both knowledge graph completion and knowledge discovery in a biological domain.

Non-linear Embeddings in Hilbert Simplex Geometry

The findings demonstrate that Hilbert simplex geometry is competitive with alternative geometries such as the Poincaré hyperbolic ball or Euclidean geometry for embedding tasks, while being fast and numerically robust.
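
Part of what makes this geometry fast is that the Hilbert metric on the open probability simplex has a short closed form; a minimal sketch (the helper name and the eps guard are mine):

```python
import numpy as np

def hilbert_simplex_distance(p, q, eps=1e-12):
    """Hilbert metric on the open probability simplex:
    d(p, q) = log(max_i p_i/q_i) - log(min_i p_i/q_i)."""
    ratio = (p + eps) / (q + eps)      # eps guards points near the boundary
    return np.log(ratio.max()) - np.log(ratio.min())

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.3, 0.3, 0.4])
print(hilbert_simplex_distance(p, q))  # symmetric in p and q
```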

Heterogeneous manifolds for curvature-aware graph embedding

By adding a single extra radial dimension to any existing homogeneous model, this paper accounts for heterogeneous curvature distributions on graphs and on pairwise distances, and shows the potential of this construction for better preserving high-order structures and for generating heterogeneous random graphs.

Identifying latent distances with Finslerian geometry

A new metric is defined as the expected length derived from the stochastic pullback metric; comparing it with a Finsler metric shows that in high dimensions the two metrics converge to each other at a rate of O(1/D).

Universal Approximation Theorems for Differentiable Geometric Deep Learning

It is shown that the GDL models can approximate any continuous target function uniformly on compact sets of a controlled maximum diameter, and that there is always a continuous function between any two non-degenerate compact manifolds that any “locally-defined” GDL model cannot uniformly approximate.

Augmenting the User-Item Graph with Textual Similarity Models

A paraphrase similarity model is applied to widely available textual data, such as reviews and product descriptions, yielding new semantic relations that are added to the user-item graph, increasing the density of the graph without needing further labeled data.

Vector-valued Distance and Gyrocalculus on the Space of Symmetric Positive Definite Matrices

We propose the use of the vector-valued distance to compute distances and extract geometric information from the manifold of symmetric positive definite matrices (SPD), and develop gyrovector calculus…
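
A hedged illustration of the gyrovector operations involved, assuming the standard gyroaddition on SPD under the affine-invariant geometry, A ⊕ B = A^{1/2} B A^{1/2}, with matrix powers as gyro scalar multiplication (function names are mine, not the paper's API):

```python
import numpy as np

def spd_power(A, t):
    """Real t-th power of an SPD matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * (w ** t)) @ V.T

def gyro_add(A, B):
    """Gyroaddition on SPD: A (+) B = A^{1/2} B A^{1/2} (SPD again)."""
    s = spd_power(A, 0.5)
    return s @ B @ s

def gyro_scale(t, A):
    """Gyro scalar multiplication: t (.) A = A^t."""
    return spd_power(A, t)

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.5, -0.2], [-0.2, 0.8]])
print(np.linalg.eigvalsh(gyro_add(A, B)))  # all positive: stays SPD
```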

References

Showing 1-10 of 58 references

Hyperbolic Entailment Cones for Learning Hierarchical Embeddings

This work presents a novel method to embed directed acyclic graphs, modeling hierarchical relations as partial orders defined using a family of nested geodesically convex cones, and proves that these entailment cones admit an optimal shape with a closed-form expression in both Euclidean and hyperbolic space.
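
As a sketch of the closed-form cones (Euclidean case only; the hyperbolic version replaces norms and angles with their Poincaré-ball analogues, and my reading of the aperture formula ψ(x) = arcsin(K/‖x‖) is an assumption):

```python
import numpy as np

def aperture(x, K=0.1):
    """Half-aperture of the Euclidean entailment cone at x:
    psi(x) = arcsin(K / ||x||), defined for ||x|| >= K."""
    return np.arcsin(K / np.linalg.norm(x))

def cone_angle(x, y):
    """Angle at x between the cone axis (the ray from the origin
    through x) and the direction from x to y."""
    v = y - x
    cos = np.dot(x, v) / (np.linalg.norm(x) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def entails(x, y, K=0.1):
    """True if y lies inside the entailment cone of x."""
    return cone_angle(x, y) <= aperture(x, K)

x = np.array([0.5, 0.0])
print(entails(x, np.array([0.9, 0.05])))  # deeper along the axis: True
print(entails(x, np.array([0.0, 0.9])))   # off-axis: False
```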

Computationally Tractable Riemannian Manifolds for Graph Embeddings

This work explores two computationally efficient matrix manifolds, showcasing how to learn and optimize graph embeddings in these Riemannian spaces, and demonstrates consistent improvements over Euclidean geometry while often outperforming hyperbolic and elliptical embeddings on various metrics that capture different graph properties.

Poincaré Embeddings for Learning Hierarchical Representations

This work introduces a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space -- or more precisely into an n-dimensional Poincaré ball -- and presents an efficient algorithm to learn the embeddings based on Riemannian optimization.
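
For concreteness, the Poincaré-ball distance and the gradient rescaling that makes plain gradient descent Riemannian in this model; a minimal sketch with function names of my own choosing:

```python
import numpy as np

def poincare_distance(u, v):
    """Geodesic distance in the Poincare ball:
    d(u, v) = arcosh(1 + 2||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))."""
    sq = np.sum((u - v) ** 2)
    du = 1.0 - np.sum(u ** 2)
    dv = 1.0 - np.sum(v ** 2)
    return np.arccosh(1.0 + 2.0 * sq / (du * dv))

def riemannian_gradient(x, euclidean_grad):
    """Rescale the Euclidean gradient by the inverse metric of the ball:
    ((1 - ||x||^2)^2 / 4) * grad, as in Riemannian SGD."""
    return ((1.0 - np.sum(x ** 2)) ** 2 / 4.0) * euclidean_grad

u = np.array([0.1, 0.2])
v = np.array([0.85, -0.3])
print(poincare_distance(u, v))  # much larger than the Euclidean gap
```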

Change Detection in Graph Streams by Learning Graph Embeddings on Constant-Curvature Manifolds

A novel change detection framework based on neural networks and constant-curvature manifolds (CCMs), which takes into account the non-Euclidean nature of graphs, is introduced; the proposed methods are able to detect even small changes in a graph-generating process, consistently outperforming approaches based on Euclidean embeddings.

Neural Embeddings of Graphs in Hyperbolic Space

Building on recent insights, this work proposes learning neural embeddings of graphs in hyperbolic space and provides experimental evidence that embedding graphs in their natural geometry significantly improves performance on downstream tasks for several real-world public datasets.

Ultrahyperbolic Representation Learning

This paper proposes a representation living on a pseudo-Riemannian manifold with constant nonzero curvature, a generalization of hyperbolic and spherical geometries where the nondegenerate metric tensor is not positive definite.
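
A small sketch of the indefinite, nondegenerate inner product underlying such representations (the signature convention, with the first q coordinates entering negatively, is an assumption on my part; q = 1 recovers the Lorentz form of ordinary hyperbolic space):

```python
import numpy as np

def pseudo_inner(x, y, q=1):
    """Indefinite inner product of signature (q, d - q):
    the first q coordinates enter with a minus sign."""
    s = x * y
    return -s[:q].sum() + s[q:].sum()

# On the pseudo-hyperboloid <x, x> = -1 the product of distinct
# points can take either sign; with q = 1 this is the hyperboloid model.
x = np.array([np.cosh(0.3), np.sinh(0.3), 0.0])
print(pseudo_inner(x, x))  # -1.0
```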

Manifold structure in graph embeddings

It is shown that existing random graph models, including graphon and other latent position models, predict that the data should live near a much lower-dimensional set, and that one may circumvent the curse of dimensionality by employing methods which exploit hidden manifold structure.

Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry

It is shown that an embedding in hyperbolic space can reveal important aspects of a company's organizational structure as well as reveal historical relationships between language families.
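
For reference, distances in the Lorentz (hyperboloid) model have a simple closed form; a minimal sketch, where the lift helper mapping Euclidean points onto the hyperboloid is my own convenience function:

```python
import numpy as np

def lorentz_inner(x, y):
    """Lorentzian scalar product: <x, y>_L = -x_0 y_0 + sum_{i>0} x_i y_i."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lorentz_distance(x, y):
    """Geodesic distance on the hyperboloid <x, x>_L = -1, x_0 > 0:
    d(x, y) = arcosh(-<x, y>_L)."""
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))

def lift(u):
    """Convenience helper: lift a Euclidean point onto the hyperboloid."""
    return np.concatenate(([np.sqrt(1.0 + np.dot(u, u))], u))

x, y = lift(np.array([0.1, 0.2])), lift(np.array([1.5, -0.4]))
print(lorentz_distance(x, y))
```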

Poincaré GloVe: Hyperbolic Word Embeddings

Empirically, based on extensive experiments, these embeddings, trained unsupervised, are shown to be the first to simultaneously outperform strong and popular baselines on the tasks of similarity, analogy, and hypernymy detection.

Representation Tradeoffs for Hyperbolic Embeddings

A hyperbolic generalization of multidimensional scaling (h-MDS) is proposed, offering consistently low distortion even with few dimensions across several datasets, together with a scalable PyTorch-based implementation that can handle incomplete information.
...