Corpus ID: 238408146

Turing approximations, toric isometric embeddings & manifold convolutions

@article{SuarezSerrato2021TuringAT,
  title={Turing approximations, toric isometric embeddings \& manifold convolutions},
  author={Pablo Su\'arez-Serrato},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.02279}
}
Convolutions are fundamental elements in deep learning architectures. Here, we present a theoretical framework for combining extrinsic and intrinsic approaches to manifold convolution through isometric embeddings into tori. In this way, we define a convolution operator for a manifold of arbitrary topology and dimension. We also explain geometric and topological conditions that make some local definitions of convolutions which rely on translating filters along geodesic paths on a manifold… 
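To make the central idea concrete, here is a minimal sketch (not the paper's construction) of why tori are convenient: a signal on a flat torus is simply a periodic array, so translating a filter along the torus reduces to circular (wrap-around) convolution, which the convolution theorem lets us compute via the 2-D FFT. The function name `torus_convolve` is a hypothetical illustration.

```python
import numpy as np

def torus_convolve(signal, kernel):
    """Circular convolution of two real arrays of the same shape,
    viewed as signals on the flat 2-torus T^2 = S^1 x S^1.
    Computed in the Fourier domain via the convolution theorem."""
    return np.real(np.fft.ifft2(np.fft.fft2(signal) * np.fft.fft2(kernel)))

rng = np.random.default_rng(0)
f = rng.standard_normal((8, 8))   # signal on the torus (periodic grid)
g = rng.standard_normal((8, 8))   # filter on the torus
out = torus_convolve(f, g)        # same shape as the input signal
```

Because the torus has a transitive group of translations, this operator is well defined everywhere; the paper's point is that an isometric embedding into a torus transports such a convolution back to a manifold of arbitrary topology and dimension.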


References

Showing 1–10 of 58 references
On the Generalization of Equivariance and Convolution in Neural Networks to the Action of Compact Groups
It is proved that (given some natural constraints) convolutional structure is not just a sufficient but also a necessary condition for equivariance to the action of a compact group.
Geodesic Convolutional Neural Networks on Riemannian Manifolds
Geodesic Convolutional Neural Networks (GCNN), a generalization of the convolutional neural network (CNN) paradigm to non-Euclidean manifolds, are introduced, achieving state-of-the-art performance in problems such as shape description, retrieval, and correspondence.
Multi-directional geodesic neural networks via equivariant convolution
This work defines directional convolution in the continuous setting, proves its key properties, and shows how it can be implemented in practice for shapes represented as triangle meshes, achieving a significant improvement over several baselines.
Embeddings of Riemannian manifolds with finite eigenvector fields of connection Laplacian
Eigenvector fields are used to construct local coordinate charts with low distortion, and it is shown that the distortion constants depend only on geometric properties of manifolds with metrics in the little Hölder space.
Isometric Embeddings via Heat Kernel
We combine the heat kernel embedding and Günther's implicit function theorem to obtain isometric embeddings of compact Einstein manifolds into Euclidean spaces. As the heat flow time t → 0+, the …
Convergence of the reach for a sequence of Gaussian-embedded manifolds
Motivated by questions of manifold learning, we study a sequence of random manifolds, generated by embedding a fixed, compact manifold M into Euclidean spheres of increasing dimension via a sequence …
Embeddings of Riemannian Manifolds with Heat Kernels and Eigenfunctions
We show that any closed n-dimensional Riemannian manifold can be embedded by a map constructed from heat kernels at a certain time from a finite number of points. Both this time and this number can …
Distance Preserving Embeddings for General n-Dimensional Manifolds
Two algorithms are presented that embed a general n-dimensional manifold into R^d (where d depends only on key manifold properties such as intrinsic dimension, volume, and curvature) and are guaranteed to approximately preserve all interpoint geodesic distances.
Gauge Equivariant Convolutional Networks and the Icosahedral CNN
Gauge equivariant convolution using a single conv2d call is demonstrated, making it a highly scalable and practical alternative to Spherical CNNs and demonstrating substantial improvements over previous methods on the task of segmenting omnidirectional images and global climate patterns.
Spectral Networks and Locally Connected Networks on Graphs
This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
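The spectral construction mentioned above can be sketched in a few lines (an illustrative toy, not the paper's implementation): a graph signal is filtered by a diagonal operator in the eigenbasis of the graph Laplacian L = D − A. The filter `theta` stands in for learned spectral coefficients.

```python
import numpy as np

# Toy path graph on 3 nodes; A is its adjacency matrix.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
L = np.diag(A.sum(axis=1)) - A      # combinatorial graph Laplacian
evals, U = np.linalg.eigh(L)        # orthonormal eigenbasis of L

def spectral_filter(x, theta):
    """Filter a graph signal x with spectral coefficients theta:
    transform into the Laplacian eigenbasis, scale, transform back."""
    return U @ (theta * (U.T @ x))

x = np.array([1.0, 0.0, -1.0])      # a signal on the 3 nodes
y = spectral_filter(x, np.ones(3))  # all-ones filter acts as the identity
```

The identity-filter check works because U is orthogonal, so U Uᵀ = I; a learned `theta` would instead attenuate or amplify individual graph frequencies.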