Corpus ID: 53248796

A General Theory of Equivariant CNNs on Homogeneous Spaces

@article{Cohen2019AGT,
  title={A General Theory of Equivariant CNNs on Homogeneous Spaces},
  author={Taco Cohen and Mario Geiger and Maurice Weiler},
  journal={ArXiv},
  year={2019},
  volume={abs/1811.02017}
}
Group equivariant convolutional neural networks (G-CNNs) have recently emerged as a very effective model class for learning from signals in the context of known symmetries. [...] In addition to this classification, we use Mackey theory to show that convolutions with equivariant kernels are the most general class of equivariant maps between such fields, thus establishing G-CNNs as a universal class of equivariant networks.
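For orientation, the layer type the paper classifies can be written as a cross-correlation over the group. The following is the standard textbook form (my notation, not a quotation from the paper): a lifting layer takes a multi-channel signal on the base space X = G/H to a function on G, and subsequent layers convolve over G itself.

```latex
% Lifting layer: multi-channel signal f on X = G/H, filter \kappa, output on G
[\kappa \star f](g) = \int_{X} \sum_{c} \kappa_c\!\left(g^{-1} x\right) f_c(x)\, \mathrm{d}x
% Group-convolution layer: both input and output live on G
[\kappa \star f](g) = \int_{G} \sum_{c} \kappa_c\!\left(g^{-1} u\right) f_c(u)\, \mathrm{d}u
```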
Homogeneous vector bundles and G-equivariant convolutional neural networks
TLDR
It is demonstrated that homogeneous vector bundles are the natural setting for G-CNNs, and reproducing kernel Hilbert spaces are used to obtain a precise criterion for expressing G-equivariant layers as convolutional layers.
Universal Approximation Theorem for Equivariant Maps by Group CNNs
TLDR
This paper provides a unified method for obtaining universal approximation theorems for equivariant maps by CNNs in various settings, and can handle non-linear equivariant maps between infinite-dimensional spaces for non-compact groups.
Geometric Deep Learning and Equivariant Neural Networks
TLDR
The mathematical foundations of geometric deep learning are surveyed, focusing on group equivariant and gauge equivariant neural networks and on the use of Fourier analysis involving Wigner matrices, spherical harmonics and Clebsch–Gordan coefficients for G = SO(3), illustrating the power of representation theory for deep learning.
General E(2)-Equivariant Steerable CNNs
TLDR
The theory of Steerable CNNs yields constraints on the convolution kernels which depend on group representations describing the transformation laws of feature spaces, and it is shown that these constraints for arbitrary group representations can be reduced to constraints under irreducible representations.
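As a toy illustration of such a kernel constraint (a hedged sketch under the simplest possible assumptions, not code from the paper): for scalar input and output fields and the rotation group C4, the constraint k(gx) = k(x) forces rotation-invariant kernels, and projecting onto the constraint space is plain group averaging.

```python
import numpy as np

def project_c4_scalar(kernel):
    """Project a 2D kernel onto the C4 steerable-kernel constraint space for
    trivial (scalar) input/output representations, where the constraint
    k(g x) = k(x) demands rotation invariance. (Illustrative name/choice.)"""
    # Reynolds operator: average the kernel over the four 90-degree rotations.
    return sum(np.rot90(kernel, r) for r in range(4)) / 4.0

k_eq = project_c4_scalar(np.random.randn(5, 5))
assert np.allclose(np.rot90(k_eq), k_eq)  # the constraint now holds exactly
```

For non-trivial field types the same averaging carries extra ρ_out and ρ_in factors, which is where the reduction to irreducible representations pays off.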
Automatic Symmetry Discovery with Lie Algebra Convolutional Network
TLDR
The Lie algebra convolutional network (L-conv) can automatically discover symmetries and does not require discretization of the group, and it is shown that L-conv can serve as a building block to construct any group equivariant feedforward architecture.
ChebLieNet: Invariant Spectral Graph NNs Turned Equivariant by Riemannian Geometry on Lie Groups
TLDR
The existence of (data-dependent) sweet spots for anisotropic parameters on CIFAR10 is empirically demonstrated, and ChebLieNet, a group-equivariant method on (anisotropic) manifolds, is introduced, opening the door to a better understanding of anisotropies.
VolterraNet: A Higher Order Convolutional Network With Group Equivariance for Homogeneous Manifolds
TLDR
This paper proves that the Volterra functional convolutions are equivariant to the action of the isometry group admitted by the Riemannian homogeneous spaces, and that second-order functional convolution operations can be represented as cascaded convolutions, which leads to an efficient implementation.
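For context, in standard Volterra-series notation (the separable-kernel identity below is my reading of how "cascaded convolutions" arise, hedged accordingly):

```latex
y(t) = \int k_1(\tau)\, x(t-\tau)\, \mathrm{d}\tau
     + \iint k_2(\tau_1,\tau_2)\, x(t-\tau_1)\, x(t-\tau_2)\, \mathrm{d}\tau_1 \mathrm{d}\tau_2
% A separable second-order kernel k_2(\tau_1,\tau_2) = k_a(\tau_1)\,k_b(\tau_2)
% turns the quadratic term into a product of two ordinary convolutions:
\iint k_a(\tau_1)\, k_b(\tau_2)\, x(t-\tau_1)\, x(t-\tau_2)\, \mathrm{d}\tau_1 \mathrm{d}\tau_2
  = (k_a * x)(t)\,(k_b * x)(t)
```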
Lorentz Group Equivariant Neural Network for Particle Physics
TLDR
A neural network architecture that is fully equivariant with respect to transformations under the Lorentz group, a fundamental symmetry of space and time in physics, leads to drastically simpler models that have relatively few learnable parameters and are much more physically interpretable than leading CNN- and point-cloud-based approaches.
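A minimal sketch of the kind of primitive such an architecture builds on (assuming only the standard Minkowski metric; the helper names are mine, not the paper's): pairwise Minkowski inner products of four-momenta are exactly Lorentz-invariant features.

```python
import numpy as np

ETA = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

def pairwise_invariants(momenta):
    """All pairwise Minkowski products <p_i, p_j> = p_i^T eta p_j of a set of
    four-momenta (shape (N, 4)); these scalars are Lorentz-invariant."""
    return momenta @ ETA @ momenta.T

# Sanity check: invariance under a boost along x with rapidity phi = 0.3.
phi = 0.3
boost = np.array([[np.cosh(phi), np.sinh(phi), 0, 0],
                  [np.sinh(phi), np.cosh(phi), 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]])
P = np.random.randn(5, 4)
assert np.allclose(pairwise_invariants(P), pairwise_invariants(P @ boost.T))
```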
A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups
TLDR
This work provides a completely general algorithm for solving for the equivariant layers of matrix groups and constructs multilayer perceptrons equivariant to multiple groups that have never been tackled before, including the Rubik's cube group.
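For a finite group the underlying idea can be sketched with a Reynolds-operator projection (my shortcut, hedged: the paper's algorithm solves the equivariance constraint far more generally, including for Lie groups via their algebras):

```python
import numpy as np

def cyclic_perm(n, k):
    """Permutation matrix for a cyclic shift by k on n elements."""
    return np.eye(n)[np.roll(np.arange(n), k)]

def project_equivariant(W, reps_in, reps_out):
    """Project W onto {W : rho_out(g) W = W rho_in(g) for all g} by averaging
    over the group; valid for finite groups with orthogonal representations."""
    return sum(Ro.T @ W @ Ri for Ri, Ro in zip(reps_in, reps_out)) / len(reps_in)

n = 6
reps = [cyclic_perm(n, k) for k in range(n)]           # regular action of Z/6
W_eq = project_equivariant(np.random.randn(n, n), reps, reps)
# The projection commutes with every shift (it is a circulant matrix):
assert all(np.allclose(R @ W_eq, W_eq @ R) for R in reps)
```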
Theoretical Aspects of Group Equivariant Neural Networks
TLDR
This work begins with an exposition of group representation theory and the machinery necessary to define and evaluate integrals and convolutions on groups, and shows applications to recent SO(3) and SE(3) equivariant networks, namely the Spherical CNNs, Clebsch-Gordan Networks, and 3D Steerable CNNs.
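One concrete instance of that machinery, stated from memory and hedged accordingly: the spherical correlation theorem behind Spherical CNNs says that in the spherical-harmonic (Fourier) domain the correlation factorizes per degree ℓ.

```latex
% Spherical cross-correlation of a real signal f and filter \psi on S^2:
[\psi \star f](R) = \int_{S^2} \psi(R^{-1} x)\, f(x)\, \mathrm{d}x, \qquad R \in SO(3)
% In the Fourier domain this becomes a per-degree outer product:
\widehat{\psi \star f}^{\,\ell} = \hat f^{\ell} \big(\hat \psi^{\ell}\big)^{\dagger}
```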

References

Showing 1-10 of 60 references
Intertwiners between Induced Representations (with Applications to the Theory of Equivariant Neural Networks)
TLDR
G-CNNs are established as a universal class of equivariant network architectures on homogeneous spaces such as Euclidean space or the sphere, with equivariance holding if and only if the input and output feature spaces transform according to induced representations.
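The induced representation in question has a compact textbook definition (standard Mackey-theory form, not quoted from the paper):

```latex
% Given H \le G and a representation \rho of H on V, realize Ind_H^G\,\rho on
% fields f : G \to V obeying the Mackey condition
f(gh) = \rho(h)^{-1} f(g) \qquad \text{for all } h \in H,
% with G acting by left translation:
[\pi(u) f](g) = f(u^{-1} g), \qquad u, g \in G
```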
On the Generalization of Equivariance and Convolution in Neural Networks to the Action of Compact Groups
TLDR
It is proved that (given some natural constraints) convolutional structure is not just a sufficient, but also a necessary condition for equivariance to the action of a compact group.
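The easy direction (convolution implies equivariance) can be sanity-checked in a few lines for the simplest compact group, the cyclic group of circular shifts; this toy check is mine, not the paper's proof:

```python
import numpy as np

def circ_corr(f, k):
    """Circular cross-correlation of two 1D signals of equal length."""
    return np.array([np.dot(np.roll(f, -t), k) for t in range(len(f))])

f, k = np.random.randn(8), np.random.randn(8)
s = 3  # a cyclic shift, i.e. an element of Z/8
# Equivariance: correlating a shifted signal shifts the output the same way.
assert np.allclose(circ_corr(np.roll(f, s), k), np.roll(circ_corr(f, k), s))
```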
General E(2)-Equivariant Steerable CNNs
TLDR
The theory of Steerable CNNs yields constraints on the convolution kernels which depend on group representations describing the transformation laws of feature spaces, and it is shown that these constraints for arbitrary group representations can be reduced to constraints under irreducible representations.
Group Equivariant Convolutional Networks
TLDR
Group equivariant Convolutional Neural Networks (G-CNNs) are introduced as a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries, achieving state-of-the-art results on CIFAR10 and rotated MNIST.
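A minimal sketch of the lifting layer of this construction for p4 (translations plus 90° rotations), under my own naming; real G-CNN implementations batch this over channels and stack further group-convolution layers:

```python
import numpy as np
from scipy.signal import correlate2d

def c4_lifting_layer(image, kernel):
    """Lift a planar signal to a function on p4 by correlating with all four
    rotated copies of the kernel, one output plane per rotation. Rotating the
    input then rotates the planes and cyclically permutes the rotation axis,
    instead of producing unrelated features."""
    return np.stack([correlate2d(image, np.rot90(kernel, r), mode="same")
                     for r in range(4)])

feats = c4_lifting_layer(np.random.randn(16, 16), np.random.randn(3, 3))
print(feats.shape)  # (4, 16, 16): rotation x height x width
```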
3D Steerable CNNs: Learning Rotationally Equivariant Features in Volumetric Data
TLDR
The experimental results confirm the effectiveness of 3D Steerable CNNs for the problem of amino acid propensity prediction and protein structure classification, both of which have inherent SE(3) symmetry.
Universal Invariant and Equivariant Graph Neural Networks
TLDR
The results show that a GNN defined by a single set of parameters can approximate uniformly well a function defined on graphs of varying size.
Gauge Equivariant Convolutional Networks and the Icosahedral CNN
TLDR
Gauge equivariant convolution using a single conv2d call is demonstrated, making it a highly scalable and practical alternative to Spherical CNNs, and yielding substantial improvements over previous methods on the task of segmenting omnidirectional images and global climate patterns.
Scale-Equivariant Steerable Networks
TLDR
This work addresses scale changes, which regularly appear in various tasks due to changing distances between objects and the camera, and introduces a general theory for building scale-equivariant convolutional networks with steerable filters.
Deep Scale-spaces: Equivariance Over Scale
TLDR
Deep scale-spaces (DSS), a generalization of convolutional neural networks exploiting the scale symmetry structure of conventional image recognition tasks, is introduced, and scale-equivariant cross-correlations based on a principled extension of convolutions are constructed.
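The idea shared by these two scale papers can be caricatured in a few lines (a deliberately rough sketch: exact scale equivariance on discrete grids needs the careful filter constructions both papers develop): correlate the input with a bank of rescaled copies of one filter, so the output gains a scale axis.

```python
import numpy as np

def dilate(kernel, s):
    """Integer dilation: insert s-1 zeros between taps (a crude stand-in for
    the papers' properly resampled or steerable scale copies)."""
    out = np.zeros((len(kernel) - 1) * s + 1)
    out[::s] = kernel
    return out

def scale_bank_corr(signal, kernel, scales=(1, 2, 4)):
    """Correlate with dilated filter copies; the output is indexed by scale."""
    return [np.correlate(signal, dilate(kernel, s), mode="same") for s in scales]

feats = scale_bank_corr(np.random.randn(64), np.array([1.0, -2.0, 1.0]))
print(len(feats), feats[0].shape)  # 3 scales, each a length-64 feature map
```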
Invariant and Equivariant Graph Networks
TLDR
This paper provides a characterization of all permutation invariant and equivariant linear layers for (hyper-)graph data, and shows that their dimension, in the case of edge-value graph data, is 2 and 15, respectively.
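The node-value (order-1) case of this characterization is easy to exhibit concretely (hedged: the 2 and 15 above refer to edge-value data; for node-value data the equivariant space is 2-dimensional, spanned by the identity and node-averaging):

```python
import numpy as np

def node_equivariant_layer(X, a, b):
    """The general permutation-equivariant linear map on node features X of
    shape (n, d): a two-parameter family, identity plus node-averaging."""
    return a * X + b * X.mean(axis=0, keepdims=True)

n, d = 7, 3
X = np.random.randn(n, d)
P = np.eye(n)[np.random.permutation(n)]   # random permutation matrix
Y1 = node_equivariant_layer(P @ X, 0.5, -1.3)
Y2 = P @ node_equivariant_layer(X, 0.5, -1.3)
assert np.allclose(Y1, Y2)                # equivariance: f(PX) = P f(X)
```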