# A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups

@article{Finzi2021APM,
  title={A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups},
  author={Marc Finzi and Max Welling and Andrew Gordon Wilson},
  journal={ArXiv},
  year={2021},
  volume={abs/2104.09459}
}

Symmetries and equivariance are fundamental to the generalization of neural networks on domains such as images, graphs, and point clouds. Existing work has primarily focused on a small number of groups, such as the translation, rotation, and permutation groups. In this work we provide a completely general algorithm for solving for the equivariant layers of matrix groups. In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to…
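The "completely general algorithm" reduces to linear algebra: an equivariant linear layer $W$ must commute with the group action, $\rho(g)W = W\rho(g)$ for every generator $g$, and the space of solutions is the nullspace of a stacked constraint matrix. A minimal sketch of that idea for the cyclic rotation group $C_4$ acting on $\mathbb{R}^2$ (not the paper's EMLP implementation, which also handles Lie-algebra generators):

```python
import numpy as np
from scipy.linalg import null_space

# A 90-degree rotation generates the cyclic group C4 acting on R^2.
R = np.array([[0., -1.], [1., 0.]])

# Equivariance R W = W R  <=>  (I kron R - R^T kron I) vec(W) = 0,
# using vec(AX) = (I kron A) vec(X) and vec(XB) = (B^T kron I) vec(X).
C = np.kron(np.eye(2), R) - np.kron(R.T, np.eye(2))
basis = null_space(C)
print(basis.shape[1])  # 2: spanned by the identity and the rotation itself
```

The nullspace dimension is 2, matching the fact that the real matrices commuting with a 90° rotation are exactly the combinations $aI + bR$.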


## 46 Citations

Geometric Deep Learning and Equivariant Neural Networks

- Mathematics, ArXiv
- 2021

The mathematical foundations of geometric deep learning are surveyed, focusing on group-equivariant and gauge-equivariant neural networks and the use of Fourier analysis involving Wigner matrices, spherical harmonics, and Clebsch–Gordan coefficients for G = SO(3), illustrating the power of representation theory for deep learning.

Automatic Symmetry Discovery with Lie Algebra Convolutional Network

- Mathematics, Computer Science, NeurIPS
- 2021

The Lie algebra convolutional network (L-conv) can automatically discover symmetries and does not require discretization of the group, and it is shown that L-conv can serve as a building block to construct any group equivariant feedforward architecture.

Scalars are universal: Equivariant machine learning, structured like classical physics

- Computer Science, Physics, NeurIPS
- 2021

It is shown that it is simple to parameterize universally approximating polynomial functions that are equivariant under these symmetries, or under the Euclidean, Lorentz, and Poincaré groups, at any dimensionality d.
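The "scalars" claim can be illustrated numerically: features built from pairwise inner products (a Gram matrix) are unchanged by any orthogonal transformation of the inputs, so invariant functions can be parameterized as functions of these scalars. A minimal sketch, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.normal(size=(5, 3))                  # 5 vectors in R^3
Q, _ = np.linalg.qr(rng.normal(size=(3, 3))) # a random orthogonal matrix

# Gram matrix of pairwise inner products <v_i, v_j>: an O(3)-invariant feature.
gram = V @ V.T
gram_rotated = (V @ Q.T) @ (V @ Q.T).T       # same feature after rotating every vector

assert np.allclose(gram, gram_rotated)       # invariance of the scalar features
```

Any function of `gram` is therefore automatically O(d)-invariant, which is the structural point the paper exploits.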

Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces

- Mathematics, ICML
- 2022

We introduce a unified framework for group equivariant networks on homogeneous spaces derived from a Fourier perspective. We consider tensor-valued feature fields, before and after a convolutional…

Scalars are universal: Gauge-equivariant machine learning, structured like classical physics

- Physics, Computer Science, ArXiv
- 2021

It is shown that it is simple to parameterize universally approximating polynomial functions that are equivariant under these symmetries, or under the Euclidean, Lorentz, and Poincaré groups, at any dimensionality d.

Subgraph Permutation Equivariant Networks

- Computer Science
- 2022

In this work we develop a new method, named Subgraph Permutation Equivariant Networks (SPEN), which provides a framework for building graph neural networks that operate on sub-graphs, while using…

Learning Equivariances and Partial Equivariances from Data

- Computer Science, Mathematics, ArXiv
- 2021

Partial G-CNNs are introduced: a family of equivariant networks able to learn partial and full equivariances from data at every layer end-to-end; they perform on par with G-CNNs when full equivariance is necessary, and outperform them otherwise.

Symmetry-driven graph neural networks

- Computer Science, ArXiv
- 2021

Two graph network architectures are introduced that are equivariant to several types of transformations affecting the node coordinates; they can be vastly more data efficient than classical graph architectures, are intrinsically equipped with a better inductive bias, and are better at generalising.

Geometric Algebra Attention Networks for Small Point Clouds

- Computer Science, ArXiv
- 2021

The geometric algebra provides valuable mathematical structure by which to combine vector, scalar, and other types of geometric inputs in a systematic way to account for rotation invariance or covariance, while attention yields a powerful way to impose permutation equivariance.

ZZ-Net: A Universal Rotation Equivariant Architecture for 2D Point Clouds

- Computer Science, Mathematics, ArXiv
- 2021

A novel neural network architecture for processing 2D point clouds is proposed, and its universality for approximating any continuous rotation-equivariant and permutation-invariant function is proved.

## References

Showing 1–10 of 45 references

Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data

- Mathematics, ICML
- 2020

A general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group with a surjective exponential map is proposed, enabling rapid prototyping and exact conservation of linear and angular momentum.

Equivariant Maps for Hierarchical Structures

- Computer Science, ArXiv
- 2020

This work demonstrates the effectiveness of a hierarchy of translation and permutation symmetries for learning on point cloud data, and reports state-of-the-art results on Semantic3D and S3DIS, two of the largest real-world benchmarks for 3D semantic segmentation.

Universal Equivariant Multilayer Perceptrons

- Computer Science, ICML
- 2020

It is shown that having a hidden layer on which the group acts regularly is sufficient for universal equivariance (invariance), and the universality of a broad class of equivariant MLPs with a single hidden layer is proved.

On the Universality of Rotation Equivariant Point Cloud Networks

- Computer Science, ICLR
- 2021

A first study of the approximation power of neural architectures that are invariant or equivariant to all three shape-preserving transformations of point clouds: translation, rotation, and permutation is presented.

Lorentz Group Equivariant Neural Network for Particle Physics

- Physics, Computer Science, ICML
- 2020

A neural network architecture that is fully equivariant with respect to transformations under the Lorentz group, a fundamental symmetry of space and time in physics, leads to drastically simpler models that have relatively few learnable parameters and are much more physically interpretable than leading CNN- and point-cloud-based approaches.

General E(2)-Equivariant Steerable CNNs

- Computer Science, Mathematics, NeurIPS
- 2019

The theory of Steerable CNNs yields constraints on the convolution kernels which depend on group representations describing the transformation laws of feature spaces, and it is shown that these constraints for arbitrary group representations can be reduced to constraints under irreducible representations.

Invariant and Equivariant Graph Networks

- Computer Science, Mathematics, ICLR
- 2019

This paper provides a characterization of all permutation invariant and equivariant linear layers for (hyper-)graph data, and shows that their dimension, in case of edge-value graph data, is 2 and 15, respectively.
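The stated dimension of 15 for edge-value (order-2 tensor) data can be checked numerically by solving the permutation-equivariance constraint directly; the count 15 holds for n ≥ 4, while smaller n gives fewer independent basis elements. A sketch, not the authors' code:

```python
import numpy as np
from scipy.linalg import null_space

n = 4  # 15 holds for n >= 4

def perm_matrix(p):
    """Permutation matrix sending basis vector e_i to e_{p[i]}."""
    P = np.zeros((n, n))
    P[p, np.arange(n)] = 1.0
    return P

# A transposition and an n-cycle generate the full symmetric group S_n.
generators = [perm_matrix([1, 0, 2, 3]), perm_matrix([1, 2, 3, 0])]

# Edge-value (adjacency) data transforms by rho(g) = P kron P; an equivariant
# linear layer W satisfies rho(g) W rho(g)^T = W, i.e.
# (rho(g) kron rho(g) - I) vec(W) = 0 for each generator g.
rows = []
for P in generators:
    rho = np.kron(P, P)
    rows.append(np.kron(rho, rho) - np.eye(n**4))
basis = null_space(np.vstack(rows))
print(basis.shape[1])  # 15
```

The nullspace has dimension 15, matching the paper's count of linearly independent permutation-equivariant linear layers on edge-value graph data.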

A Wigner-Eckart Theorem for Group Equivariant Convolution Kernels

- Mathematics, ICLR
- 2021

By generalizing the famous Wigner-Eckart theorem for spherical tensor operators, it is proved that steerable kernel spaces are fully understood and parameterized in terms of 1) generalized reduced matrix elements, 2) Clebsch-Gordan coefficients, and 3) harmonic basis functions on homogeneous spaces.

A General Theory of Equivariant CNNs on Homogeneous Spaces

- Mathematics, NeurIPS
- 2019

The theory enables a systematic classification of all existing G-CNNs in terms of their symmetry group, base space, and field type, and considers a fundamental question: what is the most general kind of equivariant linear map between feature spaces (fields) of given types?

Tensor Field Networks: Rotation- and Translation-Equivariant Neural Networks for 3D Point Clouds

- Computer Science, ArXiv
- 2018

Tensor field neural networks are introduced, which are locally equivariant to 3D rotations, translations, and permutations of points at every layer, and demonstrate the capabilities of tensor field networks with tasks in geometry, physics, and chemistry.