# Universal Approximations of Invariant Maps by Neural Networks

@article{Yarotsky2018UniversalAO, title={Universal Approximations of Invariant Maps by Neural Networks}, author={Dmitry Yarotsky}, journal={Constructive Approximation}, year={2018}, volume={55}, pages={407-474} }

We describe generalizations of the universal approximation theorem for neural networks to maps invariant or equivariant with respect to linear representations of groups. Our goal is to establish network-like computational models that are both invariant/equivariant and provably complete in the sense of their ability to approximate any continuous invariant/equivariant map. Our contribution is three-fold. First, in the general case of compact groups we propose a construction of a complete…
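The paper's compact-group construction rests on symmetrization: averaging a function over the group yields an exactly invariant function. As a toy illustration of that idea (not the paper's actual architecture; the network and group here are my own minimal choices), averaging an arbitrary network over a finite group of cyclic coordinate shifts makes its output exactly invariant:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": one hidden layer with fixed random weights.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 1))

def net(x):
    return np.tanh(x @ W1) @ W2

# Finite group acting on R^4 by cyclic shifts of the coordinates.
def shift(x, k):
    return np.roll(x, k, axis=-1)

def invariant_net(x, group_size=4):
    # Group averaging: average the network's output over all group
    # elements; the result is exactly invariant under the group action.
    return np.mean([net(shift(x, k)) for k in range(group_size)], axis=0)

x = rng.normal(size=4)
out = invariant_net(x)
# invariant_net(shift(x, k)) agrees with invariant_net(x) up to float error
```

The universality question the paper addresses is precisely whether such invariant (or equivariant) models can approximate *every* continuous invariant (or equivariant) map, not just produce some invariant outputs.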

## 100 Citations

Universal Invariant and Equivariant Graph Neural Networks

- Mathematics, Computer Science · NeurIPS
- 2019

The results show that a GNN defined by a single set of parameters can approximate uniformly well a function defined on graphs of varying size.

Universal Approximation Theorem for Equivariant Maps by Group CNNs

- Mathematics, Computer Science · ArXiv
- 2020

This paper provides a unified method for obtaining universal approximation theorems for equivariant maps by CNNs in various settings, and can handle non-linear equivariant maps between infinite-dimensional spaces for non-compact groups.

Universal approximations of permutation invariant/equivariant functions by deep neural networks

- Computer Science, Mathematics · ArXiv
- 2019

It is concluded that although the invariant/equivariant models have exponentially fewer free parameters than the usual models, they can still approximate invariant/equivariant functions to arbitrary accuracy.

On the Universality of Invariant Networks

- Mathematics, Computer Science · ICML
- 2019

This paper concludes by proving a necessary condition for the universality of G-invariant networks that incorporate only first-order tensors, which are of special interest due to their practical value.

Scalars are universal: Equivariant machine learning, structured like classical physics

- Computer Science, Physics · NeurIPS
- 2021

It is shown that it is simple to parameterize universally approximating polynomial functions that are equivariant under the orthogonal, Euclidean, Lorentz, and Poincaré groups, at any dimensionality d.

A Simple Proof of the Universality of Invariant/Equivariant Graph Neural Networks

- Mathematics, Computer Science · ArXiv
- 2019

This work considers a restricted intermediate hypothetical model, named the Graph Homomorphism Model, to reach universality conclusions, including an open case for higher-order output, and finds that the proposed technique not only leads to simple proofs of the universality properties but also gives a natural explanation for the tensorization of the previously studied models.

Equivariant and Invariant Reynolds Networks

- Mathematics · ArXiv
- 2021

This paper constructs learning models based on the reductive Reynolds operator, called equivariant and invariant Reynolds networks (ReyNets), and proves that they have the universal approximation property.

Capacity of Group-invariant Linear Readouts from Equivariant Representations: How Many Objects can be Linearly Classified Under All Possible Views?

- Mathematics · ArXiv
- 2021

It is found that the fraction of separable dichotomies is determined by the dimension of the space that is fixed by the group action, and it is shown how this relation extends to operations such as convolutions, element-wise nonlinearities, and global and local pooling.

On Universal Equivariant Set Networks

- Computer Science · ICLR
- 2020

It is proved that PointNet is not equivariant-universal, and that adding a single linear transmission layer makes it universal; the resulting architecture, called PointNetST, is argued to be the simplest permutation-equivariant universal model known to date.
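Set networks of this kind build on permutation-equivariant linear layers of the DeepSets flavor: a per-element transform plus a pooled term shared across elements. A minimal sketch of such a layer and its equivariance (the function and variable names are mine, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def equivariant_linear(X, A, B):
    # Permutation-equivariant linear layer: each row (set element) gets
    # the same per-element map A, plus a mean-pooled term through B that
    # is identical for all rows.
    return X @ A + np.mean(X, axis=0, keepdims=True) @ B

n, d = 5, 3
A = rng.normal(size=(d, d))
B = rng.normal(size=(d, d))
X = rng.normal(size=(n, d))

perm = rng.permutation(n)
# Equivariance: permuting rows before the layer matches permuting
# the layer's output rows.
lhs = equivariant_linear(X[perm], A, B)
rhs = equivariant_linear(X, A, B)[perm]
```

The universality question studied in this line of work is whether stacking such layers with pointwise nonlinearities suffices to approximate every continuous permutation-equivariant map.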

## References

Showing 1-10 of 50 references.

Rotation Equivariant Vector Field Networks

- Computer Science · 2017 IEEE International Conference on Computer Vision (ICCV)
- 2017

The Rotation Equivariant Vector Field Networks (RotEqNet), a convolutional neural network architecture encoding rotation equivariance, invariance, and covariance, is proposed, and a modified convolution operator relying on this representation is developed to obtain deep architectures.

Deep Symmetry Networks

- Computer Science · NIPS
- 2014

Deep symmetry networks (symnets), a generalization of convnets that form feature maps over arbitrary symmetry groups, are introduced; they use kernel-based interpolation to tractably tie parameters and pool over symmetry spaces of any dimension.

Zeros of Equivariant Vector Fields: Algorithms for an Invariant Approach

- Mathematics, Computer Science · J. Symb. Comput.
- 1994

A computationally effective algorithm to solve for the zeros of a polynomial vector field equivariant with respect to a finite subgroup of O(n) is presented, and it is proved that the module of equivariants is Cohen-Macaulay.

Convolutional Rectifier Networks as Generalized Tensor Decompositions

- Computer Science · ICML
- 2016

Developing effective methods for training convolutional arithmetic circuits may give rise to a deep learning architecture that is provably superior to convolutional rectifier networks but has so far been overlooked by practitioners.

Polar Transformer Networks

- Computer Science · ICLR
- 2018

PTN combines ideas from the Spatial Transformer Network (STN) and canonical coordinate representations; it is invariant to translation and equivariant to both rotation and scale, and is extensible to 3D, as demonstrated through the Cylindrical Transformer Network.

Group Equivariant Convolutional Networks

- Computer Science · ICML
- 2016

Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduce sample complexity by exploiting symmetries, achieve state-of-the-art results on CIFAR10 and rotated MNIST.
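The first layer of a G-CNN lifts the input to the group by correlating it with transformed copies of each filter. A minimal numpy sketch for the cyclic rotation group C4 (helper names are mine; this illustrates the lifting idea, not the paper's full architecture):

```python
import numpy as np

rng = np.random.default_rng(2)

def corr2d(x, w):
    # Plain 2D cross-correlation, "valid" mode.
    h, ww = w.shape
    H, W = x.shape
    out = np.zeros((H - h + 1, W - ww + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + h, j:j + ww] * w)
    return out

def lift_conv(x, w):
    # Lifting layer of a G-CNN for C4: correlate the input with all four
    # 90-degree rotations of one filter, giving one feature map per
    # group element.
    return np.stack([corr2d(x, np.rot90(w, k)) for k in range(4)])

x = rng.normal(size=(6, 6))
w = rng.normal(size=(3, 3))
y = lift_conv(x, w)

# Equivariance: rotating the input rotates each feature map and
# cyclically shifts the orientation channels, i.e. channel k of the
# rotated input is the rotation of channel (k - 1) mod 4.
y_rot = lift_conv(np.rot90(x), w)
```

Weight sharing across the four rotated filter copies is what gives G-CNNs their reduced sample complexity relative to learning each orientation independently.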

On the approximate realization of continuous mappings by neural networks

- Computer Science · Neural Networks
- 1989

Warped Convolutions: Efficient Invariance to Spatial Transformations

- Mathematics · ICML
- 2017

This work presents a construction that is simple and exact yet has the same computational complexity that standard convolutions enjoy, consisting of a constant image warp followed by a simple convolution, both standard blocks in deep learning toolboxes.

Exploiting Cyclic Symmetry in Convolutional Neural Networks

- Computer Science · ICML
- 2016

This work introduces four operations which can be inserted into neural network models as layers, and which can be combined to make these models partially equivariant to rotations, while also enabling parameter sharing across different orientations.

The classical groups : their invariants and representations

- Mathematics
- 1940

In this renowned volume, Hermann Weyl discusses the symmetric, full linear, orthogonal, and symplectic groups and determines their different invariants and representations. Using basic concepts from…