Corpus ID: 236447708

Circular-Symmetric Correlation Layer based on FFT

@article{Azari2021CircularSymmetricCL,
  title={Circular-Symmetric Correlation Layer based on FFT},
  author={Bahar Azari and Deniz Erdoğmuş},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.12480}
}
Despite the vast success of standard planar convolutional neural networks, they are not the most efficient choice for analyzing signals that lie on an arbitrarily curved manifold, such as a cylinder. The problem arises when one performs a planar projection of these signals and inevitably causes them to be distorted or broken where there is valuable information. We propose a Circular-symmetric Correlation Layer (CCL) based on the formalism of roto-translation equivariant correlation on the…
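The core operation the abstract describes, circular cross-correlation computed efficiently through the FFT, follows from the correlation theorem: corr(f, g) = IFFT(conj(FFT(f)) · FFT(g)). A minimal NumPy sketch of the idea (this is an illustration, not the authors' implementation; the function names are ours):

```python
import numpy as np

def circular_correlation_direct(f, g):
    """Naive circular cross-correlation: corr[m] = sum_n f[n] * g[(n + m) % N]."""
    n = f.shape[-1]
    return np.array([np.sum(f * np.roll(g, -m)) for m in range(n)])

def circular_correlation_fft(f, g):
    """Same result in O(N log N) via the correlation theorem:
    corr = IFFT( conj(FFT(f)) * FFT(g) ), exact for real-valued inputs."""
    F = np.fft.fft(f)
    G = np.fft.fft(g)
    return np.fft.ifft(np.conj(F) * G).real
```

Because the correlation is circular, the operation is equivariant to cyclic shifts along the correlated axis (rotations about a cylinder's axis): rotating the input cyclically shifts the output by the same amount, which is the property the proposed layer exploits.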


References

Showing 1–10 of 34 references
Exploiting Cyclic Symmetry in Convolutional Neural Networks
This work introduces four operations which can be inserted into neural network models as layers, and which can be combined to make these models partially equivariant to rotations and enable parameter sharing across different orientations.
Spatial Transformer Networks
This work introduces a new learnable module, the Spatial Transformer, which explicitly allows the spatial manipulation of data within the network, and can be inserted into existing convolutional architectures, giving neural networks the ability to actively spatially transform feature maps.
Spherical CNNs
A definition for the spherical cross-correlation is proposed that is both expressive and rotation-equivariant and satisfies a generalized Fourier theorem, which allows it to be computed efficiently using a generalized (non-commutative) Fast Fourier Transform (FFT) algorithm.
Deep Symmetry Networks
Deep symmetry networks (symnets) are introduced: a generalization of convnets that forms feature maps over arbitrary symmetry groups and uses kernel-based interpolation to tractably tie parameters and pool over symmetry spaces of any dimension.
Harmonic Networks: Deep Translation and Rotation Equivariance
H-Nets are presented, a CNN exhibiting equivariance to patch-wise translation and 360° rotation, and it is demonstrated that their layers are general enough to be used in conjunction with the latest architectures and techniques, such as deep supervision and batch normalization.
Permutation-equivariant neural networks applied to dynamics prediction
A permutation-invariant neural network layer is discussed in analogy to convolutional layers, and the ability of this architecture to learn to predict the motion of a variable number of interacting hard discs in 2D is shown.
Circular Convolutional Neural Networks for Panoramic Images and Laser Data
This paper defines circular convolutional and circular transposed convolutional layers as replacements for their linear counterparts, identifies pros and cons of applying CCNNs, and evaluates their properties using a circular MNIST classification dataset and a Velodyne laser-scanner segmentation dataset.
On the Generalization of Equivariance and Convolution in Neural Networks to the Action of Compact Groups
It is proved that (given some natural constraints) convolutional structure is not just a sufficient, but also a necessary condition for equivariance to the action of a compact group.
Group Equivariant Convolutional Networks
Group equivariant Convolutional Neural Networks (G-CNNs) are introduced: a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries, achieving state-of-the-art results on CIFAR10 and rotated MNIST.
Steerable CNNs
This paper presents Steerable Convolutional Neural Networks, an efficient and flexible class of equivariant convolutional networks, and shows how the parameter cost of a steerable filter bank depends on the types of the input and output features.