Corpus ID: 204785044

Co-Attentive Equivariant Neural Networks: Focusing Equivariance On Transformations Co-Occurring In Data

@article{Romero2020CoAttentiveEN,
  title={Co-Attentive Equivariant Neural Networks: Focusing Equivariance On Transformations Co-Occurring In Data},
  author={David W. Romero and Mark Hoogendoorn},
  journal={ArXiv},
  year={2020},
  volume={abs/1911.07849}
}
Equivariance is a desirable property, as it yields much more parameter-efficient neural architectures and preserves the structure of the input through the feature mapping. Even though some combinations of transformations might never appear (e.g., an upright face with a horizontal nose), current equivariant architectures consider the set of all possible transformations in a transformation group when learning feature representations. The human visual system, in contrast, is able to attend to the…
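For readers unfamiliar with the baseline the paper builds on, the following is a minimal sketch (in PyTorch; the class and all names are illustrative assumptions, not taken from the authors' code) of a group convolution that lifts an image to the cyclic rotation group C4. Note how the learned filters are applied at every rotation in the group, so all transformations are treated equally regardless of whether they actually co-occur in the data:

import torch
import torch.nn as nn
import torch.nn.functional as F

class C4LiftingConv(nn.Module):
    # Hypothetical example: lifts an image to a feature map with an explicit C4 group axis.
    def __init__(self, in_channels, out_channels, kernel_size=3):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size) * 0.1
        )

    def forward(self, x):
        # x: (batch, in_channels, H, W)
        responses = []
        for k in range(4):  # every element of C4: rotations by 0, 90, 180, 270 degrees
            w = torch.rot90(self.weight, k, dims=(-2, -1))
            responses.append(F.conv2d(x, w, padding=self.weight.shape[-1] // 2))
        # output: (batch, out_channels, |C4| = 4, H, W)
        return torch.stack(responses, dim=2)

x = torch.randn(1, 3, 32, 32)
y = C4LiftingConv(3, 8)(x)
print(y.shape)  # torch.Size([1, 8, 4, 32, 32])

Rotating the input by 90 degrees rotates the spatial axes and cyclically shifts the group axis of the output (up to boundary effects), which is the structure-preserving behaviour the abstract refers to.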
Citations

Group Equivariant Subsampling
TLDR: Group equivariant autoencoders (GAEs) are used in models that learn object-centric representations on multi-object datasets, and show improved data efficiency and decomposition compared to non-equivariant baselines.
Group Equivariant Neural Architecture Search via Group Decomposition and Reinforcement Learning
TLDR: A reinforcement-learning search over group decompositions yields what the authors call autoequivariant networks (AENs), which are shown to find the right balance between group equivariance and number of parameters, thereby consistently achieving good task performance.
Equivariant Wavelets: Fast Rotation and Translation Invariant Wavelet Scattering Transforms
TLDR: This work introduces a fast-to-compute, translationally invariant and rotationally equivariant wavelet scattering network (EqWS) together with a filter bank of wavelets (triglets), demonstrates the interpretability of the coefficients, and quantifies their invariance/equivariance.
Exploiting Learned Symmetries in Group Equivariant Convolutions
TLDR: It is shown that GConvs can be efficiently decomposed into depthwise separable convolutions while preserving equivariance properties, and improved performance and data efficiency are demonstrated on two datasets.
A Dynamic Group Equivariant Convolutional Networks for Medical Image Analysis
  • Y. Li, Guitao Cao, Wenming Cao
  • Computer Science
  • 2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
  • 2020
TLDR: This paper proposes a generalization of the dynamic convolution method, named dynamic group equivariant convolution, which strengthens relationships and increases model capability by aggregating multiple group convolutional kernels via attention, and demonstrates substantial improvements on breast tumor classification compared to a recent baseline architecture.
Wavelet Networks: Scale Equivariant Learning From Raw Waveforms
TLDR: This work utilizes the concept of scale and translation equivariance to tackle the problem of learning on time series from raw waveforms, and obtains representations that largely resemble those of the wavelet transform at the first layer, but that evolve into much more descriptive ones as a function of depth.
Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey
TLDR: This survey gives a concise overview of different approaches to incorporating geometrical prior knowledge into DNNs and connects those methods to the field of 3D object detection for autonomous driving, where the authors expect them to yield promising results.
Attentive Group Equivariant Convolutional Networks
TLDR: Attentive group equivariant convolutions are presented, a generalization of the group convolution in which attention is applied during the course of convolution to accentuate meaningful symmetry combinations and suppress implausible, misleading ones.
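As a rough illustration of the mechanism described above (a hedged sketch under the assumption of a PyTorch pipeline; GroupAttention is a hypothetical module, not the authors' implementation, and it omits the equivariance constraints the paper places on the attention map), attention scores can be computed over the group axis of a group-convolution feature map and used to reweight the responses of individual transformations:

import torch
import torch.nn as nn

class GroupAttention(nn.Module):
    # Hypothetical example: reweights group-convolution responses with attention over the group axis.
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Linear(channels, 1)  # one scalar score per group element

    def forward(self, f):
        # f: (batch, channels, |G|, H, W), e.g. the output of a lifting group convolution
        pooled = f.mean(dim=(-2, -1)).transpose(1, 2)            # (batch, |G|, channels)
        attn = torch.softmax(self.score(pooled), dim=1)          # (batch, |G|, 1), sums to 1 over the group
        attn = attn.transpose(1, 2).unsqueeze(-1).unsqueeze(-1)  # (batch, 1, |G|, 1, 1)
        return f * attn                                          # accentuate some transformations, suppress others

f = torch.randn(1, 8, 4, 32, 32)
print(GroupAttention(8)(f).shape)  # torch.Size([1, 8, 4, 32, 32])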
Autoequivariant Network Search via Group Decomposition
Recent works show that group equivariance as an inductive bias improves neural network performance for both classification and generation. However, designing group-equivariant neural networks is…

References

SHOWING 1-10 OF 42 REFERENCES
Harmonic Networks: Deep Translation and Rotation Equivariance
TLDR: H-Nets are presented, a CNN exhibiting equivariance to patch-wise translation and 360° rotation, and it is demonstrated that their layers are general enough to be used in conjunction with the latest architectures and techniques, such as deep supervision and batch normalization.
Learning Steerable Filters for Rotation Equivariant CNNs
TLDR: Steerable Filter CNNs (SFCNNs) are developed which achieve joint equivariance under translations and rotations by design and generalize He's weight initialization scheme to filters which are defined as a linear combination of a system of atomic filters.
Deep Rotation Equivariant Network
TLDR: This work proposes the Deep Rotation Equivariant Network (DREN), consisting of cycle layers, isotonic layers and decycle layers, evaluates it on the Rotated MNIST and CIFAR-10 datasets, and demonstrates that it can improve the performance of state-of-the-art architectures.
Extracting Invariant Features From Images Using An Equivariant Autoencoder
TLDR: This work applies group convolutions to build an Equivariant Autoencoder with embeddings that change predictably under the specified set of transformations, and introduces two approaches to extracting invariant features from these embeddings: Gram Pooling and Equivariant Attention.
Rotation Equivariant Vector Field Networks
TLDR: The Rotation Equivariant Vector Field Networks (RotEqNet), a Convolutional Neural Network architecture encoding rotation equivariance, invariance and covariance, is proposed, and a modified convolution operator relying on this representation is developed to obtain deep architectures.
Gauge Equivariant Convolutional Networks and the Icosahedral CNN
TLDR: Gauge equivariant convolution is demonstrated using a single conv2d call, making it a highly scalable and practical alternative to Spherical CNNs, with substantial improvements over previous methods on the tasks of segmenting omnidirectional images and global climate patterns.
Deep Symmetry Networks
TLDR: Deep symmetry networks (symnets) are introduced: a generalization of convnets that forms feature maps over arbitrary symmetry groups and uses kernel-based interpolation to tractably tie parameters and pool over symmetry spaces of any dimension.
Scale equivariance in CNNs with vector fields
TLDR: This work studies the effect of injecting local scale equivariance into Convolutional Neural Networks and shows that this improves the performance of the model by over 20% on the scale-equivariant task of regressing the scaling factor applied to randomly scaled MNIST digits.
CubeNet: Equivariance to 3D Rotation and Translation
TLDR: A Group Convolutional Neural Network with linear equivariance to translations and right-angle rotations in three dimensions is introduced, and is believed to be the first 3D rotation equivariant CNN for voxel representations.
Exploiting Cyclic Symmetry in Convolutional Neural Networks
TLDR: This work introduces four operations which can be inserted into neural network models as layers, which can be combined to make these models partially equivariant to rotations, and which enable parameter sharing across different orientations.