Corpus ID: 13165755

Invariant backpropagation: how to train a transformation-invariant neural network

@article{Demyanov2015InvariantBH,
  title={Invariant backpropagation: how to train a transformation-invariant neural network},
  author={S. Demyanov and J. Bailey and K. Ramamohanarao and C. Leckie},
  journal={ArXiv},
  year={2015},
  volume={abs/1502.04434}
}
In many classification problems a classifier should be robust to small variations in the input vector. This is a desired property not only for particular transformations, such as translation and rotation in image classification problems, but also for any others where the change is small enough to keep the object perceptually indistinguishable. We propose two extensions of the backpropagation algorithm that train a neural network to be robust to variations in the feature vector. While the…
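The core idea of penalizing a model's sensitivity to small input perturbations can be illustrated with a much simpler, classical result cited in the paper's references: training with Gaussian input noise is, in expectation, equivalent to Tikhonov (ridge) regularization. The sketch below is not the authors' invariant-backpropagation algorithm; it is a minimal linear-regression analogue, assuming the standard equivalence that the expected noisy loss adds a penalty of n·σ²‖w‖² to the ordinary least-squares objective. Smaller weights mean the model's output changes less under small input variations.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = X @ true_w + 0.1 * rng.normal(size=n)

# Plain least squares: solve (X^T X) w = X^T y.
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Training on inputs corrupted by Gaussian noise eps ~ N(0, sigma2 I)
# is, in expectation, ridge regression:
#   E[(w.(x+eps) - y)^2] = (w.x - y)^2 + sigma2 * ||w||^2,
# so summing over n samples adds the penalty n * sigma2 * ||w||^2.
sigma2 = 1.0
w_ridge = np.linalg.solve(X.T @ X + n * sigma2 * np.eye(d), X.T @ y)

# For a linear model the output's sensitivity to an input perturbation
# is exactly ||w||, so the regularized solution is the more robust one.
print(np.linalg.norm(w_ols), np.linalg.norm(w_ridge))
```

For a linear model the input-gradient penalty collapses to weight decay; the paper's contribution is extending this kind of robustness penalty to the full backpropagation pass of a deep network, where the two are no longer equivalent.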
