Corpus ID: 231639318

Initialization Using Perlin Noise for Training Networks with a Limited Amount of Data

@article{Inoue2021InitializationUP,
  title={Initialization Using Perlin Noise for Training Networks with a Limited Amount of Data},
  author={Nakamasa Inoue and Eisuke Yamagata and H. Kataoka},
  journal={ArXiv},
  year={2021},
  volume={abs/2101.07406}
}
We propose a novel network initialization method using Perlin noise for training image classification networks with a limited amount of data. Our main idea is to initialize the network parameters by solving an artificial noise classification problem, where the aim is to classify Perlin noise samples into their noise categories. Specifically, the proposed method consists of two steps. First, it generates Perlin noise samples with category labels defined based on noise complexity. Second, it…
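The first step described above can be sketched in code. The following is a minimal, hedged illustration (not the authors' implementation, which is not shown in this excerpt): it generates 2D Perlin noise images and assigns each sample a category label from its grid frequency, using frequency as a simple proxy for the "noise complexity" the abstract mentions. The function names (`perlin2d`, `make_perlin_dataset`) and the specific frequency set are illustrative assumptions.

```python
import numpy as np

def perlin2d(size, freq, rng):
    """Generate one (size x size) Perlin noise image at the given grid frequency.

    Classic Perlin construction: random unit gradients on a coarse lattice,
    dot products with offset vectors, smoothed with the 6t^5-15t^4+10t^3 fade.
    """
    # Random unit gradient vectors on a (freq+1) x (freq+1) lattice.
    grads = rng.normal(size=(freq + 1, freq + 1, 2))
    grads /= np.linalg.norm(grads, axis=-1, keepdims=True)

    # Pixel coordinates in lattice units.
    lin = np.linspace(0, freq, size, endpoint=False)
    x, y = np.meshgrid(lin, lin)
    xi, yi = x.astype(int), y.astype(int)   # lattice cell indices
    xf, yf = x - xi, y - yi                 # offsets within each cell

    def fade(t):
        return 6 * t**5 - 15 * t**4 + 10 * t**3

    def corner(gx, gy, dx, dy):
        # Dot product of the corner gradient with the offset to that corner.
        g = grads[gy, gx]
        return g[..., 0] * dx + g[..., 1] * dy

    n00 = corner(xi,     yi,     xf,     yf)
    n10 = corner(xi + 1, yi,     xf - 1, yf)
    n01 = corner(xi,     yi + 1, xf,     yf - 1)
    n11 = corner(xi + 1, yi + 1, xf - 1, yf - 1)

    u, v = fade(xf), fade(yf)
    nx0 = n00 * (1 - u) + n10 * u
    nx1 = n01 * (1 - u) + n11 * u
    return nx0 * (1 - v) + nx1 * v

def make_perlin_dataset(n_per_class, size=32, freqs=(1, 2, 4, 8), seed=0):
    """Build a labeled noise-classification dataset.

    Assumption: each frequency defines one category, so the class label
    encodes noise complexity (higher frequency = more complex texture).
    """
    rng = np.random.default_rng(seed)
    images, labels = [], []
    for label, f in enumerate(freqs):
        for _ in range(n_per_class):
            images.append(perlin2d(size, f, rng))
            labels.append(label)
    return np.stack(images), np.array(labels)
```

A network would then be pretrained on `(images, labels)` as an ordinary classification task, and the resulting weights used to initialize training on the real, limited dataset.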

