Corpus ID: 207870611

Supervised level-wise pretraining for recurrent neural network initialization in multi-class classification

@article{Ienco2019SupervisedLP,
  title={Supervised level-wise pretraining for recurrent neural network initialization in multi-class classification},
  author={D. Ienco and Roberto Interdonato and Raffaele Gaetano},
  journal={ArXiv},
  year={2019},
  volume={abs/1911.01071}
}
  • Computer Science, Mathematics
  • Abstract: Recurrent Neural Networks (RNNs) can be seriously affected by the initial parameter assignment, which may result in poor generalization performance on new, unseen data. To tackle this crucial issue in the context of RNN-based classification, we propose a new supervised layer-wise pretraining strategy to initialize network parameters. The proposed approach leverages a data-aware strategy that sets up a taxonomy of classification problems automatically derived by the model…
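The abstract only outlines the idea, so the following is a minimal, hypothetical sketch of what supervised layer-wise pretraining can look like: the network is grown one layer at a time, and each newly added top layer is trained against the task labels through a throwaway auxiliary classifier head while the lower layers stay frozen; the resulting weights then serve as the initialization for full training. For simplicity this toy uses small feedforward tanh layers and a binary task rather than the paper's actual RNN architecture and automatically derived class taxonomy — all names (`levelwise_pretrain`, `train_top_layer`) and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import math
import random

random.seed(0)

DIM = 2  # toy feature dimension, kept constant across layers


def rand_mat(rows, cols):
    return [[random.uniform(-0.5, 0.5) for _ in range(cols)] for _ in range(rows)]


def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]


def tanh_vec(v):
    return [math.tanh(z) for z in v]


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def forward_stack(stack, x):
    """Run x through every layer h -> tanh(W h) of the current stack."""
    h = x
    for W in stack:
        h = tanh_vec(matvec(W, h))
    return h


def train_top_layer(stack, X, y, steps=200, lr=0.5):
    """Pretrain one 'level': fit the newest (top) layer plus a throwaway
    logistic head on the task labels, keeping all lower layers frozen."""
    W = stack[-1]
    v = [random.uniform(-0.5, 0.5) for _ in range(DIM)]  # auxiliary head
    b = 0.0
    for _ in range(steps):
        for x, t in zip(X, y):
            base = forward_stack(stack[:-1], x)  # frozen lower-layer features
            h = tanh_vec(matvec(W, base))
            p = sigmoid(sum(vi * hi for vi, hi in zip(v, h)) + b)
            err = p - t  # gradient of logistic loss w.r.t. the head's logit
            for i in range(DIM):
                g = err * v[i] * (1.0 - h[i] ** 2)  # backprop through tanh
                for j in range(DIM):
                    W[i][j] -= lr * g * base[j]     # update top layer only
                v[i] -= lr * err * h[i]             # update auxiliary head
            b -= lr * err
    # the auxiliary head (v, b) is discarded; W remains as initialization


def levelwise_pretrain(X, y, n_levels):
    """Grow the stack one level at a time, pretraining each new top
    layer against the labels before the next level is added."""
    stack = []
    for _ in range(n_levels):
        stack.append(rand_mat(DIM, DIM))
        train_top_layer(stack, X, y)
    return stack


# toy binary task: two points per class
X = [[1.0, 1.0], [0.9, 1.1], [-1.0, -1.0], [-1.1, -0.9]]
y = [1, 1, 0, 0]
stack = levelwise_pretrain(X, y, n_levels=2)
```

After pretraining, `stack` would be used as the starting point for end-to-end fine-tuning of the full network; the per-level auxiliary heads exist only to inject label information into each layer's initialization.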

