Corpus ID: 58981401

Training Neural Networks with Local Error Signals

@inproceedings{Nokland2019TrainingNN,
  title={Training Neural Networks with Local Error Signals},
  author={Arild N{\o}kland and Lars Hiller Eidnes},
  booktitle={ICML},
  year={2019}
}
  • Abstract: Supervised training of neural networks for classification is typically performed with a global loss function. [...] Key Method: We use single-layer sub-networks and two different supervised loss functions to generate local error signals for the hidden layers, and we show that the combination of these losses helps with optimization in the context of local learning. Using local errors could be a step towards more biologically plausible deep learning because the global error does not have to be transported back to…
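The layer-local training the abstract describes can be illustrated with a toy NumPy sketch of one of the two losses, a local cross-entropy "prediction" loss. All shapes, the data, and the learning rate below are illustrative assumptions, not the paper's setup; the key point is that the hidden layer is updated only by the gradient of its own single-layer auxiliary classifier, so no error signal has to be transported across layer boundaries.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(p, y):
    # Mean negative log-likelihood of the true class.
    return -np.log(p[np.arange(len(y)), y] + 1e-12).mean()

# Toy data: 64 samples, 10 features, 3 classes (illustrative only).
X = rng.normal(size=(64, 10))
y = rng.integers(0, 3, size=64)

W1 = rng.normal(scale=0.1, size=(10, 16))  # hidden layer weights
C1 = rng.normal(scale=0.1, size=(16, 3))   # local single-layer classifier

lr = 0.1
losses = []
for step in range(200):
    h = np.maximum(X @ W1, 0.0)            # ReLU hidden activation
    p = softmax(h @ C1)                    # local prediction
    losses.append(cross_entropy(p, y))

    # Local backward pass: the gradient stops at this layer's input,
    # so earlier layers would never receive this error signal.
    g_logits = (p - np.eye(3)[y]) / len(y)
    g_C1 = h.T @ g_logits
    g_h = (g_logits @ C1.T) * (h > 0)      # through the ReLU
    g_W1 = X.T @ g_h

    W1 -= lr * g_W1
    C1 -= lr * g_C1
```

In a deeper network, each hidden layer would repeat this pattern on the (detached) output of the layer below; the paper additionally combines this prediction loss with a similarity-matching loss.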

    Citations

    Publications citing this paper (29 total; 4 shown):

    • Decoupled Greedy Learning of CNNs
    • Fully Decoupled Neural Network Learning Using Delayed Gradients
    • The Layer-Wise Training Convolutional Neural Networks Using Local Loss for Sensor-Based Human Activity Recognition (highly influenced)
    • Ensemble learning in CNN augmented with fully connected subnetworks

    Citation Statistics

    • 2 highly influenced citations

    • Averaged 15 citations per year from 2019 through 2020

    References

    Publications referenced by this paper (71 total; 3 shown):

    • Deep Supervised Learning Using Local Errors
    • Deep Cascade Learning
    • Greedy Layerwise Learning Can Scale to ImageNet (highly influential)