Biologically Plausible Error-Driven Learning Using Local Activation Differences: The Generalized Recirculation Algorithm

@article{OReilly1996BiologicallyPE,
  title={Biologically Plausible Error-Driven Learning Using Local Activation Differences: The Generalized Recirculation Algorithm},
  author={Randall C. O'Reilly},
  journal={Neural Computation},
  year={1996},
  volume={8},
  pages={895--938}
}
  • R. O'Reilly
  • Published 1996
  • Computer Science
  • Neural Computation
  • The error backpropagation learning algorithm (BP) is generally considered biologically implausible because it does not use locally available, activation-based variables. A version of BP that can be computed locally, using bidirectional activation recirculation (Hinton and McClelland 1988) instead of backpropagated error derivatives, is more biologically plausible. This paper presents a generalized version of the recirculation algorithm (GeneRec), which overcomes several limitations of the earlier… A minimal sketch of the core two-phase update appears below.
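The GeneRec update described in the abstract can be stated compactly: the network is settled twice, once in a minus phase with only the inputs clamped and once in a plus phase with the target outputs also clamped, and each weight then changes in proportion to the sender's minus-phase activation times the receiver's plus/minus activation difference. The Python sketch below illustrates this two-phase procedure on a toy three-layer network; the layer sizes, learning rate, settling schedule, and helper names are illustrative assumptions, not values from the paper.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)

    # Toy layer sizes and initial weights (illustrative choices, not from the paper).
    n_in, n_hid, n_out = 4, 6, 2
    W_ih = rng.normal(0.0, 0.1, (n_in, n_hid))   # input -> hidden
    W_ho = rng.normal(0.0, 0.1, (n_hid, n_out))  # hidden <-> output, used bidirectionally

    def settle(x, target=None, n_steps=30):
        # Iterate the bidirectional network to a settled state.
        # Minus phase: only the input x is clamped.
        # Plus phase: the output is additionally clamped to the target.
        h = np.zeros(n_hid)
        o = np.zeros(n_out) if target is None else target
        for _ in range(n_steps):
            # Hidden units combine bottom-up input with top-down feedback
            # carried by the transpose of the same hidden-output weights.
            h = sigmoid(x @ W_ih + o @ W_ho.T)
            if target is None:
                o = sigmoid(h @ W_ho)  # output is free only in the minus phase
        return h, o

    def generec_update(x, target, lr=0.2):
        global W_ih, W_ho
        h_minus, o_minus = settle(x)             # minus (expectation) phase
        h_plus, _ = settle(x, target=target)     # plus (outcome) phase
        # GeneRec rule: sender's minus-phase activity times the receiver's
        # plus/minus activation difference.
        W_ho += lr * np.outer(h_minus, target - o_minus)
        W_ih += lr * np.outer(x, h_plus - h_minus)

    # Example: one update step toward mapping a random input to a target pattern.
    x = rng.uniform(size=n_in)
    t = np.array([1.0, 0.0])
    generec_update(x, t)

At the output layer the plus-phase activation is simply the clamped target, so the update reduces to the delta rule; at the hidden layer, the plus/minus difference produced by top-down feedback approximates the error derivative that backpropagation would compute, which is the paper's central point.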
    Citations

    Generalization in Interactive Networks: The Benefits of Inhibitory Competition and Hebbian Learning (133 citations)
    Contrastive Hebbian Feedforward Learning for Neural Networks
    Dendritic cortical microcircuits approximate the backpropagation algorithm (62 citations)
    GAIT-prop: A biologically plausible learning rule derived from backpropagation of error
    An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity (66 citations)
    Deep reinforcement learning in a time-continuous model

    References

    Publications referenced by this paper (showing 8 of 70 references):
    Deterministic Boltzmann Learning in Networks with Asymmetric Connectivity (13 citations)
    Deterministic Boltzmann Learning Performs Steepest Descent in Weight-Space (103 citations)
    Local Synaptic Learning Rules Suffice to Maximize Mutual Information in a Linear Network (240 citations)
    Learning Representations by Recirculation (119 citations)
    Mean Field Theory Neural Networks for Feature Recognition, Content Addressable Memory and Optimization (9 citations)
    Connectionist Learning Procedures (1063 citations)
    Dynamics and architecture for neural computation (133 citations)
    The limitations of deterministic Boltzmann machine learning (29 citations)