Equivalence of Backpropagation and Contrastive Hebbian Learning in a Layered Network

@article{Xie2003EquivalenceOB,
  title={Equivalence of Backpropagation and Contrastive Hebbian Learning in a Layered Network},
  author={Xiaohui Xie and H. Sebastian Seung},
  journal={Neural Computation},
  year={2003},
  volume={15},
  pages={441--454}
}
Backpropagation and contrastive Hebbian learning are two methods of training networks with hidden neurons. Backpropagation computes an error signal for the output neurons and spreads it over the hidden neurons. Contrastive Hebbian learning involves clamping the output neurons at desired values and letting the effect spread through feedback connections over the entire network. To investigate the relationship between these two forms of learning, we consider a special case in which they are…
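The contrastive Hebbian rule described in the abstract can be written as the difference between a Hebbian term from the clamped phase and an anti-Hebbian term from the free phase. The NumPy sketch below illustrates that update for a single weight matrix; it is a minimal illustration, not the paper's full layered-network setting, and it assumes the phase activities are simply given rather than produced by relaxing the network dynamics (the clamped target values here are invented for the example).

```python
import numpy as np

# Sketch of a contrastive Hebbian weight update for one layer.
# Assumption: free-phase and clamped-phase activities are given directly;
# the feedback relaxation that would produce them is not modeled.

rng = np.random.default_rng(0)
n_pre, n_post = 4, 3
W = rng.normal(size=(n_post, n_pre))

x_pre = rng.normal(size=n_pre)          # presynaptic activity (same in both phases)
x_free_post = np.tanh(W @ x_pre)        # postsynaptic activity, free phase
x_clamp_post = np.array([0.5, -0.2, 0.1])  # output clamped to (hypothetical) targets

eta = 0.1
# CHL update: clamped-phase Hebbian term minus free-phase anti-Hebbian term
dW = eta * (np.outer(x_clamp_post, x_pre) - np.outer(x_free_post, x_pre))
W += dW
```

Note that if the clamped-phase activity equals the free-phase activity (the network already produces the target), the two outer products cancel and the update is zero, which is the sense in which the rule is error-driven despite being locally Hebbian.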

Citations

Publications citing this paper (showing 1-10 of 67 citations):

Contrastive Hebbian learning with random feedback weights


Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation

  • Front. Comput. Neurosci., 2016

Contrastive Learning for Lifted Networks


Dictionary Learning by Dynamical Neural Networks


Towards a Biologically Plausible Backprop


Citation statistics:

  • 7 highly influenced citations

  • Averaged 9 citations per year from 2017 through 2019

References

Publications referenced by this paper (showing 1-10 of 15 references):