
Backpropagation

Known as: Error back-propagation, Backpropogation, Back prop 
Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training artificial neural networks used in conjunction…
Wikipedia
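The definition above can be made concrete with a minimal sketch: a one-hidden-layer sigmoid network trained by propagating the output error backward through the weights. The network size, learning rate, XOR task, and squared-error loss are illustrative assumptions, not taken from any of the papers listed below.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Tiny network: 2 inputs -> 2 hidden units -> 1 output, all sigmoid.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(2)]
lr = 0.5  # learning rate (arbitrary choice)

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)))
    return h, y

def train_step(x, t):
    h, y = forward(x)
    # Backward pass: the output error is propagated back toward the inputs.
    dy = (y - t) * y * (1 - y)               # dL/dz at the output, L = 0.5*(y-t)^2
    for j in range(2):
        dh = dy * W2[j] * h[j] * (1 - h[j])  # error signal at hidden unit j
        W2[j] -= lr * dy * h[j]              # output-layer weight update
        for i in range(2):
            W1[j][i] -= lr * dh * x[i]       # hidden-layer weight update

def total_loss():
    return sum(0.5 * (forward(x)[1] - t) ** 2 for x, t in data)

loss_before = total_loss()
for _ in range(5000):
    for x, t in data:
        train_step(x, t)
loss_after = total_loss()
```

Note the order inside `train_step`: the hidden error signal `dh` is computed from the *old* value of `W2[j]` before that weight is updated, which is what makes this plain backpropagation rather than an ad hoc update rule.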

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited, 2015
Top-performing deep architectures are trained on massive amounts of labeled data. In the absence of labeled data for a certain…
Highly Cited, 2014
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed…
Highly Cited, 2002
For a network of spiking neurons that encodes information in the timing of individual spike times, we derive a…
Highly Cited, 2000
The conventional wisdom is that backprop nets with excess hidden units generalize poorly. We show that nets with excess capacity…
Highly Cited, 1995
Contents: D.E. Rumelhart, R. Durbin, R. Golden, Y. Chauvin, Backpropagation: The Basic Theory. A. Waibel, T. Hanazawa, G. Hinton…
Highly Cited, 1993
A learning algorithm for multilayer feedforward networks, RPROP (resilient propagation), is proposed. To overcome the inherent…
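The core RPROP idea in the snippet above, adapting each step size from the *sign* of successive gradients rather than their magnitude, can be sketched for a single parameter. The toy objective, the growth/shrink factors (1.2 and 0.5), the step bounds, and the skip-after-sign-change rule are common illustrative choices, assumed here rather than quoted from the paper.

```python
# Sign-based step-size adaptation in the spirit of RPROP, shown for one
# parameter of the toy objective f(w) = (w - 3)^2.
def grad(w):
    return 2.0 * (w - 3.0)

w, step, prev_g = 0.0, 0.1, 0.0
for _ in range(200):
    g = grad(w)
    if prev_g * g > 0:                 # gradient kept its sign: take bigger steps
        step = min(step * 1.2, 50.0)
    elif prev_g * g < 0:               # sign flipped: we overshot, shrink the step
        step = max(step * 0.5, 1e-6)
        g = 0.0                        # and skip this update after the sign change
    if g != 0.0:
        w -= step if g > 0 else -step  # move against the sign of the gradient
    prev_g = g
```

Because only the sign of the gradient is used, the update is insensitive to the gradient's scale, which is the property the abstract highlights as overcoming an inherent weakness of plain gradient descent.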
Highly Cited, 1992
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework…
Highly Cited, 1992
The authors propose a theoretical framework for backpropagation (BP) in order to identify some of its limitations as a general…
Highly Cited, 1989
The ability of learning networks to generalize can be greatly enhanced by providing constraints from the task domain. This paper…
Highly Cited, 1988
Backpropagation is often viewed as a method for adapting artificial neural networks to classify patterns. Based on parts…