Backpropagation

Known as: Error back-propagation, Backpropogation, Back prop
Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training artificial neural networks used in conjunction with an optimization method such as gradient descent.
Wikipedia
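The core idea — propagate the output error backward through the chain rule to get a gradient for every weight, then take a gradient-descent step — can be sketched as follows. This is a minimal, illustrative example (network size, seed, learning rate, and the XOR task are all assumptions, not taken from any paper on this page):

```python
import numpy as np

# Minimal sketch of backpropagation for a one-hidden-layer network
# trained on XOR (sizes, seed, and learning rate are illustrative).

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(steps=3000, lr=1.0, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    t = np.array([[0.], [1.], [1.], [0.]])   # XOR targets
    W1 = rng.normal(0.0, 1.0, (2, 4))        # input -> hidden weights
    W2 = rng.normal(0.0, 1.0, (4, 1))        # hidden -> output weights

    def loss():
        y = sigmoid(sigmoid(X @ W1) @ W2)
        return float(np.mean((y - t) ** 2))

    loss_before = loss()
    for _ in range(steps):
        # forward pass
        h = sigmoid(X @ W1)
        y = sigmoid(h @ W2)
        # backward pass: chain rule from the output error to each weight
        dy = (y - t) * y * (1 - y)        # error at the output layer
        dW2 = h.T @ dy
        dh = (dy @ W2.T) * h * (1 - h)    # error propagated back to the hidden layer
        dW1 = X.T @ dh
        # gradient-descent step
        W2 -= lr * dW2
        W1 -= lr * dW1
    return loss_before, loss()
```

Running `train_xor()` returns the mean-squared error before and after training; the backward pass is nothing more than the chain rule applied layer by layer.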

Topic mentions per year (1980–2018)

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2015
Top-performing deep architectures are trained on massive amounts of labeled data. In the absence of labeled data for a certain…
(figures 1–4; table 1)
Highly Cited · 2014
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed…
(figures 1–4; table 1)
Highly Cited · 2008
Optical fiber transmission is impacted by linear and nonlinear impairments. We study the use of digital backpropagation (BP) in…
(figures 1–4; table I)
Highly Cited · 2004
A new learning algorithm for multilayer feedforward networks, RPROP, is proposed. To overcome the inherent disadvantages of pure…
(figures 1–2)
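RPROP adapts a separate step size per weight from the *sign* of the gradient alone, growing the step while the sign stays constant and shrinking it when the sign flips. The truncated abstract above does not give the rule, so the sketch below follows the commonly cited formulation (the simpler RPROP⁻ variant, without weight backtracking) with the usual constants η⁺ = 1.2, η⁻ = 0.5; all names and the toy quadratic are illustrative:

```python
import numpy as np

# Sketch of the RPROP- update rule: per-weight adaptive step sizes driven
# only by gradient signs. Constants follow the commonly cited values;
# this simpler variant omits the weight-backtracking of full RPROP+.

def rprop_minimize(grad_fn, w, steps=100,
                   eta_plus=1.2, eta_minus=0.5,
                   step_init=0.1, step_min=1e-6, step_max=50.0):
    step = np.full_like(w, step_init)
    prev_g = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        same = g * prev_g                 # >0: sign kept, <0: sign flipped
        step = np.where(same > 0, np.minimum(step * eta_plus, step_max), step)
        step = np.where(same < 0, np.maximum(step * eta_minus, step_min), step)
        # only the SIGN of the gradient sets the update direction
        w = w - np.sign(g) * step
        prev_g = g
    return w

# toy usage: minimize ||w - target||^2, whose gradient is 2 * (w - target)
target = np.array([3.0, -1.0])
w_opt = rprop_minimize(lambda w: 2 * (w - target), np.zeros(2))
```

Because the gradient magnitude is discarded, RPROP is insensitive to the scale of the error surface — the property the paper cites as its advantage over pure gradient descent.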
Highly Cited · 1997
We discuss a variety of Adaptive Critic Designs (ACDs) for neurocontrol. These are suitable for learning in noisy, nonlinear, and…
(figures 1–5)
Highly Cited · 1992
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework…
(figures 1–5)
Highly Cited · 1992
In order to generalize from a training set to a test set, it is desirable that small changes in the input space of a pattern do…
Highly Cited · 1989
The ability of learning networks to generalize can be greatly enhanced by providing constraints from the task domain. This paper…
(figures 2–3)
Highly Cited · 1989
Thomas Ash · International 1989 Joint Conference on Neural… · 1989
Summary form only given. A novel method called dynamic node creation (DNC) that attacks issues of training large networks and of…
Highly Cited · 1988
Some scientists have concluded that backpropagation is a specialized method for pattern classification, of little relevance to…