
Backpropagation

Known as: Error back-propagation, Backpropogation, Back prop 
Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training artificial neural networks used in conjunction…
— Wikipedia
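The definition above can be made concrete with a small sketch: a one-hidden-layer sigmoid network trained on XOR with squared-error loss, where the backward pass propagates an error signal layer by layer. All sizes, the learning rate, and the random seed are illustrative choices, not taken from any paper listed on this page.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # inputs
y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)     # hidden layer
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)     # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward():
    h = sigmoid(X @ W1 + b1)        # hidden activations
    return h, sigmoid(h @ W2 + b2)  # network output

_, out = forward()
loss_before = float(((out - y) ** 2).mean())

lr = 0.5
for _ in range(2000):
    h, out = forward()
    # Backward pass: propagate the error signal from output to hidden layer.
    d_out = (out - y) * out * (1.0 - out)   # error at output pre-activation
    d_h = (d_out @ W2.T) * h * (1.0 - h)    # error at hidden pre-activation
    # Gradient-descent weight updates.
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

_, out = forward()
loss_after = float(((out - y) ** 2).mean())
```

The chain rule does all the work: each layer's error signal is the next layer's error signal pushed back through the weights and the local activation derivative.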

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2015
Top-performing deep architectures are trained on massive amounts of labeled data. In the absence of labeled data for a certain…
  • figures 1–4, table 1
Highly Cited
2014
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed…
  • figures 1–4, table 1
Highly Cited
2000
The conventional wisdom is that backprop nets with excess hidden units generalize poorly. We show that nets with excess capacity…
  • figures 1–5
Highly Cited
1993
A learning algorithm for multilayer feedforward networks, RPROP (resilient propagation), is proposed. To overcome the inherent…
  • figures 1–2
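The RPROP idea mentioned in the 1993 entry can be sketched briefly: each weight keeps its own step size, which grows while the gradient keeps its sign and shrinks when the sign flips, and the update moves by the step size alone, ignoring the gradient's magnitude. This is a simplified variant without weight backtracking; the constants (1.2, 0.5, step bounds) are the commonly quoted defaults, not taken verbatim from the paper.

```python
import numpy as np

def rprop_minimize(grad, w, steps=50,
                   eta_plus=1.2, eta_minus=0.5,
                   step_min=1e-6, step_max=50.0):
    step = np.full_like(w, 0.1)   # one adaptive step size per weight
    g_prev = np.zeros_like(w)
    for _ in range(steps):
        g = grad(w)
        same_sign = g * g_prev > 0   # gradient kept its sign: speed up
        flipped = g * g_prev < 0     # sign flipped: we overshot, slow down
        step = np.where(same_sign, np.minimum(step * eta_plus, step_max), step)
        step = np.where(flipped, np.maximum(step * eta_minus, step_min), step)
        w = w - np.sign(g) * step    # move by the step size, not the magnitude
        g_prev = g
    return w

# Usage: minimize f(w) = sum((w - 3)^2), whose gradient is 2*(w - 3).
w_opt = rprop_minimize(lambda w: 2.0 * (w - 3.0), np.array([10.0, -4.0]))
```

Because only the sign of the gradient is used, RPROP sidesteps the vanishing-magnitude problem the snippet alludes to: a tiny but consistently signed gradient still produces a growing step.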
Highly Cited
1992
  • D. Mackay
  • Neural Computation
  • Corpus ID: 16543854
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework…
Highly Cited
1992
  • M. Gori, A. Tesi
  • IEEE Trans. Pattern Anal. Mach. Intell.
  • Corpus ID: 8098333
The authors propose a theoretical framework for backpropagation (BP) in order to identify some of its limitations as a general…
  • figure 1
Review
1990
Fundamental developments in feedforward artificial neural networks from the past thirty years are reviewed. The history…
Highly Cited
1989
The ability of learning networks to generalize can be greatly enhanced by providing constraints from the task domain. This paper…
Review
1989
  • R. Hecht-Nielsen
  • International Joint Conference on Neural…
  • Corpus ID: 5691634
The author presents a survey of the basic theory of the backpropagation neural network architecture covering architectural design…