Optimal convergence of on-line backpropagation

Abstract

Many researchers remain skeptical about the actual behavior of neural network learning algorithms such as backpropagation. A major problem is the lack of clear theoretical results on optimal convergence, particularly for pattern mode algorithms. In this paper, we prove the companion of Rosenblatt's (1960) PC (perceptron convergence) theorem for feedforward networks, stating that pattern mode backpropagation converges to an optimal solution for linearly separable patterns.
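To illustrate the setting of the result, the sketch below trains a single sigmoid unit by pattern mode (on-line) gradient descent, where weights are updated after each presented pattern rather than after a full sweep, on a linearly separable dataset (logical AND). This is an illustrative toy example, not the authors' proof construction; the learning rate, epoch count, and initialization are assumptions.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Linearly separable patterns: logical AND (inputs, target).
patterns = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(2)]  # weights (assumed init)
b = random.uniform(-0.5, 0.5)                      # bias
eta = 0.5                                          # learning rate (assumed)

for epoch in range(5000):
    for x, t in patterns:  # on-line: update after EACH pattern
        net = sum(wi * xi for wi, xi in zip(w, x)) + b
        y = sigmoid(net)
        # Gradient of squared error (y - t)^2 / 2 w.r.t. net input.
        delta = (y - t) * y * (1.0 - y)
        w = [wi - eta * delta * xi for wi, xi in zip(w, x)]
        b -= eta * delta

# On a linearly separable problem, the trained unit classifies
# every pattern correctly.
preds = [int(sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5)
         for x, _ in patterns]
print(preds)
```

Running the loop long enough drives the predictions to match the targets, mirroring the convergence behavior the theorem guarantees for linearly separable data.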

DOI: 10.1109/72.478415


Cite this paper

@article{Gori1996OptimalCO,
  title   = {Optimal convergence of on-line backpropagation},
  author  = {Marco Gori and Marco Maggini},
  journal = {IEEE Transactions on Neural Networks},
  year    = {1996},
  volume  = {7},
  number  = {1},
  pages   = {251--254}
}