Circular backpropagation networks for classification
@article{Ridella1997CircularBN,
  title={Circular backpropagation networks for classification},
  author={Sandro Ridella and Stefano Rovetta and Rodolfo Zunino},
  journal={IEEE Transactions on Neural Networks},
  year={1997},
  volume={8},
  number={1},
  pages={84-97}
}

The class of mapping networks is a general family of tools to perform a wide variety of tasks. This paper presents a standardized, uniform representation for this class of networks, and introduces a simple modification of the multilayer perceptron with interesting practical properties, especially well suited to cope with pattern classification tasks. The proposed model unifies the two main representation paradigms found in the class of mapping networks for classification, namely, the surface…
143 Citations
CBP networks as a generalized neural model
- Computer Science, Proceedings of International Conference on Neural Networks (ICNN'97)
- 1997
The proposed model unifies the two main representation paradigms found in the class of mapping networks for classification, namely, the surface-based and the prototype-based schemes, while retaining the advantage of being trainable by back-propagation.
Classification methodologies of multilayer perceptrons with sigmoid activation functions
- Computer Science, Pattern Recognit.
- 2005
Adaptive RBF neural networks for pattern classifications
- Computer Science, Proceedings of the 2002 International Joint Conference on Neural Networks. IJCNN'02 (Cat. No.02CH37290)
- 2002
A classification application shows that the proposed adaptive algorithm can optimally determine the structures and parameters of RBF-LBF networks according to the characteristics of the sample distribution, and achieves a higher convergence rate and classification precision, among other advantages, compared with feedforward two-layered LBF and RBF networks.
Classification, Association and Pattern Completion using Neural Similarity Based Methods
- Computer Science
- 2000
A framework for Similarity-Based Methods (SBMs) includes many classification models as special cases: neural network of the Radial Basis Function Networks type, Feature Space Mapping neurofuzzy…
Representation and generalization properties of class-entropy networks
- Computer Science, IEEE Trans. Neural Networks
- 1999
The paper proves several theoretical properties about the performance of CCE-based networks, and considers both convergence during training and generalization ability at run-time, and proposes analytical criteria and practical procedures to enhance the generalization performance of the trained networks.
Neural Networks from Similarity Based Perspective
- Computer Science
- 2000
A framework for Similarity-Based Methods (SBMs) includes many neural network models as special cases, useful not only for classification and approximation, but also as associative memories, in problems requiring pattern completion, offering an efficient way to deal with missing values.
Enhancing the Generalization Ability of Backpropagation Algorithm through Controlling the Outputs of the Hidden Layers
- Computer Science
- 2002
The proposed algorithm, which controls the outputs of the hidden layers, provides better generalization results than the basic backpropagation algorithm and than conventional regularization methods such as the Laplace and Gaussian regularizers.
Enhancing the generalization ability of neural networks through controlling the hidden layers
- Computer Science, Appl. Soft Comput.
- 2009
From multilayer perceptrons to radial basis function networks: a comparative study
- Computer Science, IEEE Conference on Cybernetics and Intelligent Systems, 2004
- 2004
A special additional input, which is the sum of the squares of the other inputs, is added to the standard multilayer perceptron, so that the multilayer perceptron works similarly as the radial basis…
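The augmentation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the paper; the function name `augment_circular` is invented here. The key property it demonstrates is that a single linear unit acting on the augmented input (original features plus their sum of squares) can realize a circular decision boundary in the original input space, which is what lets the same architecture behave like either a surface-based or a prototype-based classifier.

```python
import numpy as np

def augment_circular(X):
    """Append the sum of squares of each input vector as an extra feature,
    as in circular backpropagation (CBP) networks (illustrative sketch)."""
    X = np.asarray(X, dtype=float)
    sq = np.sum(X ** 2, axis=1, keepdims=True)  # ||x||^2 per sample
    return np.hstack([X, sq])

# Two sample points: one inside and one outside the unit circle.
X = np.array([[0.0, 0.0],
              [3.0, 0.0]])
Xa = augment_circular(X)

# A linear unit on the augmented input computes w.x + w_s*||x||^2 + b.
# Choosing w = (0, 0), w_s = 1, b = -1 gives the boundary ||x||^2 = 1,
# i.e. the unit circle centered at the origin.
w = np.array([0.0, 0.0, 1.0])
b = -1.0
scores = Xa @ w + b  # negative inside the circle, positive outside
```

Running this gives `scores = [-1.0, 8.0]`: the origin falls inside the circular boundary and the point (3, 0) outside, even though the unit itself is linear in the augmented space.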
References
Showing 1-10 of 47 references
Generalization and PAC learning: some new results for the class of generalized single-layer networks
- Computer Science, IEEE Trans. Neural Networks
- 1995
It is shown that the use of self-structuring techniques for GSLNs may reduce the number of training examples sufficient to guarantee good generalization performance, and an explanation for the fact that GSLNs can require a relatively large number of weights is provided.
On the Relationship between Generalization Error, Hypothesis Complexity, and Sample Complexity for Radial Basis Functions
- Computer Science, Neural Computation
- 1996
This article shows that the generalization error can be decomposed into two terms: the approximation error, due to the insufficient representational capacity of a finite sized network, and the estimation error,due to insufficient information about the target function because of the finite number of samples.
Neural networks for pattern recognition
- Computer Science
- 1995
This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Bounds on the number of hidden neurons in multilayer perceptrons
- Computer Science, IEEE Trans. Neural Networks
- 1991
A least upper bound is derived for the number of hidden neurons needed to realize an arbitrary function which maps from a finite subset of E^n into E^d, and a nontrivial lower bound is obtained for realizations of injective functions.
Boosting the Performance of RBF Networks with Dynamic Decay Adjustment
- Computer Science, NIPS
- 1994
The Dynamic Decay Adjustment (DDA) algorithm is introduced which utilizes the constructive nature of the P-RCE algorithm together with independent adaptation of each prototype's decay factor and is class dependent and distinguishes between different neighbours.
Automatic Capacity Tuning of Very Large VC-Dimension Classifiers
- Computer Science, NIPS
- 1992
It is shown that even high-order polynomial classifiers in high dimensional spaces can be trained with a small amount of training data and yet generalize better than classifiers with a smaller VC-dimension.
An introduction to computing with neural nets
- Computer Science, IEEE ASSP Magazine
- 1987
This paper provides an introduction to the field of artificial neural nets by reviewing six important neural net models that can be used for pattern classification and exploring how some existing classification and clustering algorithms can be performed using simple neuron-like components.
Counting Function Theorem for Multi-Layer Networks
- Computer Science, NIPS
- 1993
We show that a randomly selected N-tuple x of points of R^n with probability > 0 is such that any multilayer perceptron with the first hidden layer composed of h1 threshold logic units can…
Fast Learning in Networks of Locally-Tuned Processing Units
- Computer Science, Neural Computation
- 1989
We propose a network architecture which uses a single internal layer of locally-tuned processing units to learn both classification tasks and real-valued function approximations (Moody and Darken…