Learn More
Sigmoidal or radial transfer functions guarantee neither the best generalization nor fast learning of neural networks. Families of parameterized transfer functions provide flexible decision borders. Networks based on such transfer functions should be small and accurate. Several possibilities of using transfer functions of different types in neural models are …
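One way to picture such a parameterized family is a bicentral-style function, a product of two opposing sigmoids whose center, slope, and width are all adaptive, giving window-like decision borders that a plain sigmoid cannot produce. This is an illustrative sketch only; the function name and exact parameterization are assumptions, not definitions taken from the paper:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def bicentral(x, center=0.0, slope=1.0, width=1.0):
    # Product of two opposing sigmoids: a localized, window-like
    # transfer function whose center, slope and width are adaptive.
    left = sigmoid(slope * (x - center + width))
    right = sigmoid(-slope * (x - center - width))
    return left * right
```

With `slope` and `width` free, the same unit can interpolate between a soft rectangular window and a nearly sigmoidal edge, which is the kind of flexibility the abstract refers to.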
Despite all the progress in the neural networks field, the technology is brittle and sometimes difficult to apply. Good initialization of adaptive parameters in neural networks and optimization of the architecture are the key factors in creating robust neural networks. Methods of initialization of MLPs are reviewed, and new methods based on clustering techniques are …
The choice of transfer functions may strongly influence the complexity and performance of neural networks. Although sigmoidal transfer functions are the most common, there is no a priori reason why models based on such functions should always provide optimal decision borders. A large number of alternative transfer functions have been described in the literature. …
Incremental Net Pro (IncNet Pro), with local learning and statistically controlled growing and pruning of the network, is introduced. The architecture of the net is based on RBF networks. The Extended Kalman Filter algorithm and its new fast version are proposed and used as the learning algorithm. IncNet Pro is similar to the Resource Allocation Network …
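For the output weights of an RBF network, which enter the output linearly, one Extended Kalman Filter step reduces to recursive least squares. The sketch below illustrates that single update step under stated assumptions (the function names, the scalar-input Gaussian basis, and the noise variance `r` are invented for the example); it is not IncNet Pro's actual algorithm, which also grows and prunes units:

```python
import numpy as np

def rbf_features(x, centers, width=1.0):
    # Gaussian basis activations for a scalar input x.
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

def ekf_update(w, P, h, y, r=0.01):
    # One Kalman-filter step for the output weights of an RBF net.
    # w: weight vector, P: weight covariance, h: basis activations,
    # y: target, r: assumed observation-noise variance.
    s = h @ P @ h + r              # innovation variance
    k = P @ h / s                  # Kalman gain
    w = w + k * (y - h @ w)        # weight update toward the residual
    P = P - np.outer(k, h) @ P     # covariance update: (I - k h^T) P
    return w, P
```

Because each step costs only a few matrix-vector products, the filter processes examples one at a time, which is what makes statistically controlled on-line growing and pruning practical.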
Initialization of adaptive parameters in neural networks is of crucial importance to the speed of convergence of the learning procedure. Methods of initialization for density networks are reviewed, and two new methods, based on decision trees and dendrograms, are presented. These two methods were applied in the Feature Space Mapping framework to artificial …
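The dendrogram idea can be sketched as follows: agglomerate the training data bottom-up until a chosen number of clusters remains, then use the cluster means as initial Gaussian centers. This is a minimal single-linkage illustration under assumed details (the function name, the linkage choice, and the O(n^3) brute-force merge loop are all simplifications, not the paper's method):

```python
import numpy as np

def dendrogram_centers(X, k):
    # Single-linkage agglomeration down to k clusters; cluster means
    # then serve as initial Gaussian centers for a density network.
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > k:
        best, best_d = (0, 1), np.inf
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # Single linkage: distance between closest members.
                d = min(np.linalg.norm(X[i] - X[j])
                        for i in clusters[a] for j in clusters[b])
                if d < best_d:
                    best_d, best = d, (a, b)
        a, b = best
        clusters[a].extend(clusters.pop(b))
    return np.array([X[c].mean(axis=0) for c in clusters])
```

Starting the learning procedure from such data-driven centers, rather than random ones, is what speeds up convergence.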
This paper is a continuation of the accompanying paper with the same main title. The first paper reviewed instance selection algorithms; here, results of an empirical comparison and comments are presented. Several tests were performed, mostly on benchmark datasets from the machine learning repository at UCI. Instance selection algorithms were tested with neural …
Several methods have been proposed to reduce the number of instances (vectors) in the learning set. Some of them extract only the bad vectors, while others try to remove as many instances as possible without significant degradation of the reduced dataset for learning. Several strategies to shrink training sets are compared here using different neural and machine …
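A classic strategy of this kind is Hart's condensed nearest neighbour rule: keep only those instances that a 1-NN classifier built from the current subset would misclassify, repeating until the subset stabilizes. A minimal sketch under assumptions (the helper name, the fixed pass limit, and seeding with the first instance are illustrative choices):

```python
import numpy as np

def condensed_nn(X, y, passes=3):
    # Hart's condensed nearest neighbour: grow a subset containing
    # only instances misclassified by 1-NN over the subset so far.
    keep = [0]                       # seed with the first instance
    for _ in range(passes):
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            d = np.linalg.norm(X[keep] - X[i], axis=1)
            nearest = keep[int(np.argmin(d))]
            if y[nearest] != y[i]:   # misclassified: must be kept
                keep.append(i)
                changed = True
        if not changed:              # subset is consistent; stop
            break
    return sorted(keep)
```

On well-separated classes this keeps only a few border and seed points, shrinking the training set drastically while the reduced set still classifies every original instance correctly.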
The choice of transfer functions in neural networks is of crucial importance to their performance. Although sigmoidal transfer functions are the most common, there is no a priori reason why they should be optimal in all cases. In this article the advantages of various neural transfer functions are discussed and several new types of functions are introduced. …