Most methods for the evolutionary generation of multi-layer perceptron classifiers use a divide-and-conquer strategy, in which feature selection, structure design, and weight training are performed separately. Concurrent evolution of the whole classifier has seldom been attempted, and its effectiveness has never been exhaustively benchmarked.…
This paper presents a novel version of the bees algorithm. This version is characterized by an extended set of search operators, and a mechanism that protects the most recently generated solutions from competition with more evolved individuals. Compared to the standard implementation of the bees algorithm, the new procedure requires the selection of an…
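The snippet above describes the protection mechanism only in outline, so the following is a hypothetical sketch rather than the authors' procedure: a minimal bees algorithm (scout bees for global search, recruit bees for local search around elite sites) in which solutions younger than a `young_age` threshold are exempted from the survival competition. The operator set, parameter names, and the age threshold itself are illustrative assumptions.

```python
import random

def sphere(x):
    """Toy objective to minimize."""
    return sum(v * v for v in x)

def bees_algorithm(f, dim=2, n_scouts=10, n_best=3, n_recruits=5,
                   young_age=3, iters=50, seed=0):
    """Minimal bees algorithm with age-based protection of young solutions.
    Solutions younger than `young_age` iterations do not compete for elite
    sites and are never replaced by scouts (an assumed realization of the
    paper's protection mechanism, not its exact operators)."""
    rng = random.Random(seed)

    def scout():
        return {"x": [rng.uniform(-5, 5) for _ in range(dim)], "age": 0}

    pop = [scout() for _ in range(n_scouts)]
    for _ in range(iters):
        for s in pop:
            s["fit"] = f(s["x"])
            s["age"] += 1
        # Only mature solutions compete for the elite-site slots.
        mature = [s for s in pop if s["age"] >= young_age]
        young = [s for s in pop if s["age"] < young_age]
        mature.sort(key=lambda s: s["fit"])
        new_pop = []
        for site in mature[:n_best]:
            # Local search: recruit bees sample the site's neighbourhood.
            best = site
            for _ in range(n_recruits):
                cand = {"x": [v + rng.gauss(0, 0.3) for v in site["x"]],
                        "age": 0}
                cand["fit"] = f(cand["x"])
                if cand["fit"] < best["fit"]:
                    best = cand
            new_pop.append(best)
        # Young solutions survive unconditionally; remaining slots are
        # refilled with fresh scouts (global search).
        new_pop.extend(young)
        while len(new_pop) < n_scouts:
            new_pop.append(scout())
        pop = new_pop
    return min(pop, key=lambda s: f(s["x"]))

best = bees_algorithm(sphere)
```

Because improved candidates enter the population with age zero, they are sheltered from competition for a few iterations before being compared against more evolved sites.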
This paper presents FeaSANNT, an evolutionary procedure for feature selection and weight training for neural network classifiers. FeaSANNT exploits the global nature of evolutionary search to avoid sub-optimal peaks of performance. FeaSANNT was used to train a multi-layer perceptron classifier on seven benchmark problems. FeaSANNT attained accurate and…
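The core idea of combined feature selection and weight training can be sketched as a genome pairing a binary feature mask with an MLP weight vector, evolved jointly. This is a minimal illustration of that idea, not FeaSANNT itself: the encoding, mutation rates, network size, and synthetic task are all assumptions made for the example.

```python
import math
import random

def make_data(rng, n=120):
    # Tiny synthetic task: the class depends on features 0 and 1 only;
    # features 2 and 3 are noise the evolved mask should learn to drop.
    data = []
    for _ in range(n):
        x = [rng.uniform(-1, 1) for _ in range(4)]
        data.append((x, 1 if x[0] + x[1] > 0 else 0))
    return data

def predict(weights, mask, x, hidden=3):
    # One-hidden-layer MLP; masked-out features are zeroed.
    xin = [v if m else 0.0 for v, m in zip(x, mask)]
    n_in = len(xin)
    h = []
    for j in range(hidden):
        s = weights[j * (n_in + 1)]  # hidden-unit bias
        for i, v in enumerate(xin):
            s += weights[j * (n_in + 1) + 1 + i] * v
        h.append(math.tanh(s))
    off = hidden * (n_in + 1)
    out = weights[off] + sum(weights[off + 1 + j] * h[j] for j in range(hidden))
    return 1 if out > 0 else 0

def accuracy(weights, mask, data):
    return sum(predict(weights, mask, x) == y for x, y in data) / len(data)

def evolve(data, n_feat=4, hidden=3, pop=20, gens=60, seed=1):
    """Joint evolution of (weights, feature mask) by truncation selection:
    the top half survives unchanged, each parent spawns one mutated child."""
    rng = random.Random(seed)
    n_w = hidden * (n_feat + 1) + hidden + 1
    def individual():
        return ([rng.uniform(-1, 1) for _ in range(n_w)],
                [rng.random() < 0.5 for _ in range(n_feat)])
    P = [individual() for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda ind: -accuracy(ind[0], ind[1], data))
        parents = P[: pop // 2]
        children = []
        for w, m in parents:
            cw = [v + rng.gauss(0, 0.2) for v in w]     # weight mutation
            cm = [b ^ (rng.random() < 0.1) for b in m]  # mask bit-flip
            children.append((cw, cm))
        P = parents + children
    return max(P, key=lambda ind: accuracy(ind[0], ind[1], data))

rng = random.Random(0)
data = make_data(rng)
best_w, best_mask = evolve(data)
```

Keeping the parents unchanged each generation makes the best training accuracy monotone non-decreasing, one simple way an evolutionary search avoids being trapped by a single bad mutation.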
We show how to make intensive use of local cone approximations to obtain results in several areas of optimization theory, such as optimality conditions, constraint qualifications, mean value theorems, and error bounds.