Learn More
The EENCL algorithm [1] has been proposed as a method for designing neural network ensembles for classification tasks, combining global evolution with a local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. In order to better understand the success of EENCL, this work …
The EENCL algorithm [1] automatically designs neural network ensembles for classification, combining global evolution with local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. This paper analyses EENCL, finding that NCL is not an essential component of the algorithm …
Manual design of neural network ensembles is a time-consuming and complex task. We aim to develop methods to automatically evolve ensembles to solve arbitrary classification tasks. An existing algorithm (EENCL) has been evaluated with regard to the accuracy and diversity of the created ensembles. Prior to these studies, the parameters of the algorithm have been …
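Several of the abstracts above refer to Negative Correlation Learning (NCL), the penalty-based training rule used inside EENCL. As a quick orientation, the following is a minimal NumPy sketch of the per-network NCL loss, not code from any of the cited papers; the function name, signature, and default lambda value are illustrative assumptions. Each member's squared error is augmented with a correlation penalty that pushes its output away from the ensemble mean.

```python
import numpy as np

def ncl_losses(outputs, target, lam=0.5):
    """Per-network Negative Correlation Learning loss for a single sample.

    outputs : shape (M,) array of member predictions F_i
    target  : scalar desired output d
    lam     : penalty strength lambda (lam = 0 recovers independent training)
    """
    f_bar = outputs.mean()                      # ensemble mean F_bar
    # Correlation penalty p_i = (F_i - F_bar) * sum_{j != i} (F_j - F_bar),
    # which simplifies to -(F_i - F_bar)^2 because the deviations sum to zero.
    penalty = -(outputs - f_bar) ** 2
    return 0.5 * (outputs - target) ** 2 + lam * penalty

# Example: three ensemble members predicting the same target value
losses = ncl_losses(np.array([0.2, 0.5, 0.9]), target=0.4)
```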