The EENCL algorithm [1] has been proposed as a method for designing neural network ensembles for classification tasks, combining global evolution with a local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. In order to better understand the success of EENCL, this work …
In this thesis, we present our investigation and development of neural network ensembles, which have attracted considerable research interest in machine learning and have many fields of application. More specifically, the thesis focuses on two important factors in ensembles: the diversity among ensemble members and regularization. Firstly, we investigate …
The EENCL algorithm [1] automatically designs neural network ensembles for classification, combining global evolution with local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. This paper analyses EENCL, finding that NCL is not an essential component of the algorithm …
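The NCL mechanism mentioned in these abstracts trains each ensemble member on its own error plus a correlation penalty against the ensemble mean. A minimal sketch of the standard per-member NCL loss follows; the function name `ncl_losses` and the use of a scalar target are illustrative assumptions, not part of the original papers' code:

```python
import numpy as np

def ncl_losses(preds, target, lam=0.5):
    """Per-member Negative Correlation Learning loss (illustrative sketch).

    preds  : array of shape (M,), outputs of the M ensemble members
    target : scalar desired output
    lam    : penalty strength lambda; lam = 0 reduces to independent training
    """
    f_bar = preds.mean()
    # NCL penalty: p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar).
    # Since the deviations from the mean sum to zero,
    # sum_{j != i} (f_j - f_bar) = -(f_i - f_bar), so p_i = -(f_i - f_bar)^2.
    penalty = -(preds - f_bar) ** 2
    return 0.5 * (preds - target) ** 2 + lam * penalty
```

With `lam = 0` each member minimizes its own squared error; a positive `lam` rewards members whose outputs deviate from the ensemble mean, which is what encourages diversity.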