Evolving Neural Networks through a Reverse Encoding Tree

@article{Zhang2020EvolvingNN,
  title={Evolving Neural Networks through a Reverse Encoding Tree},
  author={Haoling Zhang and Chao-Han Huck Yang and Hector Zenil and Narsis Aftab Kiani and Yue Shen and Jesper N. Tegner},
  journal={2020 IEEE Congress on Evolutionary Computation (CEC)},
  year={2020},
  pages={1-10}
}
NeuroEvolution is one of the most competitive evolutionary learning strategies for designing novel neural networks for specific tasks, such as logic circuit design and digital gaming. However, the application of benchmark methods such as NeuroEvolution of Augmenting Topologies (NEAT) remains a challenge in terms of computational cost and search-time inefficiency. This paper advances a method which incorporates a type of topological edge coding, named Reverse Encoding Tree (RET…

Modified Neural Architecture Search (NAS) Using the Chromosome Non-Disjunction

A deep neural network structuring methodology based on a genetic algorithm (GA) that introduces chromosome non-disjunction as a new genetic operation. The operation combines a destructive approach, similar to pruning methodologies, with a constructive approach, enabling tuning of a previously evolved neural network architecture.

Survey on Evolutionary Deep Learning: Principles, Algorithms, Applications and Open Issues

This paper aims to analyze EDL from the perspective of automated machine learning (AutoML) and regard EDL as an optimization problem, and systematically introduces EDL methods ranging from feature engineering, model generation, to model deployment with a new taxonomy.

Variational quantum reinforcement learning via evolutionary optimization

A hybrid framework is proposed where the quantum RL agents are equipped with a hybrid tensor network-variational quantum circuit (TN-VQC) architecture to handle inputs of dimensions exceeding the number of qubits, enabling further quantum RL applications on noisy intermediate-scale quantum devices.

Using Neural Networks to Identify Features Associated with HIV Nef Protein and Cancer

Using a dataset of well curated Nef sequences, evolved neural networks are trained to classify sequences as having originated from cancer or non-cancer samples to identify features that are possibly associated with Nef protein in HIV and its relation to cancer.

Students’ SS 2021 contributions: 186.837 Seminar of Computer Vision and Pattern Recognition

This technical report presents a selection of papers submitted by students in the course Seminar of Computer Vision and Pattern Recognition (SE 186.837) of the Pattern Recognition and Image

References


Evolving Neural Networks through Augmenting Topologies

A method is presented, NeuroEvolution of Augmenting Topologies (NEAT), which outperforms the best fixed-topology method on a challenging benchmark reinforcement learning task and shows how it is possible for evolution to both optimize and complexify solutions simultaneously.
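The core NEAT mechanism summarized above can be illustrated with a minimal sketch. This is not the authors' implementation: the genome representation, function names, and the global innovation counter here are simplified assumptions, shown only to make the "augmenting topologies" idea concrete (each structural mutation is tagged with a historical innovation number so genomes remain alignable for crossover).

```python
import random

# Illustrative NEAT-style genome: a list of connection genes, each
# carrying a global "innovation" number. All names here are
# hypothetical simplifications, not the NEAT source code.

innovation_counter = 0

def next_innovation():
    global innovation_counter
    innovation_counter += 1
    return innovation_counter

def add_connection(genome, in_node, out_node, weight):
    genome.append({"in": in_node, "out": out_node,
                   "weight": weight, "enabled": True,
                   "innovation": next_innovation()})

def add_node(genome, gene_index, new_node):
    # Split an existing connection: disable it, then bridge it with
    # two new connections. Weight 1.0 into the new node keeps the
    # network's behavior unchanged at the moment of mutation.
    gene = genome[gene_index]
    gene["enabled"] = False
    add_connection(genome, gene["in"], new_node, 1.0)
    add_connection(genome, new_node, gene["out"], gene["weight"])

genome = []
add_connection(genome, 0, 2, 0.5)   # input 0 -> output 2
add_node(genome, 0, new_node=3)     # split it with hidden node 3
```

The point of the innovation numbers is that two genomes mutated independently can still be lined up gene-by-gene during crossover, which is what lets NEAT complexify solutions without losing compatibility.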

Blocky Net: A New NeuroEvolution Method

A new network, Blocky Net, is proposed with built-in feature selection and a limited maximum parameter space and complexity; it performs better on 13 of the 20 datasets tested, versus 2 for FS-NEAT, and is better than NEAT in all cases.

Safe mutations for deep and recurrent neural networks through output gradients

A family of safe mutation (SM) operators that facilitate exploration without dramatically altering network behavior or requiring additional interaction with the environment are proposed, which dramatically increases the ability of a simple genetic algorithm-based neuroevolution method to find solutions in high-dimensional domains that require deep and/or recurrent neural networks.
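The gradient-scaled mutation idea can be sketched as follows. This is a toy illustration under stated assumptions, not the paper's operators: the single-neuron network and the finite-difference sensitivity estimate are stand-ins for the output gradients the paper computes by backpropagation.

```python
import math
import random

# Sketch of the safe-mutation-through-gradients idea: scale each
# weight perturbation down by the sensitivity of the network output
# to that weight, so mutations that would drastically change
# behavior are dampened. Toy single-output network; illustrative only.

def network_output(w, x):
    return math.tanh(sum(wi * xi for wi, xi in zip(w, x)))

def output_sensitivity(w, x, eps=1e-4):
    # Finite-difference |d output / d w_i| for each weight
    # (a stand-in for the backpropagated gradient).
    grads = []
    for i in range(len(w)):
        hi, lo = list(w), list(w)
        hi[i] += eps
        lo[i] -= eps
        grads.append(abs(network_output(hi, x) - network_output(lo, x)) / (2 * eps))
    return grads

def safe_mutate(w, x, sigma=0.1):
    sens = output_sensitivity(w, x)
    # Divide each Gaussian perturbation by (sensitivity + 1):
    # high-sensitivity weights receive smaller changes.
    return [wi + random.gauss(0, sigma) / (s + 1.0)
            for wi, s in zip(w, sens)]

random.seed(0)
w = [0.5, -0.3, 0.8]
x = [1.0, 2.0, -1.0]
mutated = safe_mutate(w, x)
```

The design choice is that no extra environment interaction is needed: sensitivity is computed from the network itself, which is what makes the operator cheap enough for evolution.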

Evolving Reusable Neural Modules

A coevolutionary modular neuroevolution method, Modular NeuroEvolution of Augmenting Topologies (Modular NEAT), is developed that automatically performs this decomposition during evolution, making evolutionary search more efficient.

Designing neural networks through neuroevolution

This Review looks at several key aspects of modern neuroevolution, including large-scale computing, the benefits of novelty and diversity, the power of indirect encoding, and the field’s contributions to meta-learning and architecture search.

Natural Evolution Speciation for NEAT

A new speciation technique is proposed, called Natural Evolution NEAT (NENEAT), that replaces NEAT’s speciation strategy with a new strategy inspired by cladistics, and can find smaller network solutions in fewer generations and population member evaluations than NEAT using the same number of children.

Automatic feature selection in neuroevolution

A novel method called FS-NEAT is presented which extends the NEAT neuroevolution method to automatically determine an appropriate set of inputs for the networks it evolves to address the feature selection problem without relying on meta-learning or labeled data.
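The distinctive FS-NEAT starting condition can be shown in a few lines. The genome encoding below is a hypothetical simplification, but the key idea is faithful to the summary above: instead of connecting every input, each initial genome links one randomly chosen input to the output, so evolution adds further input connections only when they improve fitness.

```python
import random

# FS-NEAT-style initialization (illustrative): each genome starts
# with a single connection from one random input to the output,
# leaving feature selection to evolution. Encoding is hypothetical.

def initial_genome(num_inputs, output_node):
    chosen = random.randrange(num_inputs)
    return [{"in": chosen, "out": output_node,
             "weight": random.uniform(-1.0, 1.0), "enabled": True}]

random.seed(1)
g = initial_genome(num_inputs=10, output_node=10)
```

Contrast this with standard NEAT, whose initial genomes connect all inputs to all outputs; starting minimal is what lets FS-NEAT discover a relevant input subset without meta-learning or labeled data.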

Controllability, Multiplexing, and Transfer Learning in Networks using Evolutionary Learning

This study suggests that network-based computations of steady-state functions, representing either cellular modules of cell-to-cell communication networks or internal molecular circuits communicating within a cell, could be a powerful model for biologically inspired computing.

Genetic synthesis of Boolean neural networks with a cell rewriting developmental process

  • F. Gruau
  • Computer Science
  • [Proceedings] COGANN-92: International Workshop on Combinations of Genetic Algorithms and Neural Networks
  • 1992
Genetic algorithms (GAs) are used to generate neural networks that implement Boolean functions by using chromosomes that encode an algorithmic description based upon a cell rewriting grammar, giving modular and interpretable architectures with a powerful scalability property.

Does Aligning Phenotypic and Genotypic Modularity Improve the Evolution of Neural Networks?

Results suggest encouraging modularity in both the genotype and phenotype as an important step towards solving large-scale multi-modal problems, but also indicate that more research is required before the authors can evolve structurally organized networks to solve tasks that require multiple, different neural modules.
...