Evolving neural networks that are both modular and regular: HyperNEAT plus the connection cost technique

@inproceedings{Huizinga2014EvolvingNN,
  title={Evolving neural networks that are both modular and regular: HyperNEAT plus the connection cost technique},
  author={Joost Huizinga and Jeff Clune and Jean-Baptiste Mouret},
  booktitle={Proceedings of the 2014 Annual Conference on Genetic and Evolutionary Computation},
  year={2014}
}
One of humanity's grand scientific challenges is to create artificially intelligent robots that rival natural animals in intelligence and agility. A key enabler of such animal complexity is the fact that animal brains are structurally organized in that they exhibit modularity and regularity, amongst other attributes. Modularity is the localization of function within an encapsulated unit. Regularity refers to the compressibility of the information describing a structure, and typically involves… 
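The technique named in the title pairs task performance with a connection-cost objective in a multi-objective evolutionary search; selection pressure towards cheap wiring favors sparse networks, which in turn tend to decompose into modules. Below is a minimal sketch of that idea in Python, assuming a simple connection-count cost and a plain Pareto-dominance test; the Network class and helper names are illustrative, not the paper's implementation (which uses HyperNEAT-generated networks and a probabilistic variant of NSGA-II).

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Network:
    weights: List[List[float]]   # weights[i][j] != 0 expresses connection i -> j
    performance: float = 0.0     # task performance, filled in by evaluation

def connection_cost(net: Network) -> float:
    # Simple proxy: count expressed connections. The original work also
    # considers geometric wiring length as the cost.
    return sum(1 for row in net.weights for w in row if w != 0.0)

def objectives(net: Network) -> Tuple[float, float]:
    # Both objectives are maximized, so the cost is negated.
    return (net.performance, -connection_cost(net))

def dominates(a: Network, b: Network) -> bool:
    # Pareto dominance: a is no worse on every objective and strictly
    # better on at least one. Multi-objective selection schemes such as
    # NSGA-II rank the population with tests like this one.
    oa, ob = objectives(a), objectives(b)
    return all(x >= y for x, y in zip(oa, ob)) and any(x > y for x, y in zip(oa, ob))

# Example: a dense net is dominated by a sparse net of equal performance.
dense  = Network(weights=[[0.5, 0.5], [0.5, 0.5]], performance=0.9)
sparse = Network(weights=[[0.5, 0.0], [0.0, 0.5]], performance=0.9)
print(dominates(sparse, dense))  # True: same performance, lower cost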

Citations

Does Aligning Phenotypic and Genotypic Modularity Improve the Evolution of Neural Networks?

Results suggest that encouraging modularity in both the genotype and phenotype is an important step towards solving large-scale multi-modal problems, but also indicate that more research is required before structurally organized networks can be evolved to solve tasks that require multiple, different neural modules.

Evolving programs to build artificial neural networks

This chapter evolves a pair of programs that build the network: one runs inside neurons, allowing them to move, change, die, or replicate, while the other runs inside dendrites, allowing them to change length and weight, be removed, or replicate.

Guiding Neuroevolution with Structural Objectives

This work proposes two new structural objectives and tests their ability to guide evolving neural networks on two problems that can benefit from decomposition into subtasks, finding that both methods perform well on a problem with a very clear and decomposable structure.

Modularity in NEAT Reinforcement Learning Networks

It was shown that the ideal level of network modularity in the explored parameter space is highly dependent on other network variables, dispelling theories that modularity has a straightforward relationship to network performance and demonstrating that rewarding modularity directly did not improve fitness.

Evolving parsimonious networks by mixing activation functions

This work extends the neuroevolution algorithm NEAT to evolve the activation function of neurons in addition to the topology and weights of the network, and shows that the resulting heterogeneous networks are significantly smaller than homogeneous networks.

Information-theoretic neuro-correlates boost evolution of cognitive systems

It is found that judiciously chosen neuro-correlates can significantly help genetic algorithms find the highest peaks of the fitness landscape, in particular if the landscape is rugged and contains multiple peaks.

IMPROBED: Multiple Problem-Solving Brain via Evolved Developmental Programs

J. Miller, Artificial Life, 2022

A simple neural model, called IMPROBED, is described, in which two neural programs construct an artificial brain that can simultaneously solve multiple computational problems.

Using Indirect Encoding of Multiple Brains to Produce Multimodal Behavior

Novel multimodal extensions to HyperNEAT, a popular indirect encoding, for evolving many brains without assuming geometric relationships between them are introduced, and it is shown that multi-brain approaches are more effective than HyperNEAT without multimodal extensions, and that brains without a geometric relation to each other outperform situational policy geometry.

How Biological Concepts and Evolutionary Theories Are Inspiring Advances in Machine Intelligence

This work shows the numerous connections between theories based on coevolution, multi-level selection, modularity, and competition and related developments in artificial neural networks (ANNs), which have become commonplace in machine learning.

Evolving the Behavior of Machines: From Micro to Macroevolution

References

Showing 1-10 of 38 references.

Investigating whether hyperNEAT produces modular neural networks

The first documented case of HyperNEAT producing a modular phenotype is presented, but the inability to encourage modularity on harder problems where modularity would have been beneficial suggests that more work is needed to increase the likelihood that HyperNEAT and similar algorithms produce modular ANNs in response to challenging, decomposable problems.

On the Relationships between Generative Encodings, Regularity, and Learning Abilities when Evolving Plastic Artificial Neural Networks

The results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks, and reveal the consequences of the bias of developmental encodings towards regular structures.

Evolving scalable and modular adaptive networks with Developmental Symbolic Encoding

A novel developmental encoding for networks is proposed, featuring scalability, modularity, regularity, and hierarchy; it can represent the structural regularities of networks and build them from encapsulated and possibly reused subnetworks.

A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks

The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.

Generating large-scale neural networks through discovering geometric regularities

A method called Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) evolves a novel generative encoding, connective Compositional Pattern Producing Networks (connective CPPNs), to discover geometric regularities in the task domain, allowing the solution to both generalize and scale without loss of function to an ANN of over eight million connections.
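As this reference describes, HyperNEAT derives an ANN by querying the evolved CPPN with the geometric coordinates of every pair of substrate neurons and using the output as that connection's weight; connections whose magnitude falls below a threshold are left unexpressed. A minimal sketch in Python, with a hand-written stand-in for the evolved CPPN and an assumed expression threshold of 0.1:

import math

def cppn(x1, y1, x2, y2):
    # Stand-in for an evolved CPPN: any smooth function of the four
    # coordinates produces geometrically regular weight patterns
    # (this one is symmetric about the substrate's x-axis midline).
    return math.sin(x1 * x2) * math.exp(-abs(y1 - y2))

def substrate_weights(src_layer, dst_layer, threshold=0.1):
    # Query the CPPN for every (source, target) neuron pair; outputs
    # whose magnitude falls below the threshold are not expressed.
    weights = {}
    for (x1, y1) in src_layer:
        for (x2, y2) in dst_layer:
            w = cppn(x1, y1, x2, y2)
            if abs(w) > threshold:
                weights[((x1, y1), (x2, y2))] = w
    return weights

# Example: a three-neuron input row connected to a two-neuron output row;
# the center neuron's connections fall below threshold and are dropped.
inputs = [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0)]
outputs = [(-0.5, 1.0), (0.5, 1.0)]
print(substrate_weights(inputs, outputs))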

Constraining connectivity to encourage modularity in HyperNEAT

This paper investigates how altering the traditional approach to determining whether connections are expressed in HyperNEAT influences modularity, and provides an important clue to how an indirect encoding of network structure can be encouraged to evolve modularity.

On the Performance of Indirect Encoding Across the Continuum of Regularity

This paper presents the first comprehensive study showing that phenotypic regularity enables an indirect encoding to outperform direct encoding controls as problem regularity increases, and suggests a path forward that combines indirect encodings with a separate process of refinement.

Evolving coordinated quadruped gaits with the HyperNEAT generative encoding

It is demonstrated that HyperNEAT, a new and promising generative encoding for evolving neural networks, can evolve quadruped gaits without an engineer manually decomposing the problem.

Evolving symmetric and modular neural networks for distributed control

The group-theoretic symmetry mutations of ENSO were also significantly more effective at evolving such controllers than random symmetry mutations, making the approach more evolvable.

Evolving Neural Networks through Augmenting Topologies

A method is presented, NeuroEvolution of Augmenting Topologies (NEAT), which outperforms the best fixed-topology method on a challenging benchmark reinforcement learning task and shows how it is possible for evolution to both optimize and complexify solutions simultaneously.