The sensitivity of HyperNEAT to different geometric representations of a problem

@inproceedings{Clune2009TheSO,
  title={The sensitivity of HyperNEAT to different geometric representations of a problem},
  author={Jeff Clune and Charles Ofria and Robert T. Pennock},
  booktitle={Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation},
  year={2009}
}
HyperNEAT, a generative encoding for evolving artificial neural networks (ANNs), has the unique and powerful ability to exploit the geometry of a problem (e.g., symmetries) by encoding ANNs as a function of a problem's geometry. This paper provides the first extensive analysis of the sensitivity of HyperNEAT to different geometric representations of a problem. Understanding how geometric representations affect the quality of evolved solutions should improve future designs of such… 
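The encoding idea described above can be sketched in a few lines: a function of neuron coordinates produces each connection weight, so regularities in the function (here, mirror symmetry) appear directly in the weight pattern. The sketch below is illustrative only; the toy function stands in for an evolved CPPN and is not from the paper.

```python
import math

# Toy stand-in for an evolved CPPN: maps the coordinates of a source
# neuron (x1, y1) and a target neuron (x2, y2) to a connection weight.
# Because cos is an even function of x1, the weight pattern it paints
# is mirror-symmetric about x = 0.
def toy_cppn(x1, y1, x2, y2):
    return math.cos(math.pi * x1) * math.exp(-(y1 - y2) ** 2)

# A tiny 1-D substrate: five input neurons spread along x, one output above.
inputs = [-1.0, -0.5, 0.0, 0.5, 1.0]
weights = [toy_cppn(x, 0.0, 0.0, 1.0) for x in inputs]

# Mirrored input positions receive (numerically) identical weights.
assert abs(weights[0] - weights[4]) < 1e-12
assert abs(weights[1] - weights[3]) < 1e-12
```

Changing how neuron positions are assigned (the geometric representation) changes which regularities such a function can exploit, which is the sensitivity this paper analyzes.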

Citations

Learning From Geometry In Learning For Tactical And Strategic Decision Domains

This dissertation presents a new NE algorithm called Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT), based on a novel indirect encoding of ANNs, designed to work in tactical and strategic decision domains.

On the Performance of Indirect Encoding Across the Continuum of Regularity

This paper presents the first comprehensive study showing that phenotypic regularity enables an indirect encoding to outperform direct encoding controls as problem regularity increases, and suggests a path forward that combines indirect encodings with a separate process of refinement.

Evolving neural networks that are both modular and regular: HyperNEAT plus the connection cost technique

It is shown that adding the connection cost technique to HyperNEAT produces neural networks that are significantly more modular, regular, and higher performing than HyperNEAT without a connection cost, even when compared to a variant of HyperNEAT that was specifically designed to encourage modularity.

Investigating whether hyperNEAT produces modular neural networks

The first documented case of HyperNEAT producing a modular phenotype is presented, but the inability to encourage modularity on harder problems where modularity would have been beneficial suggests that more work is needed to increase the likelihood that HyperNEAT and similar algorithms produce modular ANNs in response to challenging, decomposable problems.

Evolving Neural Networks for Visual Processing

Positive results were achieved for non-trivial tasks, and some important characteristics of HyperNEAT not previously reported were discovered: a bias towards generating weight patterns aligned with the topographical arrangement of neurons within the layers of the substrate network; the ability to evolve solutions more quickly with larger substrate networks; and a possible difficulty co-evolving weight patterns for substrate networks containing more than one hidden layer.

Autonomous Evolution of Topographic Regularities in Artificial Neural Networks

This letter shows that when geometry is introduced to evolved ANNs through the hypercube-based neuroevolution of augmenting topologies algorithm, they begin to acquire characteristics that indeed are reminiscent of biological brains.

Evolving robot gaits in hardware: the HyperNEAT generative encoding vs. parameter optimization

This paper compares the performance of two classes of gait-learning algorithms, locally searching parameterized motion models and evolving artificial neural networks with the HyperNEAT generative encoding, along with a new method that builds a model of the fitness landscape with linear regression to guide further exploration.

Evolving Gaits for Physical Robots with the HyperNEAT Generative Encoding: The Benefits of Simulation

This paper tested the hypothesis that the beneficial properties of HyperNEAT would allow it to outperform the simpler encoding if HyperNEAT gaits are first evolved in simulation before being transferred to reality; the hypothesis was confirmed, resulting in the fastest gaits yet observed for this robot.

Toward evolving robust, deliberate motion planning with HyperNEAT

Results demonstrate that although HyperNEAT was not able to achieve results as robust as a hand-designed approach, the best strategy was comparable, with just a 3–4% drop in performance.
...

References

Showing 1-10 of 19 references

A Case Study on the Critical Role of Geometric Regularity in Machine Learning

It is argued that geometric information is critical to the ability of any machine learning approach to effectively generalize; even a small shift in the configuration of the task in space from what was experienced in training can go wholly unrecognized unless the algorithm is able to learn the regularities in decision-making across the problem geometry.

Evolving coordinated quadruped gaits with the HyperNEAT generative encoding

It is demonstrated that HyperNEAT, a new and promising generative encoding for evolving neural networks, can evolve quadruped gaits without an engineer manually decomposing the problem.

A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks

The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.

Generative representations for the automated design of modular physical robots

This work demonstrates an automatic design system that produces complex robots by exploiting the principles of regularity, modularity, hierarchy, and reuse, and demonstrates for the first time the evolution and construction of modular, three-dimensional, physically locomoting robots.

How a Generative Encoding Fares as Problem-Regularity Decreases

As the regularity of the problem decreases, the performance of the generative representation degrades to, and then underperforms, the direct encoding, yet tends to be consistent for different types of problem regularity.

Modular neuroevolution for multilegged locomotion

The results suggest that the modular approach is effective for designing robust locomotion controllers for multilegged robots and scales well when the number of legs or their degrees of freedom are increased.

Generative encoding for multiagent learning

The Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) generative approach to evolving neurocontrollers learns a set of coordinated policies encoded by a single genome representing a team of predator agents that work together to capture prey.

Automatic Definition of Modular Neural Networks

F. Gruau. Adaptive Behavior, 1994.

An artificial developmental system is presented that is a computationally efficient technique for the automatic generation of complex artificial neural networks (ANNs), with simulation results showing that the same problem cannot be solved if the mechanism for automatic definition of subnetworks is suppressed.

Evolving 3D Morphology and Behavior by Competition

This article describes a system for the evolution and coevolution of virtual creatures that compete in physically simulated three-dimensional worlds; as the creatures evolve simultaneously, they can adapt to each other.