Evolution through Large Models

@article{Lehman2022EvolutionTL,
  title={Evolution through Large Models},
  author={Joel Lehman and Jonathan Gordon and Shawn Jain and Kamal Ndousse and Cathy Yeh and Kenneth O. Stanley},
  journal={ArXiv},
  year={2022},
  volume={abs/2206.08896}
}
This paper pursues the insight that large language models (LLMs) trained to generate code can vastly improve the effectiveness of mutation operators applied to programs in genetic programming (GP). Because such LLMs benefit from training data that includes sequential changes and modifications, they can approximate likely changes that humans would make. To highlight the breadth of implications of such evolution through large models (ELM), in the main experiment ELM combined with MAP-Elites… 
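The abstract's core mechanism is using an LLM as the mutation operator inside a quality-diversity loop. A minimal Python sketch of that idea follows, assuming a hypothetical llm_complete() stand-in for any code-generation model; the diff-style prompt and the toy MAP-Elites archive are illustrative only, not the paper's exact implementation.

import random

def llm_complete(prompt: str) -> str:
    """Hypothetical stand-in for a code-generation LLM call."""
    raise NotImplementedError("connect a real model here")

def llm_mutate(program: str) -> str:
    """LLM-based mutation: ask the model for a plausible, human-like
    edit of the parent program instead of a random token change."""
    prompt = (
        "# Original program:\n"
        f"{program}\n"
        "# A slightly modified version of the program above:\n"
    )
    return llm_complete(prompt)

def elm_map_elites(seed: str, evaluate, descriptor, iterations: int = 1000):
    """Toy MAP-Elites loop: one elite per behavior cell, with the LLM
    supplying mutations of randomly chosen elites."""
    archive = {descriptor(seed): (evaluate(seed), seed)}
    for _ in range(iterations):
        _, parent = random.choice(list(archive.values()))
        child = llm_mutate(parent)
        cell, fitness = descriptor(child), evaluate(child)
        if cell not in archive or fitness > archive[cell][0]:
            archive[cell] = (fitness, child)  # new or improved elite
    return archive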

Knowledge-Driven Program Synthesis via Adaptive Replacement Mutation and Auto-constructed Subprogram Archives

TLDR
A novel method based on PushGP is proposed to solve the knowledge-driven program synthesis (KDPS) problem; it treats subprograms as knowledge and achieves lower training error, a higher success count, and faster convergence than standard PushGP.
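The TLDR compresses the method considerably; as a rough illustration, a minimal Python sketch of replacement mutation over an auto-constructed subprogram archive follows. The nested-list program representation, function names, and harvesting rule are assumptions made here for clarity; the paper's method operates on PushGP programs and includes an adaptive policy governing replacement, which this sketch omits.

import random

# A Push-style program sketched as a nested list of instruction tokens.
Program = list

def harvest_subprograms(program: Program, archive: list) -> None:
    """Auto-construct the archive: treat sublists of a solved task's
    program as reusable knowledge for later synthesis runs."""
    for item in program:
        if isinstance(item, list):
            archive.append(item)
            harvest_subprograms(item, archive)

def replacement_mutation(program: Program, archive: list) -> Program:
    """Replace one randomly chosen element of the parent with a
    subprogram drawn from the archive (uniform selection here; the
    paper adapts the choice during the run)."""
    if not archive or not program:
        return list(program)
    child = list(program)
    child[random.randrange(len(child))] = random.choice(archive)
    return child

# Example: harvest knowledge from one solved program, mutate another.
knowledge: list = []
harvest_subprograms(["in0", ["dup", "int_add"], "int_mult"], knowledge)
child = replacement_mutation(["in0", "int_add"], knowledge)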