First Steps Towards a Runtime Analysis of Neuroevolution

Paul Fischer, Emil Lundt Larsen, Carsten Witt

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

We consider a simple setting in neuroevolution where an evolutionary algorithm optimizes the weights and activation functions of a simple artificial neural network. We then define simple example functions to be learned by the network and conduct rigorous runtime analyses for networks with a single neuron and for a more advanced structure with several neurons and two layers. Our results show that the proposed algorithm is generally efficient on two example problems designed for one neuron and efficient with at least constant probability on the example problem for a two-layer network. In particular, the so-called harmonic mutation operator, which chooses steps of size j with probability proportional to 1/j, turns out to be a good choice for the underlying search space. However, for the case of one neuron, we also identify situations with hard-to-overcome local optima. Experimental investigations of our neuroevolutionary algorithm and a state-of-the-art CMA-ES support the theoretical findings.
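The harmonic mutation operator mentioned above can be illustrated with a minimal sketch. The function names, the step-size bound max_step, and the discretization via granularity are illustrative assumptions for this sketch and are not taken from the paper; only the sampling rule (step size j drawn with probability proportional to 1/j) follows the abstract.

```python
import random


def harmonic_step(max_step: int) -> int:
    """Sample a step size j in {1, ..., max_step} with probability
    proportional to 1/j (harmonic distribution)."""
    # Normalizing constant: the max_step-th harmonic number.
    h = sum(1.0 / j for j in range(1, max_step + 1))
    u = random.random() * h
    acc = 0.0
    for j in range(1, max_step + 1):
        acc += 1.0 / j
        if u <= acc:
            return j
    return max_step  # numerical safety fallback


def mutate_weight(weight: float, max_step: int, granularity: float) -> float:
    """Perturb a single network weight by +/- j * granularity,
    where j is harmonically distributed (hypothetical discretized setting)."""
    j = harmonic_step(max_step)
    sign = random.choice((-1, 1))
    return weight + sign * j * granularity
```

The point of the harmonic distribution is that small steps are most likely, yet steps of any size up to max_step still occur with non-negligible probability, which helps on search spaces where both fine-tuning and occasional larger jumps are needed.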
Original language: English
Title of host publication: Proceedings of the 17th ACM/SIGEVO Conference on Foundations of Genetic Algorithms
Publisher: Association for Computing Machinery
Publication date: 2023
Pages: 61-72
ISBN (Print): 979-8-4007-0202-0
DOIs
Publication status: Published - 2023
Event: 17th ACM/SIGEVO Conference on Foundations of Genetic Algorithms - Potsdam, Germany
Duration: 30 Aug 2023 - 1 Sept 2023

Conference

Conference: 17th ACM/SIGEVO Conference on Foundations of Genetic Algorithms
Country/Territory: Germany
City: Potsdam
Period: 30/08/2023 - 01/09/2023

Keywords

  • Neuroevolution
  • Theory
  • Runtime analysis
