Abstract
We consider a simple setting in neuroevolution where an evolutionary algorithm optimizes the weights and activation functions of a simple artificial neural network. We then define simple example functions to be learned by the network and conduct rigorous runtime analyses for networks with a single neuron and for a more advanced structure with several neurons and two layers. Our results show that the proposed algorithm is generally efficient on two example problems designed for one neuron, and efficient with at least constant probability on the example problem for a two-layer network. In particular, the so-called harmonic mutation operator, which chooses steps of size j with probability proportional to 1/j, turns out to be a good choice for the underlying search space. However, for the case of one neuron, we also identify situations with hard-to-overcome local optima. Experimental investigations of our neuroevolutionary algorithm and a state-of-the-art CMA-ES support the theoretical findings.
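The harmonic mutation operator mentioned in the abstract samples a step of size j with probability proportional to 1/j. A minimal sketch of such an operator is shown below; the maximum step size `r`, the clamping range `[0, r]`, and the function names are illustrative assumptions, not the paper's exact definitions.

```python
import random


def harmonic_step(r):
    """Sample j in {1, ..., r} with P(j) proportional to 1/j.

    The normalising constant is the harmonic number H_r = sum_{i=1}^r 1/i.
    """
    h_r = sum(1.0 / i for i in range(1, r + 1))
    u = random.random() * h_r  # uniform point in [0, H_r)
    cum = 0.0
    for j in range(1, r + 1):
        cum += 1.0 / j
        if u <= cum:
            return j
    return r  # guard against floating-point rounding


def harmonic_mutate(x, r):
    """Move x by a harmonically distributed step in a random direction,
    clamped to the (assumed) discrete range [0, r]."""
    step = harmonic_step(r)
    sign = random.choice((-1, 1))
    return min(max(x + sign * step, 0), r)
```

Because P(j) decays only like 1/j, large jumps remain reasonably likely, which is what makes this operator attractive for search spaces where the distance to the optimum is unknown.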
Original language | English |
---|---|
Title of host publication | Proceedings of the 17th ACM/SIGEVO Conference on Foundations of Genetic Algorithms |
Publisher | Association for Computing Machinery |
Publication date | 2023 |
Pages | 61-72 |
ISBN (Print) | 979-8-4007-0202-0 |
DOIs | |
Publication status | Published - 2023 |
Event | 17th ACM/SIGEVO Conference on Foundations of Genetic Algorithms, Potsdam, Germany. Duration: 30 Aug 2023 → 1 Sept 2023 |
Conference
Conference | 17th ACM/SIGEVO Conference on Foundations of Genetic Algorithms |
---|---|
Country/Territory | Germany |
City | Potsdam |
Period | 30/08/2023 → 01/09/2023 |
Keywords
- Neuroevolution
- Theory
- Runtime analysis