
Publication

Large-Scale Evolutionary Optimization of Artificial Neural Networks Using Adaptive Mutations

Rune Krauss; Jan Zielasko; Rolf Drechsler
In: Proceedings of the 7th Workshop on Accelerated Machine Learning (AccML-2025), co-located with HiPEAC 2025, Barcelona, Spain, January 21, 2025.

Abstract

Big data forms the basis for training and testing models in Machine Learning (ML). The more data are available, the more effectively and accurately ML models can process them. Interest in ML has increased significantly as datasets grow rapidly due to technological progress. One of the most popular ML models is the Artificial Neural Network (ANN), which is used in numerous areas such as medicine and the video game industry to analyze data and make decisions accordingly. In applications such as automatic speech recognition, sufficient labeled samples are available for supervised learning. Other applications, such as robot control, define an overall target while the expected output for a given input can be ambiguous, rendering supervised learning inapplicable. Neuroevolution, a branch of artificial intelligence that evolves ANNs using genetic algorithms, was created to address this type of problem. However, one of its main issues is the mutation operator, which generally operates randomly. This hampers the search for performant ANNs and can easily disrupt feasible solutions. To tackle this issue, this paper proposes adaptive mutations for ANN optimization. Experiments on several large-scale benchmarks show that the proposed approach evolves efficient ANNs and clearly outperforms related work.
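To make the idea of adaptive mutation concrete, the following is a minimal, hypothetical sketch of one classic scheme (Rechenberg's 1/5 success rule), where the mutation step size shrinks or grows depending on how often recent mutations improved fitness. This is an illustration of the general principle, not the specific operator proposed in the paper:

```python
import random

def adaptive_mutate(weights, sigma, success_rate, target=0.2, factor=1.5):
    """Gaussian-mutate a flat list of ANN weights with an adaptive step size.

    Hypothetical illustration: `sigma` grows when more than `target` of
    recent mutations improved fitness (exploration pays off), and shrinks
    otherwise (protect feasible solutions from disruption).
    """
    # Adapt the step size based on the observed success rate
    sigma = sigma * factor if success_rate > target else sigma / factor
    # Apply zero-mean Gaussian noise with the adapted step size
    mutated = [w + random.gauss(0.0, sigma) for w in weights]
    return mutated, sigma
```

In a full neuroevolution loop, `success_rate` would be tracked over a sliding window of offspring evaluations, so the operator perturbs weights aggressively only while that is actually productive.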