Hyperparameter optimisation in differential evolution using Summed Local Difference Strings, A Rugged But Easily Calculated Landscape For Combinatorial Search Problems
H. S. Pannu
Department of Computer Science and Engineering, Thapar Institute
Patiala, India 147004
hspannu@thapar.edu

Douglas B. Kell
Research Chair in Systems Biology / Director of GeneMill
University of Liverpool, UK L69 3BX
Douglas.Kell@liverpool.ac.uk
Abstract
We analyse the effectiveness of differential evolution hyperparameters in large-scale search problems, i.e. those with very many variables or vector elements, using a novel objective function that is easily calculated from the vector/string itself. The objective function is simply the sum of the absolute differences between adjacent elements. For both binary and real-valued elements whose smallest and largest values are min and max, in a vector of length N the objective function ranges between 0 and (N-1) × (max-min), and can thus easily be normalised if desired. The effects of string length, population size, and number of generations have been studied. Finally, a neural network is trained by systematically varying three hyperparameters, viz. population size (NP), mutation factor (F) and crossover rate (CR); two output target variables are collected, (a) the median and (b) the maximum cost function value from 10-trial experiments, and the results are compared with SMAC3 and Optuna against grid and random search.
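As a minimal illustrative sketch (not code from the paper), the objective function described above can be computed as follows. The function name sld_cost and the use of NumPy are our assumptions; absolute differences are used, as this is the only reading consistent with the stated range of 0 to (N-1) × (max-min).

import numpy as np

def sld_cost(x, normalise=False):
    """Summed Local Difference cost: the sum of absolute differences
    between adjacent elements of the vector/string x."""
    x = np.asarray(x, dtype=float)
    cost = np.sum(np.abs(np.diff(x)))
    if normalise:
        # The maximum attainable value for a length-N vector is
        # (N - 1) * (max - min), as stated in the abstract.
        span = (len(x) - 1) * (x.max() - x.min())
        return cost / span if span > 0 else 0.0
    return cost

# Binary example of length N = 5: an alternating string attains the maximum.
print(sld_cost([0, 1, 0, 1, 0]))                  # 4.0 = (5 - 1) * (1 - 0)
print(sld_cost([0, 1, 0, 1, 0], normalise=True))  # 1.0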
Key words
Rugged landscape, differential evolution, neural networks, machine learning, optimisation
Digital Object Identifier (DOI)
https://doi.org/10.2298/CSIS240628061P
How to cite
Pannu, H. S., Kell, D. B.: Hyperparameter optimisation in differential evolution using Summed Local Difference Strings, A Rugged But Easily Calculated Landscape For Combinatorial Search Problems. Computer Science and Information Systems, https://doi.org/10.2298/CSIS240628061P