ABSTRACT
While evolutionary algorithms are known to be very successful for a broad range of applications, the algorithm designer is often left with many algorithmic choices, for example, the size of the population, the mutation rate, and the crossover rate. These parameters are known to have a crucial influence on the optimization time and thus need to be chosen carefully, a task that often requires substantial effort. Moreover, the optimal parameters can change during the optimization process. It is therefore of great interest to design mechanisms that dynamically choose best-possible parameters. An example of such an update mechanism is the one-fifth success rule for step-size adaptation in evolution strategies. While in continuous domains this principle is well understood, also from a mathematical point of view, no comparable theory is available for problems in discrete domains. In this work we show that the one-fifth success rule can also be effective in discrete settings. We consider the (1+(λ,λ)) GA proposed in [Doerr/Doerr/Ebel: From black-box complexity to designing new genetic algorithms, TCS 2015]. We prove that if its population size is chosen according to the one-fifth success rule, then the expected optimization time on OneMax is linear. This is better than what any static population size λ can achieve, and it is asymptotically optimal also among all adaptive parameter choices.
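Since the abstract only names the mechanism, the following minimal Python sketch illustrates the self-adjusting (1+(λ,λ)) GA on OneMax. The update factor F = 1.5, the cap λ ≤ n, and all helper names (one_max, flip, cross, self_adjusting_ollga) are illustrative assumptions, not taken from the paper, which analyzes the update rule rather than prescribing an implementation.

```python
import random

def one_max(x):
    """OneMax fitness: the number of one-bits in x."""
    return sum(x)

def flip(x, ell):
    """Return a copy of x with ell distinct positions flipped."""
    y = list(x)
    for i in random.sample(range(len(x)), ell):
        y[i] = 1 - y[i]
    return y

def cross(x, xp, c):
    """Biased uniform crossover: take each bit from xp with probability c, else from x."""
    return [b2 if random.random() < c else b1 for b1, b2 in zip(x, xp)]

def binomial(n, p):
    """Sample from Bin(n, p)."""
    return sum(random.random() < p for _ in range(n))

def self_adjusting_ollga(n, F=1.5):
    """(1+(lambda,lambda)) GA on OneMax with lambda controlled by the 1/5-th rule (sketch)."""
    x = [random.randint(0, 1) for _ in range(n)]
    lam, evals = 1.0, 0
    while one_max(x) < n:
        L = max(1, round(lam))          # population size lambda, rounded to an integer
        p, c = L / n, 1.0 / L           # mutation rate p = lambda/n, crossover bias c = 1/lambda
        # Mutation phase: one mutation strength ell for all lambda offspring.
        ell = binomial(n, p)
        xp = max((flip(x, ell) for _ in range(L)), key=one_max)
        # Crossover phase: lambda biased crossovers of x with the best mutant.
        y = max((cross(x, xp, c) for _ in range(L)), key=one_max)
        evals += 2 * L
        # One-fifth success rule: shrink lambda on strict improvement,
        # grow it by the fourth root of F otherwise.
        if one_max(y) > one_max(x):
            x, lam = y, max(lam / F, 1.0)
        else:
            if one_max(y) == one_max(x):
                x = y                   # elitist acceptance of equal fitness
            lam = min(lam * F ** 0.25, n)
    return evals
```

Under these assumptions, calling self_adjusting_ollga(1000) should report an evaluation count growing roughly linearly in n, in line with the linear expected optimization time the paper proves; no static choice of λ achieves this.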
REFERENCES
- A. Auger. Benchmarking the (1+1) evolution strategy with one-fifth success rule on the BBOB-2009 function testbed. In Proc. of GECCO'09 (Companion), pages 2447--2452. ACM, 2009.
- A. Auger and N. Hansen. Linear convergence on positively homogeneous functions of a comparison based step-size adaptive randomized search: the (1+1) ES with generalized one-fifth success rule. CoRR, abs/1310.8397, 2013.
- S. Böttcher, B. Doerr, and F. Neumann. Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In Proc. of PPSN'10, volume 6238 of LNCS, pages 1--10. Springer, 2010.
- K. A. De Jong. An Analysis of the Behavior of a Class of Genetic Adaptive Systems. PhD thesis, 1975.
- L. Devroye. The compound random search. PhD thesis, 1972.
- B. Doerr. Analyzing randomized search heuristics: Tools from probability theory. In A. Auger and B. Doerr, editors, Theory of Randomized Search Heuristics, pages 1--20. World Scientific Publishing, 2011.
- B. Doerr and C. Doerr. Optimal parameter choices through self-adjustment: Applying the 1/5-th rule in discrete settings. CoRR, abs/1504.03212, 2015.
- B. Doerr and C. Doerr. A tight bound for the runtime of the (1+(λ,λ)) genetic algorithm on OneMax. In Proc. of GECCO'15. ACM, 2015.
- B. Doerr, C. Doerr, and F. Ebel. Lessons from the black-box: Fast crossover-based genetic algorithms. In Proc. of GECCO'13, pages 781--788. ACM, 2013.
- B. Doerr, C. Doerr, and F. Ebel. From black-box complexity to designing new genetic algorithms. Theoretical Computer Science, 567:87--104, 2015.
- B. Doerr, T. Jansen, D. Sudholt, C. Winzen, and C. Zarges. Mutation rate matters even when optimizing monotonic functions. Evolutionary Computation, 21:1--27, 2013.
- S. Droste, T. Jansen, and I. Wegener. On the analysis of the (1+1) evolutionary algorithm. Theoretical Computer Science, 276:51--81, 2002.
- S. Droste, T. Jansen, and I. Wegener. Upper and lower bounds for randomized search heuristics in black-box optimization. Theory of Computing Systems, 39:525--544, 2006.
- A. E. Eiben, R. Hinterding, and Z. Michalewicz. Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3:124--141, 1999.
- A. E. Eiben and J. E. Smith. Introduction to Evolutionary Computing. Springer Verlag, 2003.
- N. Hansen, A. Gawelczyk, and A. Ostermeier. Sizing the population with respect to the local progress in (1,λ)-evolution strategies - a theoretical analysis. In Proc. of CEC'95, pages 80--85. IEEE, 1995.
- J. He and X. Yao. Drift analysis and average time complexity of evolutionary algorithms. Artificial Intelligence, 127:57--85, 2001.
- J. Jägersküpper. Rigorous runtime analysis of the (1+1) ES: 1/5-rule and ellipsoidal fitness landscapes. In Proc. of FOGA'05, volume 3469 of LNCS, pages 260--281. Springer, 2005.
- G. Karafotias, M. Hoogendoorn, and A. E. Eiben. Parameter control in evolutionary algorithms: Trends and challenges. IEEE Transactions on Evolutionary Computation, 19:167--187, 2015.
- J. Lässig and D. Sudholt. Adaptive population models for offspring populations and parallel evolutionary algorithms. In Proc. of FOGA'11, pages 181--192. ACM, 2011.
- R. Motwani and P. Raghavan. Randomized Algorithms. Cambridge University Press, 1995.
- I. Rechenberg. Evolutionsstrategie. Friedrich Frommann Verlag (Günther Holzboog KG), 1973.
- M. Schumer and K. Steiglitz. Adaptive step size random search. IEEE Transactions on Automatic Control, 13(3):270--276, 1968.
- D. Sudholt. Crossover speeds up building-block assembly. In Proc. of GECCO'12, pages 689--702. ACM, 2012.
- O. Teytaud and S. Gelly. General lower bounds for evolutionary algorithms. In Proc. of PPSN'06, volume 4193 of LNCS, pages 21--31. Springer, 2006.
- A. C.-C. Yao. Probabilistic computations: Toward a unified measure of complexity. In Proc. of FOCS'77, pages 222--227. IEEE, 1977.