DOI: 10.1145/2739480.2754684
research-article

Optimal Parameter Choices Through Self-Adjustment: Applying the 1/5-th Rule in Discrete Settings

Published: 11 July 2015

ABSTRACT

While evolutionary algorithms are known to be very successful for a broad range of applications, the algorithm designer is often left with many algorithmic choices, for example, the size of the population, the mutation rates, and the crossover rates of the algorithm. These parameters are known to have a crucial influence on the optimization time, and thus need to be chosen carefully, a task that often requires substantial effort. Moreover, the optimal parameters can change during the optimization process. It is therefore of great interest to design mechanisms that dynamically choose best-possible parameters. An example of such an update mechanism is the one-fifth success rule for step-size adaptation in evolution strategies. While in continuous domains this principle is well understood also from a mathematical point of view, no comparable theory is available for problems in discrete domains. In this work we show that the one-fifth success rule can be effective also in discrete settings. We consider the (1+(λ,λ)) GA proposed in [Doerr/Doerr/Ebel: From black-box complexity to designing new genetic algorithms, TCS 2015]. We prove that if its population size is chosen according to the one-fifth success rule, then the expected optimization time on OneMax is linear. This is better than what any static population size λ can achieve, and is asymptotically optimal also among all adaptive parameter choices.
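The self-adjusting mechanism described in the abstract can be sketched in code. The following is a minimal, illustrative implementation, not the authors' exact algorithm: it runs the (1+(λ,λ)) GA on OneMax with mutation rate λ/n and crossover bias 1/λ, and adjusts λ by a one-fifth success rule. The update factor F = 1.5 and the F^(1/4) failure step are assumptions chosen so that λ stays constant at one success per four failures; the iteration cap and seed are arbitrary.

```python
import random

def onemax(x):
    """Number of one-bits; the fitness function analyzed in the paper."""
    return sum(x)

def self_adjusting_ollga(n, F=1.5, seed=0, max_iters=100_000):
    """(1+(λ,λ)) GA with one-fifth-rule self-adjustment of λ (sketch)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    lam = 1.0
    iters = 0
    while onemax(x) < n and iters < max_iters:
        iters += 1
        k = max(1, round(lam))          # offspring count per phase
        p = k / n                       # mutation rate λ/n
        # Mutation phase: sample a step size ℓ ~ Bin(n, λ/n), then create
        # λ offspring, each flipping ℓ randomly chosen bits of x.
        ell = sum(rng.random() < p for _ in range(n))
        best_mut, best_mut_f = None, -1
        for _ in range(k):
            y = x[:]
            for i in rng.sample(range(n), ell):
                y[i] ^= 1
            fy = onemax(y)
            if fy > best_mut_f:
                best_mut, best_mut_f = y, fy
        # Crossover phase: λ offspring of a biased uniform crossover that
        # takes each bit from the best mutant with probability 1/λ.
        best_cross, best_cross_f = x[:], onemax(x)
        for _ in range(k):
            z = [best_mut[i] if rng.random() < 1.0 / k else x[i]
                 for i in range(n)]
            fz = onemax(z)
            if fz > best_cross_f:
                best_cross, best_cross_f = z, fz
        # One-fifth success rule: shrink λ on strict improvement,
        # grow it gently (by F^(1/4)) otherwise.
        if best_cross_f > onemax(x):
            x = best_cross
            lam = max(1.0, lam / F)
        else:
            lam = min(float(n), lam * F ** 0.25)
    return onemax(x), iters

# Example: optimize OneMax on 50 bits.
fit, iters = self_adjusting_ollga(50)
```

The shrink-on-success / grow-on-failure asymmetry is the heart of the rule: near easy regions λ collapses toward 1, while in regions where improvements are rare λ grows, which is what yields the linear expected optimization time proved in the paper.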

References

  1. A. Auger. Benchmarking the (1+1) evolution strategy with one-fifth success rule on the BBOB-2009 function testbed. In Proc. of GECCO'09 (Companion), pages 2447--2452. ACM, 2009.
  2. A. Auger and N. Hansen. Linear convergence on positively homogeneous functions of a comparison based step-size adaptive randomized search: the (1+1) ES with generalized one-fifth success rule. CoRR, abs/1310.8397, 2013.
  3. S. Böttcher, B. Doerr, and F. Neumann. Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In Proc. of PPSN'10, volume 6238 of LNCS, pages 1--10. Springer, 2010.
  4. K. A. De Jong. An Analysis of the Behavior of a Class of Genetic Adaptive Systems. PhD thesis, 1975.
  5. L. Devroye. The compound random search. PhD thesis, 1972.
  6. B. Doerr. Analyzing randomized search heuristics: Tools from probability theory. In A. Auger and B. Doerr, editors, Theory of Randomized Search Heuristics, pages 1--20. World Scientific Publishing, 2011.
  7. B. Doerr and C. Doerr. Optimal parameter choices through self-adjustment: Applying the 1/5-th rule in discrete settings. CoRR, abs/1504.03212, 2015.
  8. B. Doerr and C. Doerr. A tight bound for the runtime of the (1+(λ,λ)) genetic algorithm on OneMax. In Proc. of GECCO'15. ACM, 2015.
  9. B. Doerr, C. Doerr, and F. Ebel. Lessons from the black-box: Fast crossover-based genetic algorithms. In Proc. of GECCO'13, pages 781--788. ACM, 2013.
  10. B. Doerr, C. Doerr, and F. Ebel. From black-box complexity to designing new genetic algorithms. Theoretical Computer Science, 567:87--104, 2015.
  11. B. Doerr, T. Jansen, D. Sudholt, C. Winzen, and C. Zarges. Mutation rate matters even when optimizing monotonic functions. Evolutionary Computation, 21:1--27, 2013.
  12. S. Droste, T. Jansen, and I. Wegener. On the analysis of the (1+1) evolutionary algorithm. Theoretical Computer Science, 276:51--81, 2002.
  13. S. Droste, T. Jansen, and I. Wegener. Upper and lower bounds for randomized search heuristics in black-box optimization. Theory of Computing Systems, 39:525--544, 2006.
  14. A. E. Eiben, R. Hinterding, and Z. Michalewicz. Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3:124--141, 1999.
  15. A. E. Eiben and J. E. Smith. Introduction to Evolutionary Computing. Springer Verlag, 2003.
  16. N. Hansen, A. Gawelczyk, and A. Ostermeier. Sizing the population with respect to the local progress in (1,λ)-evolution strategies -- a theoretical analysis. In Proc. of CEC'95, pages 80--85. IEEE, 1995.
  17. J. He and X. Yao. Drift analysis and average time complexity of evolutionary algorithms. Artificial Intelligence, 127:57--85, 2001.
  18. J. Jägersküpper. Rigorous runtime analysis of the (1+1) ES: 1/5-rule and ellipsoidal fitness landscapes. In Proc. of FOGA'05, volume 3469 of LNCS, pages 260--281. Springer, 2005.
  19. G. Karafotias, M. Hoogendoorn, and A. Eiben. Parameter control in evolutionary algorithms: Trends and challenges. IEEE Transactions on Evolutionary Computation, 19:167--187, 2015.
  20. J. Lässig and D. Sudholt. Adaptive population models for offspring populations and parallel evolutionary algorithms. In Proc. of FOGA'11, pages 181--192. ACM, 2011.
  21. R. Motwani and P. Raghavan. Randomized Algorithms. Cambridge University Press, 1995.
  22. I. Rechenberg. Evolutionsstrategie. Friedrich Fromman Verlag (Günther Holzboog KG), 1973.
  23. M. Schumer and K. Steiglitz. Adaptive step size random search. IEEE Transactions on Automatic Control, 13(3):270--276, 1968.
  24. D. Sudholt. Crossover speeds up building-block assembly. In Proc. of GECCO'12, pages 689--702. ACM, 2012.
  25. O. Teytaud and S. Gelly. General lower bounds for evolutionary algorithms. In Proc. of PPSN'06, volume 4193 of LNCS, pages 21--31. Springer, 2006.
  26. A. C.-C. Yao. Probabilistic computations: Toward a unified measure of complexity. In Proc. of FOCS'77, pages 222--227. IEEE, 1977.

Published in

GECCO '15: Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation
July 2015, 1496 pages
ISBN: 9781450334723
DOI: 10.1145/2739480
Copyright © 2015 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

GECCO '15 paper acceptance rate: 182 of 505 submissions, 36%. Overall acceptance rate: 1,669 of 4,410 submissions, 38%.
