DOI: 10.1145/3067695.3075987
poster

Introducing the cumulation to the population based incremental learning and the compact GA to relax genetic drift

Published: 15 July 2017

Abstract

The population based incremental learning (PBIL) and the compact genetic algorithm (cGA) are Bernoulli-distribution-based search algorithms for the optimization of binary variables. The probability parameter that controls the probability of each bit being one is updated based on the ranking of the population sampled from the current distribution. We often observe that some parameters move randomly and tend to converge towards undesired values. Such behavior is called genetic drift, and it occurs when the variance of the parameter update is large compared to its expectation. To prevent genetic drift, the learning rate for the parameter update needs to be sufficiently small, but this results in slow convergence. In this paper, we propose a mechanism to detect genetic drift and prevent parameters from being updated randomly. For this purpose, we introduce the so-called cumulation that is employed in the covariance matrix adaptation evolution strategy (CMA-ES). Experimental results show that the cumulation allows PBIL and cGA to use a greater learning rate for the parameter update, leading to a speed-up, especially on functions with high redundancy such as the LeadingOnes function.
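As a rough illustration only (not the authors' exact update rule), the sketch below shows a minimal PBIL-style loop over a Bernoulli probability vector together with a hypothetical cumulation path in the spirit of the CMA-ES evolution path: the parameter update is accumulated over iterations, and a bit's probability is only moved when its accumulated signal is large relative to what pure random drift would produce. The function names, the `drift_threshold` parameter, and the accumulation constant `c` are assumptions for illustration.

```python
import numpy as np

def pbil_with_cumulation(f, n, budget, lam=20, mu=10, lr=0.1,
                         c=0.2, drift_threshold=1.0, seed=0):
    """Minimal PBIL-style sketch with a hypothetical cumulation path.

    f: objective to maximize over {0,1}^n.
    The path s accumulates normalized parameter updates; a bit's
    probability p[i] is only moved when |s[i]| exceeds a threshold,
    i.e. when the accumulated signal does not look like random drift.
    (Illustrative only; not the exact mechanism from the paper.)
    """
    rng = np.random.default_rng(seed)
    p = np.full(n, 0.5)          # Bernoulli parameters, one per bit
    s = np.zeros(n)              # cumulation (evolution-path-like) vector
    best_x, best_f = None, -np.inf

    for _ in range(budget // lam):
        # Sample a population from the current Bernoulli distribution.
        X = (rng.random((lam, n)) < p).astype(int)
        fitness = np.array([f(x) for x in X])
        elite = X[np.argsort(fitness)[-mu:]]          # best mu samples

        if fitness.max() > best_f:
            best_f = float(fitness.max())
            best_x = X[np.argmax(fitness)].copy()

        # Raw PBIL-style update direction: elite frequency minus current p.
        delta = elite.mean(axis=0) - p

        # Accumulate the (roughly normalized) update into the path.
        s = (1 - c) * s + c * delta / np.maximum(p * (1 - p), 1e-8) ** 0.5

        # Only update bits whose accumulated signal exceeds the drift threshold.
        mask = np.abs(s) > drift_threshold * np.sqrt(c / (2 - c))
        p = np.where(mask, np.clip(p + lr * delta, 1 / n, 1 - 1 / n), p)

    return best_x, best_f

if __name__ == "__main__":
    # Example: maximize LeadingOnes on 50 bits.
    def leading_ones(x):
        return int(np.cumprod(x).sum())

    x, fx = pbil_with_cumulation(leading_ones, n=50, budget=20000)
    print(fx)
```

The drift test above only freezes a bit's probability while its accumulated signal is small; once the elite samples consistently favor one value, the path grows and the update proceeds, which is the intuition behind allowing a larger learning rate.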

References

[1] Shumeet Baluja and Rich Caruana. 1995. Removing the genetics from the standard genetic algorithm. In Proceedings of the 12th International Conference on Machine Learning. 38--46.
[2] Nikolaus Hansen and Andreas Ostermeier. 2001. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation 9, 2 (2001), 159--195.
[3] Georges R. Harik, Fernando G. Lobo, and David E. Goldberg. 1999. The compact genetic algorithm. IEEE Transactions on Evolutionary Computation 3, 4 (1999), 287--297.
[4] Dirk Sudholt and Carsten Witt. 2016. Update strength in EDAs and ACO: how to avoid genetic drift. In Proceedings of the 2016 Genetic and Evolutionary Computation Conference. ACM, 61--68.


Published In

GECCO '17: Proceedings of the Genetic and Evolutionary Computation Conference Companion
July 2017
1934 pages
ISBN: 978-1-4503-4939-0
DOI: 10.1145/3067695
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. binary optimization
  2. compact genetic algorithm
  3. cumulation
  4. genetic drift
  5. population based incremental learning

Qualifiers

  • Poster

Conference

GECCO '17

Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions (38%)
