ABSTRACT
For genetic algorithms (GAs) using a bit-string representation of length n, the general recommendation is to take 1/n as mutation rate. In this work, we discuss whether this is justified for multi-modal functions. Taking jump functions and the (1+1) evolutionary algorithm (EA) as the simplest example, we observe that larger mutation rates give significantly better runtimes. For the Jump_{m,n} function, any mutation rate between 2/n and m/n leads to a speedup at least exponential in m compared to the standard choice.
The asymptotically best runtime, obtained from using the mutation rate m/n and leading to a speed-up super-exponential in m, is very sensitive to small changes of the mutation rate. Any deviation by a small (1 ± ε) factor leads to a slow-down exponential in m. Consequently, any fixed mutation rate gives strongly sub-optimal results for most jump functions.
Building on this observation, we propose to use a random mutation rate α/n, where α is chosen from a power-law distribution. We prove that the (1+1) EA with this heavy-tailed mutation rate optimizes any Jump_{m,n} function in a time that is only a small polynomial (in m) factor above the one stemming from the optimal rate for this m. Our heavy-tailed mutation operator yields similar speedups (over the best known performance guarantees) for the vertex cover problem in bipartite graphs and the matching problem in general graphs.
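To make the operator concrete, the following is a minimal sketch of a heavy-tailed mutation step inside a (1+1) EA. The function names, the power-law exponent β = 1.5, and the cap of α at n/2 are illustrative choices consistent with the description above, not code from the paper: α is drawn with probability proportional to α^(-β), and each bit is then flipped independently with probability α/n.

```python
import random

def heavy_tailed_mutation(x, beta=1.5):
    """Draw alpha from a power-law distribution over {1, ..., n//2}
    (P[alpha] proportional to alpha^(-beta)), then flip each bit of x
    independently with probability alpha/n."""
    n = len(x)
    alphas = list(range(1, n // 2 + 1))
    weights = [a ** (-beta) for a in alphas]
    alpha = random.choices(alphas, weights=weights, k=1)[0]
    rate = alpha / n
    return [1 - bit if random.random() < rate else bit for bit in x]

def one_plus_one_ea(fitness, n, budget=10000, beta=1.5):
    """(1+1) EA: keep the mutated offspring whenever it is
    at least as fit as the current parent."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    for _ in range(budget):
        y = heavy_tailed_mutation(x, beta)
        fy = fitness(y)
        if fy >= fx:  # elitist acceptance
            x, fx = y, fy
    return x, fx
```

Because small α values still receive constant probability mass, the operator behaves like standard 1/n mutation most of the time, while occasionally performing the larger multi-bit jumps needed to leave local optima such as those of Jump_{m,n}.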
Following the example of fast simulated annealing, fast evolution strategies, and fast evolutionary programming, we propose to call genetic algorithms using a heavy-tailed mutation operator fast genetic algorithms.