ABSTRACT
First, covariance matrix adaptation (CMA) with rank-one update is introduced into the (1+1)-evolution strategy. An improved implementation of the 1/5-th success rule is proposed for step size adaptation; it replaces cumulative path length control. Second, an incremental Cholesky update of the covariance matrix is developed, which replaces the computationally demanding and numerically involved decomposition of the covariance matrix. The Cholesky update can replace the decomposition only for the rank-one update without evolution path, and it reduces the computational effort from O(n³) to O(n²). The resulting (1+1)-Cholesky-CMA-ES is an elegant algorithm and perhaps the simplest evolution strategy with covariance matrix and step size adaptation. Simulations compare the introduced algorithms to previously published CMA variants.
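The algorithm outlined above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' reference implementation: the constants `p_target`, `d`, `c_p`, and `c_cov` are assumed settings in the spirit of common CMA defaults, and the smoothed-success-rate step size rule stands in for the improved 1/5-th success rule. The key point is the incremental rank-one update applied directly to the Cholesky factor A (with C = A Aᵀ), which costs O(n²) per accepted step instead of an O(n³) decomposition:

```python
import numpy as np

def one_plus_one_cholesky_cma(f, x0, sigma0=1.0, max_evals=2000, seed=0):
    """Sketch of a (1+1)-ES with Cholesky-based rank-one covariance update.

    All strategy constants below are assumed, illustrative settings.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = len(x)
    sigma = sigma0
    A = np.eye(n)                 # Cholesky factor of the covariance, C = A A^T
    p_target = 2.0 / 11.0         # target success rate (roughly the 1/5-th rule)
    d = 1.0 + n / 2.0             # step size damping
    c_p = 1.0 / 12.0              # smoothing constant for the success rate
    c_cov = 2.0 / (n ** 2 + 6.0)  # covariance learning rate
    p_succ = p_target
    fx = f(x)

    for _ in range(max_evals):
        z = rng.standard_normal(n)      # isotropic sample
        y = x + sigma * (A @ z)         # candidate, distributed N(x, sigma^2 C)
        fy = f(y)
        success = fy <= fx
        # Smoothed success rate drives the step size (success-rule adaptation).
        p_succ = (1 - c_p) * p_succ + c_p * float(success)
        sigma *= np.exp((p_succ - p_target) / (d * (1 - p_target)))
        if success:
            x, fx = y, fy
            # Rank-one Cholesky update realizing C' = (1-c_cov) C + c_cov v v^T
            # with v = A z, performed directly on A in O(n^2); no inverse of A
            # is needed because z is known.
            a = np.sqrt(1 - c_cov)
            z2 = z @ z
            b = (a / z2) * (np.sqrt(1 + (c_cov / (1 - c_cov)) * z2) - 1)
            A = a * A + b * np.outer(A @ z, z)
    return x, fx
```

One can check algebraically that the factor update is exact: with A' = aA + b(Az)zᵀ, expanding A'A'ᵀ and substituting the chosen b gives (1 − c_cov)C + c_cov vvᵀ term by term.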
A computational efficient covariance matrix update and a (1+1)-CMA for evolution strategies