DOI: 10.1145/1830483.1830573
Research article

Improved step size adaptation for the MO-CMA-ES

Published: 07 July 2010

ABSTRACT

The multi-objective covariance matrix adaptation evolution strategy (MO-CMA-ES) is an evolutionary algorithm for continuous vector-valued optimization. It combines indicator-based selection relying on the contributing hypervolume with the efficient strategy parameter adaptation of the elitist covariance matrix adaptation evolution strategy (CMA-ES). Step sizes (i.e., mutation strengths) are adapted at the individual level using an improved implementation of the 1/5-th success rule. In the original MO-CMA-ES, a mutation is regarded as successful if the offspring ranks better than its parent in the elitist, rank-based selection procedure. In contrast, we propose to regard a mutation as successful if the offspring is selected into the next parental population. This criterion is easier to implement and reduces the computational complexity of the MO-CMA-ES, in particular of its steady-state variant. The new step size adaptation improves the performance of the MO-CMA-ES, as shown empirically on a large set of benchmark functions. In general, the new update scheme leads to larger step sizes and thereby counteracts premature convergence. The experiments comprise the first evaluation of the MO-CMA-ES on problems with more than two objectives.
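The step-size mechanism discussed in the abstract can be sketched as follows. This is a minimal illustration of the smoothed 1/5-th success rule as used in elitist CMA-ES variants, not the paper's exact implementation: the paper's contribution concerns *which* mutations count as successful (selection into the next parental population rather than a parent-offspring rank comparison), while the update equation itself follows the standard form. Parameter defaults (`c_p`, `d`, `p_target`) are illustrative assumptions.

```python
import math

def update_step_size(sigma, p_succ_avg, success,
                     c_p=0.1, d=2.0, p_target=0.1815):
    """One step-size update per individual (sketch, not the paper's code).

    sigma       -- current step size (mutation strength) of the individual
    p_succ_avg  -- exponentially smoothed success probability
    success     -- True if the offspring counts as successful; under the
                   proposed criterion: selected into the next population
    """
    # Exponentially smooth the binary success indicator.
    p_succ_avg = (1.0 - c_p) * p_succ_avg + c_p * (1.0 if success else 0.0)
    # Grow sigma when the smoothed success rate exceeds the target rate,
    # shrink it otherwise; d damps the change.
    sigma *= math.exp((p_succ_avg - p_target) / (d * (1.0 - p_target)))
    return sigma, p_succ_avg
```

Because the proposed success criterion only checks membership in the new parental population, it reuses the outcome of environmental selection and avoids an extra ranking of offspring against their parents.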


Published in
GECCO '10: Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation
July 2010, 1520 pages
ISBN: 9781450300728
DOI: 10.1145/1830483
Copyright © 2010 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Overall acceptance rate: 1,669 of 4,410 submissions (38%)
