ABSTRACT
The core of hypervolume-based multi-objective evolutionary algorithms is an archiving algorithm that performs the environmental selection. A (μ+λ)-archiving algorithm defines how to choose μ children from μ parents and λ offspring together. We theoretically study (μ+λ)-archiving algorithms that never decrease the hypervolume from one generation to the next. Zitzler, Thiele, and Bader (IEEE Trans. Evolutionary Computation, 14:58--79, 2010) proved that all (μ+1)-archiving algorithms are ineffective, meaning that there is an initial population from which, independent of the reproduction rule used, a set with maximum hypervolume cannot be reached. We extend this result and prove that for λ < μ all archiving algorithms are ineffective. On the other hand, locally optimal algorithms, which maximize the hypervolume in each step, are effective for λ = μ and always find a population with hypervolume at least half the optimum for λ < μ.
We also prove that there is no hypervolume-based archiving algorithm which can always find a population with hypervolume greater than 1 / (1 + 0.1338 · (1/λ − 1/μ)) times the optimum.
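The two central notions above can be illustrated concretely: the hypervolume indicator measures the objective-space volume dominated by a population relative to a reference point, and a locally optimal (μ+λ)-archiving algorithm keeps, among the μ parents and λ offspring, the μ-subset of maximum hypervolume. Below is a minimal Python sketch for the bi-objective minimization case; the function names `hypervolume_2d` and `locally_optimal_select` are our own illustrative choices, not from the paper, and the subset search is brute force (exponential in μ+λ), serving only to make the definition explicit.

```python
from itertools import combinations

def hypervolume_2d(points, ref):
    """Area dominated by `points` w.r.t. reference point `ref`,
    assuming minimization in both objectives."""
    # Keep only points strictly better than the reference point in both objectives.
    pts = sorted({(x, y) for x, y in points if x < ref[0] and y < ref[1]})
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:               # sweep in order of increasing x
        if y < prev_y:             # point is not dominated by an earlier one
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

def locally_optimal_select(parents, offspring, mu, ref):
    """One step of a locally optimal (mu+lambda)-archiving algorithm:
    among parents and offspring together, keep the mu-subset with
    maximum hypervolume (brute-force enumeration of all subsets)."""
    pool = list(parents) + list(offspring)
    return max(combinations(pool, mu),
               key=lambda subset: hypervolume_2d(subset, ref))

# Example: three mutually nondominated points and reference point (4, 4).
print(hypervolume_2d([(1, 3), (2, 2), (3, 1)], (4, 4)))  # → 6.0

# A (2+1) step where the single offspring (1, 1) dominates both parents:
# any selected pair containing (1, 1) attains the maximum hypervolume 9.0.
selected = locally_optimal_select([(2, 3), (3, 2)], [(1, 1)], 2, (4, 4))
print(selected)
```

Note that such a locally optimal step never decreases the hypervolume, which is exactly the monotonicity property the paper's effectiveness results are about.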
REFERENCES
- N. Beume, B. Naujoks, and M. T. M. Emmerich. SMS-EMOA: Multiobjective selection based on dominated hypervolume. European Journal of Operational Research, 181:1653--1669, 2007.
- N. Beume, B. Naujoks, M. Preuss, G. Rudolph, and T. Wagner. Effects of 1-greedy S-metric-selection on innumerably large Pareto fronts. In Proc. 5th International Conference on Evolutionary Multi-Criterion Optimization (EMO '09), Vol. 5467 of LNCS, pp. 21--35. Springer, 2009.
- K. Bringmann and T. Friedrich. Approximating the volume of unions and intersections of high-dimensional geometric objects. Computational Geometry: Theory and Applications, 43:601--610, 2010.
- K. Bringmann and T. Friedrich. Approximating the least hypervolume contributor: NP-hard in general, but fast in practice. In Proc. 5th International Conference on Evolutionary Multi-Criterion Optimization (EMO '09), pp. 6--20. Springer, 2009.
- D. W. Corne and J. D. Knowles. Some multiobjective optimizers are better than others. In Proc. Congress on Evolutionary Computation (CEC '03), pp. 2506--2512. IEEE Press, 2003.
- T. Hanne. On the convergence of multiobjective evolutionary algorithms. European Journal of Operational Research, 117:553--564, 1999.
- C. Igel, N. Hansen, and S. Roth. Covariance matrix adaptation for multi-objective optimization. Evolutionary Computation, 15:1--28, 2007.
- J. Knowles and D. Corne. Bounded Pareto archiving: Theory and practice. In Metaheuristics for Multiobjective Optimisation, Vol. 535 of Lecture Notes in Economics and Mathematical Systems, pp. 39--64. Springer, 2004.
- J. D. Knowles and D. Corne. Properties of an adaptive archiving algorithm for storing nondominated vectors. IEEE Trans. Evolutionary Computation, 7:100--116, 2003.
- J. D. Knowles, D. W. Corne, and M. Fleischer. Bounded archiving using the Lebesgue measure. In Proc. IEEE Congress on Evolutionary Computation (CEC '03), Vol. 4, pp. 2490--2497. IEEE Press, 2003.
- M. Laumanns, L. Thiele, K. Deb, and E. Zitzler. Combining convergence and diversity in evolutionary multiobjective optimization. Evolutionary Computation, 10(3):263--282, 2002.
- M. López-Ibáñez, J. D. Knowles, and M. Laumanns. On sequential online archiving of objective vectors. In Proc. 6th International Conference on Evolutionary Multi-Criterion Optimization (EMO '11), Vol. 6576 of LNCS, pp. 46--60. Springer, 2011.
- G. Rudolph and A. Agapie. Convergence properties of some multi-objective evolutionary algorithms. In Proc. Congress on Evolutionary Computation (CEC '00), pp. 1010--1016. IEEE Press, 2000.
- O. Schütze, M. Laumanns, C. A. C. Coello, M. Dellnitz, and E.-G. Talbi. Convergence of stochastic search algorithms to finite size Pareto set approximations. Journal of Global Optimization, 41:559--577, 2008.
- T. Suttorp, N. Hansen, and C. Igel. Efficient covariance matrix update for variable metric evolution strategies. Machine Learning, 75:167--197, 2009.
- E. Zitzler and S. Künzli. Indicator-based selection in multiobjective search. In Proc. 8th International Conference on Parallel Problem Solving from Nature (PPSN VIII), Vol. 3242 of LNCS, pp. 832--842. Springer, 2004.
- E. Zitzler and L. Thiele. Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Trans. Evolutionary Computation, 3:257--271, 1999.
- E. Zitzler, L. Thiele, and J. Bader. On set-based multiobjective optimization. IEEE Trans. Evolutionary Computation, 14:58--79, 2010.
Index Terms
- Convergence of hypervolume-based archiving algorithms I: effectiveness