DOI: 10.1145/1276958.1277203

A study on metamodeling techniques, ensembles, and multi-surrogates in evolutionary computation

Published: 07 July 2007

ABSTRACT

The Surrogate-Assisted Memetic Algorithm (SAMA) is a hybrid evolutionary algorithm, specifically a memetic algorithm that employs surrogate models during the optimization search. Since most of the objective function evaluations in SAMA are approximated, its search performance is likely to be affected by the characteristics of the models used. In this paper, we study the search performance of SAMA under different metamodeling techniques, ensembles, and multi-surrogates. In particular, we consider SAMA-TRF, a SAMA model management framework that incorporates a trust region scheme for interleaving the exact objective function with computationally cheap local metamodels during local searches. Four metamodels are used in the study: Gaussian Process (GP), Radial Basis Function (RBF), Polynomial Regression (PR), and the Extreme Learning Machine (ELM) neural network. Empirical results show that while some metamodeling techniques perform best on particular benchmark problems, ensembles of metamodels and multi-surrogates generally yield robust, improved solution quality across the benchmark problems for the same computational budget.
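To make the model management idea concrete, the sketch below illustrates one possible trust-region local-search step of the kind described in the abstract: a local RBF metamodel is fitted to previously evaluated points, minimized within the trust region, and the trust-region radius is updated according to how well the metamodel's predicted improvement agrees with the exact objective. This is an illustrative sketch only, not the authors' implementation; the names f_exact, archive_X, archive_y, radius, and trust_region_step are hypothetical, and scipy's RBFInterpolator stands in for whichever RBF formulation the paper actually uses.

import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def trust_region_step(f_exact, x_center, archive_X, archive_y, radius):
    # Fit a local RBF metamodel on the k nearest previously evaluated points
    # (archive_X, archive_y stand in for the database of exact evaluations).
    k = min(len(archive_y), 5 * len(x_center))
    idx = np.argsort(np.linalg.norm(archive_X - x_center, axis=1))[:k]
    surrogate = RBFInterpolator(archive_X[idx], archive_y[idx])

    # Minimize the cheap metamodel inside a box approximation of the trust region.
    bounds = [(c - radius, c + radius) for c in x_center]
    result = minimize(lambda x: float(surrogate(x[None, :])[0]),
                      x_center, bounds=bounds)

    # Spend exact (expensive) evaluations only at the center and the surrogate
    # optimum; in practice the center value would already be cached in the archive.
    f_center = f_exact(x_center)
    f_candidate = f_exact(result.x)
    predicted_drop = float(surrogate(x_center[None, :])[0]) - result.fun
    actual_drop = f_center - f_candidate
    rho = actual_drop / predicted_drop if abs(predicted_drop) > 1e-12 else 0.0

    # Classical trust-region update: expand the region when the metamodel and
    # the exact objective agree, shrink it otherwise.
    if rho > 0.75:
        radius *= 2.0
    elif rho < 0.25:
        radius *= 0.5

    x_next = result.x if actual_drop > 0 else x_center
    return x_next, radius

In a SAMA-style search, such a step would sit inside each individual's local refinement while the global evolutionary operators continue to act on exactly evaluated individuals; an ensemble or multi-surrogate variant could, for example, replace the single RBF model here with an aggregated prediction over several metamodels.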


Published in

GECCO '07: Proceedings of the 9th annual conference on Genetic and evolutionary computation
July 2007, 2313 pages
ISBN: 9781595936974
DOI: 10.1145/1276958
Copyright © 2007 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

GECCO '07 paper acceptance rate: 266 of 577 submissions, 46%
Overall acceptance rate: 1,669 of 4,410 submissions, 38%
