ABSTRACT
The Surrogate-Assisted Memetic Algorithm (SAMA) is a hybrid evolutionary algorithm, specifically a memetic algorithm that employs surrogate models in the optimization search. Since most of the objective function evaluations in SAMA are approximated, its search performance is likely to be affected by the characteristics of the surrogate models used. In this paper, we study the search performance of different metamodeling techniques, ensembles of metamodels, and multi-surrogates in SAMA. In particular, we consider SAMA-TRF, a SAMA model management framework that incorporates a trust-region scheme for interleaving the exact objective function with computationally cheap local metamodels during local searches. Four metamodels are used in the study: Gaussian Process (GP), Radial Basis Function (RBF), Polynomial Regression (PR), and the Extreme Learning Machine (ELM) neural network. Empirical results show that while individual metamodeling techniques perform best on particular benchmark problems, ensembles of metamodels and multi-surrogates in general yield robust and improved solution quality on the benchmark problems for the same computational budget.
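The trust-region model management idea described above can be illustrated with a minimal sketch. This is not the paper's SAMA-TRF implementation; it is a hypothetical, self-contained example assuming a Gaussian RBF surrogate, a toy 2-D sphere objective standing in for the expensive function, and a random-candidate surrogate minimization step. The trust-region radius is adapted from the ratio of actual to surrogate-predicted improvement, as in standard trust-region frameworks.

```python
import numpy as np

def sphere(x):
    """Stand-in for an expensive objective: the 2-D sphere function."""
    return float(np.sum(x ** 2))

def fit_rbf(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant s(x) = sum_i w_i * exp(-eps * ||x - x_i||^2)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-eps * d2)
    # Small ridge term guards against ill-conditioning of the kernel matrix.
    w = np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)
    return lambda x: float(np.exp(-eps * ((X - x) ** 2).sum(-1)) @ w)

def trust_region_search(f, x0, radius=1.0, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        # Evaluate the exact function at a few local samples and fit a surrogate.
        X = x + rng.uniform(-radius, radius, size=(10, x.size))
        y = np.array([f(p) for p in X])
        s = fit_rbf(X, y)
        # Minimize the cheap surrogate inside the trust region (random candidates).
        C = x + rng.uniform(-radius, radius, size=(200, x.size))
        xc = C[np.argmin([s(c) for c in C])]
        fc, sc, sx = f(xc), s(xc), s(x)
        # Ratio of actual to predicted improvement controls the radius update.
        rho = (fx - fc) / (sx - sc) if sx > sc else -1.0
        if fc < fx:
            x, fx = xc, fc
        radius = radius * 2.0 if rho > 0.75 else radius * 0.5 if rho < 0.25 else radius
    return x, fx

best_x, best_f = trust_region_search(sphere, np.array([3.0, -2.0]))
```

Because a candidate is accepted only when the exact objective confirms an improvement, the incumbent never worsens even when the local surrogate is inaccurate; this is the essential safeguard that trust-region model management provides.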