ABSTRACT
In standard particle swarm optimization (PSO), a particle's new position is computed from two main informants: the best position the particle itself has found so far and the best performer among its neighbors. In fully informed PSO, each particle is instead influenced by all the remaining particles in the swarm, or by a set of neighbors arranged in static topologies (ring, square, or clusters). In this paper, we generalize and analyze the number of informants that take part in computing new particle positions. Our aim is to discover whether a quasi-optimal number of informants exists for a given problem. The experimental results seem to suggest that 6 to 8 informants could provide our PSO with higher chances of success in continuous optimization on well-known benchmarks.
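The informant-based velocity update described above can be sketched as follows. This is a minimal illustration in the spirit of the fully informed PSO (FIPS) of Mendes et al., not the paper's exact implementation: the function name is hypothetical, and the constriction coefficient chi ≈ 0.7298 with total acceleration phi = 4.1 are standard values from the PSO literature, assumed here for concreteness. Each of the k informants contributes an equally weighted, randomly scaled attraction toward its personal best.

```python
import random

def fips_velocity_update(position, velocity, informant_bests,
                         phi_total=4.1, chi=0.7298):
    """One fully-informed velocity update for a single particle.

    position        -- current position of the particle (list of floats)
    velocity        -- current velocity of the particle (list of floats)
    informant_bests -- personal-best positions of the k informants
    phi_total       -- total acceleration, split equally among informants
    chi             -- constriction coefficient (Clerc and Kennedy)
    """
    k = len(informant_bests)
    new_velocity = []
    for d in range(len(position)):
        social = 0.0
        for best in informant_bests:
            # each informant gets an equal share phi_total / k of the pull
            social += random.uniform(0.0, phi_total / k) * (best[d] - position[d])
        new_velocity.append(chi * (velocity[d] + social))
    return new_velocity

# The position update is then simply x[d] += v[d] for each dimension d.
```

Varying the length of `informant_bests` (e.g. from 2 up to the full swarm size) is exactly the knob the paper studies: with k = 1 the particle follows a single neighbor, while large k approaches the fully informed swarm.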
Index Terms
- Empirical computation of the quasi-optimal number of informants in particle swarm optimization