
3.1. PSO

Particle swarm optimization (PSO) [14], introduced by Kennedy (a social psychologist) and Eberhart (an electrical engineer) in 1995 as an optimization method, is inspired by observations of the behavior of flocking birds and schooling fish. Owing to its simplicity and low computational load, PSO has been widely applied in many research areas, such as clustering and classification, communication networks, and scheduling [15, 17–19]. When foraging, birds flock together and arrange themselves in specific shapes or formations by sharing information about food sources. The movement of each particle is influenced by its own experience and that of its peers. In the process of optimization, each particle s of the flock S is associated with a position, a velocity, and a fitness value.

A position, which is a vector in the search space, represents a potential solution to the optimization problem; a velocity, also a vector, represents a change in the position; a fitness value, computed by the objective function, indicates how well the particle solves the problem. To find an approximate solution, each particle s determines its movement iteratively by learning from its own experience and by communicating with its neighbors. The mechanism of coordination is encapsulated by the velocity control over all particles at each iteration t of the algorithm. For each particle s, the velocity at iteration t + 1, $V_s^{t+1}$, is updated with (10), where $P_s^t$ denotes the solution found by (the position of) particle s at iteration t, $\bar{P}_s^t$ denotes the best solution found by particle s up to iteration t, and $\hat{P}_s^t$ denotes the best solution found by the neighbors of particle s.

The cognition learning rate ($c_1$) and the social learning rate ($c_2$) are introduced to control the influence of a particle's own experience and its neighbors' experience, respectively, and $r_1$ and $r_2$ are random numbers drawn from the interval [0, 1]. At the next iteration t + 1, the position of each particle is updated by (11). One has

$$V_s^{t+1} = V_s^t + c_1 r_1 \left(\bar{P}_s^t - P_s^t\right) + c_2 r_2 \left(\hat{P}_s^t - P_s^t\right), \tag{10}$$
$$P_s^{t+1} = P_s^t + V_s^{t+1}. \tag{11}$$

For discrete optimization problems, Kennedy and Eberhart [20] also introduce a binary particle swarm optimization that changes the concept of velocity from an adjustment of the position to the probability that a bit of a solution becomes one or zero. The velocity of each particle s at iteration t + 1, $V_s^{t+1}$, is squashed by the sigmoid function shown in (12), and the position-updating rule is replaced by (13), where rand() is a random number drawn from the interval [0, 1]. One has

$$S\!\left(V_s^{t+1}\right) = \frac{1}{1 + e^{-V_s^{t+1}}}, \tag{12}$$
$$P_s^{t+1} = \begin{cases} 1 & \text{if } \mathrm{rand}() < S\!\left(V_s^{t+1}\right), \\ 0 & \text{otherwise}. \end{cases} \tag{13}$$
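As an illustration of how updates (10)-(13) fit together, the following minimal NumPy sketch implements one continuous PSO loop and one binary PSO step. The function names, the choice of a global-best neighborhood (the whole swarm acts as each particle's neighbors), the sphere objective, and the parameter values ($c_1 = c_2 = 2$) are illustrative assumptions, not prescribed by the text.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=100, c1=2.0, c2=2.0,
        bounds=(-5.0, 5.0), seed=0):
    """Minimal continuous PSO following (10)-(11), with a global-best
    neighborhood standing in for \hat{P}_s^t (an assumption of this sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    P = rng.uniform(lo, hi, size=(n_particles, dim))   # positions P_s^t
    V = np.zeros((n_particles, dim))                   # velocities V_s^t
    pbest = P.copy()                                   # best position found by each particle
    pbest_fit = np.array([objective(p) for p in P])
    gbest = pbest[np.argmin(pbest_fit)].copy()         # best position among all particles

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Eq. (10): velocity update from personal and neighborhood best
        V = V + c1 * r1 * (pbest - P) + c2 * r2 * (gbest - P)
        # Eq. (11): position update
        P = P + V
        fit = np.array([objective(p) for p in P])
        improved = fit < pbest_fit
        pbest[improved] = P[improved]
        pbest_fit[improved] = fit[improved]
        gbest = pbest[np.argmin(pbest_fit)].copy()
    return gbest, pbest_fit.min()

def binary_pso_step(V, P, pbest, gbest, c1=2.0, c2=2.0, rng=None):
    """One iteration of binary PSO following (12)-(13): the sigmoid of the
    velocity gives the probability that each bit of the position becomes 1."""
    if rng is None:
        rng = np.random.default_rng()
    r1, r2 = rng.random(P.shape), rng.random(P.shape)
    V = V + c1 * r1 * (pbest - P) + c2 * r2 * (gbest - P)  # Eq. (10)
    prob = 1.0 / (1.0 + np.exp(-V))                        # Eq. (12): S(V_s^{t+1})
    P = (rng.random(P.shape) < prob).astype(int)           # Eq. (13)
    return V, P

# Example usage (assumed objective): minimize the sphere function in 5 dimensions.
best, best_fit = pso(lambda x: float(np.sum(x ** 2)), dim=5)
```

In the binary variant only the position update changes; the velocity update (10) is kept, and other neighborhood topologies would only change which stored best is used in place of gbest.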
