Let's look at how these solutions are found by studying the coefficients c1 and c2 (also called acceleration coefficients).

Bedtime story: in wildlife, there are different bird species. Each species has an overall tendency to follow its instinct (personal) and a tendency to focus on the group experience (social).

PSO translation: the c1 hyperparameter defines how strongly the group is influenced by the best personal solutions found over the iterations, while the c2 hyperparameter defines how strongly the group is influenced by the best global solution found over the iterations.

You can also find many variants of the PSO algorithm. Generally, they are motivated by two main reasons. First, PSO is close to an evolutionary algorithm, so there are hybrid versions that add evolutionary capabilities. Second, there are adaptive variants that improve performance by adjusting the hyperparameters.

- Hybrid of Genetic Algorithm and PSO (GA-PSO): employs a major aspect of the GA approach, the capability of breeding.
- Hybrid of Evolutionary Programming and PSO (EPSO): uses tournament selection to move the worst particles to new positions.
- Adaptive PSO (APSO): adds a random component to the inertia weight, applies fuzzy logic, or uses a secondary PSO to find the optimal parameters of a primary PSO.
- Multi-Objective PSO (MOPSO): uses the concept of Pareto dominance to determine the flight direction of a particle, and maintains previously found best vectors in a global repository that other particles later use to guide their own flight.
- Discrete PSO (DPSO): uses several approaches to improve performance, for instance the mixed search approach, in which the particles are divided into subgroups to dynamically adapt local and global solutions.

Reference: Dunis et al., "Forecasting foreign exchange rates with adaptive neural networks using radial-basis functions and Particle Swarm Optimization," European Journal of Operational Research.
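To make the roles of c1 and c2 concrete, here is a hedged sketch of the classic PSO velocity update (the function name, default values, and variable names are my own choices for illustration, not code from any of the papers above):

```python
import numpy as np

def velocity_update(v, x, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=np.random):
    """One velocity update for a single particle (illustrative defaults).

    w  * v                -> inertia: keep part of the previous velocity
    c1 * r1 * (pbest - x) -> cognitive term: pull toward the particle's own best
    c2 * r2 * (gbest - x) -> social term: pull toward the swarm's best
    """
    r1 = rng.uniform(size=np.shape(x))  # stochastic weight of the cognitive term
    r2 = rng.uniform(size=np.shape(x))  # stochastic weight of the social term
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

# Demo: a particle at the origin, with both bests at (1, 1),
# is accelerated toward (1, 1) by the cognitive and social terms.
v_new = velocity_update(v=np.zeros(2), x=np.zeros(2),
                        pbest=np.ones(2), gbest=np.ones(2))
```

With c1 > c2 the particles trust their own experience more; with c2 > c1 they follow the group more closely.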
Throughout this article, I will detail the mechanisms behind the Particle Swarm Optimization algorithm, using a group of birds as a metaphor. As you might have noticed, I have not yet talked about the inertia, cognitive and social coefficients. These coefficients control the levels of exploration and exploitation. Exploitation is the ability of particles to target the best solutions found so far. Exploration, on the other hand, is the ability of particles to evaluate the entire search space. The challenge of the remaining part of the article will be to determine the impact of these coefficients and to find a good balance between exploration and exploitation. (Note that this assertion of a balance between exploration and exploitation does not make much sense unless both are defined in a measurable way, and such a balance is neither necessary nor sufficient from an efficiency point of view. But for the sake of understanding, I will use these terms in this article.)

Bedtime story: each day, our emotionally driven birds can more or less get up on the wrong side of the bed. They will then more or less want to follow their intuition and follow the group.

PSO translation: at each iteration, the acceleration is weighted by random terms. The cognitive acceleration and the social acceleration are stochastically adjusted by the weights r1 and r2.

To better appreciate the influence of the coefficient w (also called the inertia weight), I invite you to visualize the three swarms of particles above. We can see that the lower the coefficient w, the stronger the convergence. In other words, a low coefficient w facilitates the exploitation of the best solutions found so far, while a high coefficient w facilitates exploration around these solutions. The inertia weight w thus balances the exploration and the exploitation of the best solutions found so far. Note that it is recommended to avoid w > 1, which can lead to a divergence of our particles.
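To see why w > 1 is discouraged, here is a deliberately stripped-down toy (my own simplification, not code from the article): it ignores the cognitive and social terms entirely and iterates only the inertia part of the update, v ← w·v.

```python
def inertia_only_speeds(w, v0=1.0, steps=50):
    """Track |v| when only the inertia term acts: v <- w * v at each step."""
    v, speeds = v0, []
    for _ in range(steps):
        v = w * v
        speeds.append(abs(v))
    return speeds

low = inertia_only_speeds(0.7)   # speeds shrink geometrically: motion settles down
high = inertia_only_speeds(1.1)  # speeds grow without bound: divergence
```

In the full algorithm the cognitive and social terms pull the particles back toward good solutions, but the same geometric growth of the inertia term is what makes w > 1 dangerous.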
Particle swarm optimization (PSO) has been successfully applied in many research and application areas. For my part, I really enjoyed the application of this algorithm in the article by G. Sermpinis on foreign exchange rate forecasting, where it is demonstrated that PSO can reach better results in a faster, cheaper way than other methods.

Moreover, PSO does not use the gradient of the problem being optimized. In other words, unlike traditional optimization methods, PSO does not require the problem to be differentiable.

Last but not least, there are very few hyperparameters. These parameters are very simple to understand and do not require advanced notions. For the same hyperparameters, PSO will work on a very wide variety of tasks, which makes it a very powerful and flexible algorithm.
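As a concrete illustration of these points, here is a minimal, self-contained PSO sketch (a toy under my own parameter choices; the objective, defaults, and names are not from the article). It minimizes f(x) = |x − 3|, which is not differentiable at its optimum, yet PSO only ever evaluates f:

```python
import numpy as np

def pso(f, n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal 1-D PSO sketch: returns the best position found."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-10, 10, n_particles)   # positions
    v = np.zeros(n_particles)               # velocities
    pbest = x.copy()                        # personal best positions
    pbest_val = f(x)                        # personal best values
    gbest = pbest[np.argmin(pbest_val)]     # global best position
    for _ in range(n_iter):
        r1 = rng.uniform(size=n_particles)
        r2 = rng.uniform(size=n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = f(x)
        better = vals < pbest_val           # update improved personal bests
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest

# f is non-differentiable at x = 3; no gradient is ever computed.
best = pso(lambda x: np.abs(x - 3))
```

Note that only three hyperparameters (w, c1, c2) plus the swarm size and iteration budget are involved, and none of them requires knowledge of the objective's structure.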