Metaheuristic Optimization Algorithms

Writer: 9tailwolf (Eungyeol Lee), doryeon514@gm.gist.ac.kr

Introduction

Metaheuristic optimization algorithms are a family of algorithms designed to solve complex optimization problems efficiently. Unlike traditional optimization techniques, they are useful when the structure of the problem is not well understood or when evaluating it is computationally expensive. They are especially valuable when the form or constraints of the objective function are so complex that the problem is difficult or impossible to solve exactly.
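To make the black-box setting concrete, here is a minimal sketch of the simplest possible metaheuristic, pure random search, which needs nothing from the objective function except the ability to evaluate it. The function name `random_search` and its parameters are illustrative, not part of this repository.

```python
import random

def random_search(objective, bounds, n_iters=10_000, seed=0):
    """Minimize a black-box objective by uniform random sampling.

    bounds: list of (low, high) tuples, one per dimension.
    Returns (best_x, best_value). Illustrative sketch only.
    """
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(n_iters):
        # Sample a candidate uniformly inside the box constraints.
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        val = objective(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Example: minimize the Sphere function on [-100, 100]^2.
sphere = lambda x: sum(xi * xi for xi in x)
x_best, f_best = random_search(sphere, [(-100, 100)] * 2)
```

Every algorithm in the leaderboard below refines this idea by choosing the next candidate more intelligently than uniform sampling.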


Leaderboard

For details on how the algorithms are scored, see the How to Evaluate Algorithms section.


Contents

Topic | Paper | Code | Score
Simulated Annealing | Paper | Code | -
Tabu Search | - | - | -
Genetic Algorithm | - | - | -
Evolutionary Algorithm | - | - | -
Cultural Algorithm | - | - | -
Particle Swarm Optimization | - | - | -
Differential Evolution | - | - | -
Local Search | - | - | -
Variable Neighborhood Search | - | - | -
Harmony Search | - | - | -
Memetic Algorithm | - | - | -
Ant Colony Optimization | - | - | -
Glowworm Swarm Optimization | - | - | -
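As a taste of the first entry in the table, below is a minimal simulated annealing sketch for continuous minimization. It is a generic textbook version, not the repository's implementation; the function name and default parameters (`step`, `t0`, `cooling`) are assumptions chosen for illustration.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.995,
                        n_iters=5000, seed=0):
    """Minimize `objective` with a basic simulated annealing loop.

    An improving neighbour is always accepted; a worse one is accepted
    with probability exp(-delta / T), and the temperature T decays
    geometrically each iteration. Illustrative sketch only.
    """
    rng = random.Random(seed)
    x, fx, t = list(x0), objective(x0), t0
    best_x, best_f = list(x), fx
    for _ in range(n_iters):
        # Propose a Gaussian-perturbed neighbour of the current point.
        cand = [xi + rng.gauss(0, step) for xi in x]
        fc = objective(cand)
        delta = fc - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= cooling
    return best_x, best_f

# Toy run: minimize a shifted quadratic with minimum at (3, -1).
f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2
xb, fb = simulated_annealing(f, [0.0, 0.0])
```

The acceptance of occasional uphill moves at high temperature is what lets the method escape local minima, unlike plain local search.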


How to Evaluate Algorithms

There are various ways to evaluate algorithms. A metaheuristic algorithm can be evaluated by applying it to a range of benchmark functions and comparing the results; the average, standard deviation, and best value of the solutions found serve as evaluation criteria. Below are the standard benchmark functions used for the leaderboard above.

  • Alpine1 ($x \in R^{50}$, $-10 \leq x \leq 10$)
  • BartelsConn ($x \in R^{2}$, $-500 \leq x \leq 500$)
  • Bird ($x \in R^{2}$, $-2\pi \leq x \leq 2\pi$)
  • EggCrate ($x \in R^{2}$, $-5 \leq x \leq 5$)
  • Qing ($x \in R^{50}$, $-500 \leq x \leq 500$)
  • Rastrigin ($x \in R^{50}$, $-5.12 \leq x \leq 5.12$)
  • Schwefel_1_20 ($x \in R^{50}$, $-100 \leq x \leq 100$)
  • Schwefel_2_20 ($x \in R^{50}$, $-100 \leq x \leq 100$)
  • Sphere ($x \in R^{50}$, $-100 \leq x \leq 100$)
  • XinSheYang2 ($x \in R^{50}$, $-2\pi \leq x \leq 2\pi$)
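Two of the functions above have simple, widely known closed forms, shown here for reference; both attain their global minimum of 0 at the origin. These are the standard textbook definitions, which I assume match the repository's benchmark code.

```python
import math

def sphere(x):
    """Sphere: f(x) = sum(x_i^2); global minimum 0 at x = 0."""
    return sum(xi * xi for xi in x)

def rastrigin(x):
    """Rastrigin: f(x) = 10 n + sum(x_i^2 - 10 cos(2 pi x_i));
    highly multimodal, global minimum 0 at x = 0."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)
```

Sphere is unimodal and easy; Rastrigin's dense grid of local minima is what makes it a stress test for escaping local optima.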

This evaluation measures two factors: how well the algorithm optimizes on average, and how good its best solution is. Each of the 10 functions above is run 5000 times, and each result is scored by the squared difference from the global optimal value. Scores are then normalized from $[0, 10000]$ to $[0, 1]$.
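The per-run scoring rule can be sketched as follows. The function name `score` and the clipping of errors above the cap are assumptions; the cap of 10000 comes from the stated normalization $[0, 10000] \rightarrow [0, 1]$.

```python
def score(value, optimum, cap=10_000.0):
    """Per-run score: squared error from the global optimum, mapped to [0, 1].

    Errors above `cap` are clipped before normalizing; the clipping
    behaviour is an assumption consistent with [0, 10000] -> [0, 1].
    """
    err = (value - optimum) ** 2
    return min(err, cap) / cap
```

Under this rule a run that hits the global optimum scores 0, and any run whose squared error reaches 10000 or more scores 1.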

The evaluation items for the scores are as follows and it applies equally to all functions.

  • Average score (50%)
  • Best score (50%)

The lower the score, the better the algorithm performed on the benchmark, so the final benchmark value is computed with the formula below (higher benchmark values are better).

\[benchmark = \sum_{f} benchmark_{f}\] \[= \sum_{f} \left(10 - 5\,Avg(score)_{f} - 5\,Best(score)_{f}\right)\] \[= 100 - \sum^{10}_{i=1}\left(5 \times \frac{1}{5000}\sum^{5000}_{j=1}S_{ij} + 5 \times \min_{j} S_{ij}\right)\]

where, $S \in [f \times 5000], 0 \leq S_{ij} \leq 1, \forall (i,j)$
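The aggregation above translates directly into code. This is a sketch of the formula as stated, taking the score matrix as a list of rows; the function name `benchmark` is illustrative.

```python
def benchmark(S):
    """Final benchmark from a score matrix S (10 functions x 5000 runs).

    Per function f: benchmark_f = 10 - 5 * avg(S_f) - 5 * min(S_f).
    The total sums over all 10 functions, so a perfect score is 100.
    """
    total = 0.0
    for row in S:
        avg = sum(row) / len(row)
        total += 10 - 5 * avg - 5 * min(row)
    return total

# All-zero scores (every run hits the global optimum) give the maximum, 100.
S_perfect = [[0.0] * 5000 for _ in range(10)]
```

With every score at its worst value of 1, the benchmark bottoms out at 0, so the final number always lies in $[0, 100]$.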

Reference

  • 2005. J. Dréo et al. Metaheuristics for Hard Optimization (link).
  • 2008. Christian Blum, Maria Jose Blesa Aguilera, Andrea Roli, Michael Sampels. Hybrid Metaheuristics: An Emerging Approach to Optimization (link).
  • 2009. El-Ghazali Talbi. Metaheuristics: from design to implementation (link).
  • 2011. Sean Luke. Essentials of Metaheuristics (link).
  • 2014. Xin-She Yang. Nature-Inspired Optimization Algorithms (link).
  • 2016. Ke-Lin Du and M. N. S. Swamy. Search and Optimization by Metaheuristics: Techniques and Algorithms Inspired by Nature (link).
  • 2018. Bastien Chopard, Marco Tomassini. An Introduction to Metaheuristics for Optimization (link).
  • 2018. Teofilo F. Gonzalez. Handbook of Approximation Algorithms and Metaheuristics, Second Edition: Methodologies and Traditional Applications, Volume 1 (link).
  • 2018. Teofilo F. Gonzalez. Handbook of Approximation Algorithms and Metaheuristics, Second Edition: Methodologies and Traditional Applications, Volume 2 (link).
  • 2019. Michel Gendreau and Jean-Yves Potvin. Handbook of Metaheuristics. Springer International Publishing (link).