Literature DB >> 35365749

Hybrid leader based optimization: a new stochastic optimization algorithm for solving optimization applications.

Mohammad Dehghani1, Pavel Trojovský2.   

Abstract

In this paper, a new optimization algorithm called hybrid leader-based optimization (HLBO) is introduced that is applicable in optimization challenges. The main idea of HLBO is to guide the algorithm population under the guidance of a hybrid leader. The stages of HLBO are modeled mathematically in two phases of exploration and exploitation. The efficiency of HLBO in optimization is tested by finding solutions to twenty-three standard benchmark functions of different types of unimodal and multimodal. The optimization results of unimodal functions indicate the high exploitation ability of HLBO in local search for better convergence to global optimal, while the optimization results of multimodal functions show the high exploration ability of HLBO in global search to accurately scan different areas of search space. In addition, the performance of HLBO on solving IEEE CEC 2017 benchmark functions including thirty objective functions is evaluated. The optimization results show the efficiency of HLBO in handling complex objective functions. The quality of the results obtained from HLBO is compared with the results of ten well-known algorithms. The simulation results show the superiority of HLBO in convergence to the global solution as well as the passage of optimally localized areas of the search space compared to ten competing algorithms. In addition, the implementation of HLBO on four engineering design issues demonstrates the applicability of HLBO in real-world problem solving.
© 2022. The Author(s).

Entities:  

Year:  2022        PMID: 35365749      PMCID: PMC8976018          DOI: 10.1038/s41598-022-09514-0

Source DB:  PubMed          Journal:  Sci Rep        ISSN: 2045-2322            Impact factor:   4.996


Introduction

Advances in science and technology have led to the emergence of new optimization challenges as well as the complexity of optimization problems. These cases indicate the need and importance of optimization with efficient tools to achieve optimal solutions. An optimization problem is identified and modeled with three main parts: decision variables, constraints, and objective function[1]. The goal in optimization is to achieve the best solution with respect to the constraints of the problem, among all solutions defined for an optimization problem[2]. Problem solving techniques in optimization applications fall into two groups of deterministic methods and stochastic methods. Deterministic methods using derivative information have acceptable performance in linear and convex spaces. However, these methods are incapable of dealing with high dimension and constraint problems, complex objective functions, nonlinear and non-convex spaces. Stochastic methods, by employing random operators and random scanning of the search space away from the difficulties of deterministic methods, have the ability to provide acceptable solutions to optimization problems. Simplicity in understanding, ease of implementation, no need for derivative information, the ability to cross local optimal areas, applicability in nonlinear, and non-convex spaces are some of the advantages that have led to the popularity and pervasiveness of random methods. Optimization algorithms are one of the most popular techniques in the stochastic approach to optimizing the problem[3]. How to achieve the solution in optimization algorithms begins with generating a certain number of candidate solutions (equal to the population of the algorithm). Evaluating the objective function of the problem based on these candidate solutions determines the quality of each solution. Using this information and the algorithm steps, these candidate solutions are improved in an iterative process. Once the algorithm is fully implemented, the best candidate solution that provides a better value for the objective function compared to other candidate solutions is identified. Given the fact that every optimization problem has a basic solution called global optimal, the point made in optimization studies is that optimization algorithms do not guarantee that they can achieve exactly the global optimal solution. Therefore, quasi-optimal is the name given to the solutions obtained from the optimization algorithms[4]. Efforts to reduce the differences between quasi-optimal solutions and global optimal solutions to find better solutions have paved the way for the design and development of numerous optimization algorithms. Exploration and exploitation are capabilities that enable optimization algorithms to be efficient in finding solutions. Exploration is the ability to search globally in different areas of the search space while exploitation is the ability to search locally near the solutions obtained because there may be better solutions near those solutions. Balancing exploration and exploitation play a key role in the success of optimization algorithms in achieving optimal solutions[5]. The main research question in the study of optimization algorithms is whether there is still a need to introduce new optimization algorithms despite the fact that countless algorithms have been introduced so far. The No Free Lunch (NFL) theorem[6] answers this question. The concept of the NFL theorem explains that there is no guarantee that an algorithm with optimal performance in solving a set of objective functions and problems will be able to perform the same performance in all optimization applications. It is not possible to ensure that a particular algorithm is the best optimizer in all optimization topics. The NFL theorem encourages researchers to develop new algorithms to find better solutions to optimization problems. The NFL theorem has motivated researchers in this paper to develop a new optimization algorithm for optimization applications. Innovation of this study is in introducing and designing a new evolutionary algorithm called Hybrid Leader Optimization (HLBO). The main contributions of this paper are as follows:In this section and in the following section, the related works are presented. The Hybrid Leader Optimization (HLBO) algorithm is introduced and modeled in the section “Hybrid Leader-Based Optimization”. Simulation studies are included in the section “Simulation Studies and Results”. The discussion of HLBO results is provided in the section “Results and Discussion”. HLBO performance test on IEEE CEC 2017 is presented in “Evaluation of the Effectiveness of HLBO in Handling Complex IEEE CEC 2017 Objective Functions”. Conclusions and several research subjects are provided for further study in the last section. A new stochastic-based optimization algorithm is presented, whose fundamental idea is to guide the population algorithm based on a hybrid leader generated by three different members. The stages of HLBO are described in two phases of exploration and exploitation and are mathematically modeled. The efficiency of HLBO has been benchmarked by optimizing twenty-three objective functions of a variety of unimodal and multimodal types. To evaluate the capability of HLBO, its performance has been compared with ten well-known algorithms.

Related works

Optimization algorithms are stochastic techniques to solve optimization applications that are based on the concepts of stochastic mechanisms, e.g., concretely on random methods of trial and error, modeling of natural processes, animal behavior, physical sciences, biology sciences, rules of games and other evolutionary processes[7]. The main idea applied in the design categorizes the optimization algorithms into five groups: evolutionary-based, swarm-based, physics-based, game-based, and human-based optimization algorithms. Evolutionary-based algorithms have been developed using the concept of natural selection, the concepts of biological and genetic sciences, and random operators such as selection, crossover, and mutation. Genetic Algorithm (GA)[8] and Differential Evolutionary (DE)[9] are the most significant evolutionary algorithms whose main inspiration is modeling of the reproductive process. Simulation of the human immune system against diseases has paved the way for the design of an Artificial Immune System (AIS) algorithm[10]. Swarm-based algorithms are inspired by the behaviors and strategies of animals, insects, birds, and other swarming activities in nature. The most widely used and famous techniques of this group are Particle Swarm Optimization (PSO)[11], Ant Colony Optimization (ACO)[12], Artificial Bee Colony (ABC)[13], Firefly Algorithm (FA)[14]. The strategy of birds and fish in finding food sources using individual and collective information has been the basic inspiration in designing PSO. The ACO’s main idea has been the ability of ant colonies to find the shortest path between the nest and the food source, taking advantage of its pheromone properties and accumulation. Utilizing the collective intelligence and smart behavior of the bee colony to search and find food has been the fundamental inspiration in ABC design. The light emitted by fireflies can be used for a variety of reasons, such as attracting prey and hunting, attracting other members of the group (attracting the opposite sex), and as a communication strategy. This fascinating light of fireflies has been a remarkable and interesting phenomenon, the inspiration of which has led to the development of the FA. Searching strategies and behaviors of animals, birds, and insects to find food sources or prey hunting have been the main ideas in the design of various techniques such as Grey Wolf Optimization (GWO) algorithm[15], Pelican Optimization Algorithm (POA)[16], Marine Predator Algorithm (MPA)[17], Orca Predation Algorithm (OPA)[18], Whale Optimization Algorithm (WOA)[19], which numerous efforts to improve it have led to “enhanced WOA” versions[20,21], Reptile Search Algorithm (RSA)[22], and Tunicate Search Algorithm (TSA)[23]. Other swarm-based algorithms include Hunger Games Search (HGS)[24], slime mould algorithm (SMA)[25]], Farmland Fertility[26], African Vultures Optimization Algorithm (AVOA)[27], Artificial Gorilla Troops Optimizer (GTO)[28], Butterfly Optimization Algorithm[29], Symbiotic Organisms Search (SOS)[30], Tree Seed Algorithm (TSA)[31], and Spotted Hyena Optimizer (SHO)[32]. Physics-based algorithms have been developed on the base of using some physical processes and modeling of physical forces and laws. Simulated Annealing (SA) is the name of the most familiar physics-based algorithm based on simulation of the cooling of a molten metal in the refrigeration process[33]. The use of gravity force along with Newton’s laws of motion have been the basic principles employed in Gravitational Search Algorithm (GSA) design[34]. Flow regimes and classical fluid mechanics have been a fundamental inspiration in developing Flow Regime Algorithm (FRA)[35]. Mathematical modeling of the nuclear reaction process in two stages of nuclear fusion and nuclear fission is employed in the design of Nuclear Reaction Optimization (NRO)[36]. The application of three concepts in cosmology, including wormholes, black holes, and white holes, has been the basis of the Multi-Verse Optimizer (MVO) design[37]. Game-based algorithms are inspired by player behaviors, rules governing individual and group games. The strategy used by different players to put the puzzle pieces together and solve it has been the idea of designing the Puzzle Optimization Algorithm (POA)[38]. Simulation of the coaching process, holding competitions, and teams interacting with each other during a competitive season of volleyball has led to the design of the Volleyball Premier League (VPL) optimization method[39]. Mathematizing the competition between teams and groups playing a tug-of-war game and trying to win has been the main idea in the development of Tug of War Optimization (TWO) approach[40]. Human-based algorithms are developed based on the simulation of human activities and behaviors in performing various tasks. Among the approaches of this group can be mentioned Teaching-Learning-Based Optimization (TLBO) based on modeling the interactions of a teacher and learners in the classroom[41], Poor and Rich Optimization (PRO) based on the modeling of the efforts of the rich and poor groups to improve their economic situation[42], and Human Behavior-Based Optimization (HBBO) based on the modeling of human thoughts and behaviors[43]. Scientists’ research in optimization studies also includes improving existing algorithms[44-47], extending hybrid algorithms by combining different algorithms to increase their efficiency[48,49], and developing binary versions of optimization algorithms[50-53].

Hybrid leader-based optimization

In this section, the concepts of the proposed Hybrid Leader-Based Optimization (HLBO) approach are stated and the HLBO mathematical formulation is presented.

Inspiration and main idea of HLBO

In population-based algorithms, each member of the population is a searcher in the problem-solving space and therefore a candidate solution. Based on the algorithm steps and information transfer, the population members are able to improve their position to provide better solutions. The dependence of the algorithm population update process on specific members (such as the best member of the population and the worst member of the population) may prevent the algorithm from searching globally in the problem-solving space. These conditions can lead to the rapid convergence of the algorithm towards the local optimal solution and as a result, the algorithm fails to identify the main optimal area in the search space. Therefore, overreliance on the process of updating the algorithm population to certain members reduces the exploration ability within the algorithm. In the proposed HLBO method, a unique hybrid leader is employed to update and guide each member of the algorithm population in the search space. This hybrid leader is generated based on three different members including the best member, one random member, and the corresponding member.

Mathematical model of HLBO

The HLBO population is similar to other population-based algorithms that can be mathematically modeled using a matrix according to Eq. (1).where X is the HLBO population, is the ith candidate solution, is the value of jth variable determined by the ith candidate solution, N is the size of HLBO population, and m is the number of problem variables. The position of each member , , of the population X is initially initialized randomly by considering the constraints of the problem variables based on Eq. (2).where r is a random real number from the interval [0, 1], and are the lower and upper bound of the jth problem variables respectively. The objective function of the problem is evaluated based on each of the candidate solutions determined by the members of the population X, which is specified in Eq. (3) using a vector.where represents the vector of the objective functions and denotes the objective function value delivered from the ith candidate solution. The values obtained for the objective function are a measure of the quality of the candidate solutions. The member that provides the best value for the objective function is known as the best member and the member that provides the worst value for the objective function is known as the worst member . These values are updated in each algorithm iteration. What distinguishes optimization algorithms from each other is the process used to update the algorithm population. Two important and influential indicators in the performance of optimization algorithms that should be considered in the process of updating and changing the position in the search space are exploration (global search) and exploitation (local search).

Phase 1: Exploration (global search)

Exploration is a feature that enables members of the algorithm population to accurately scan different areas of the search space to be able to find the original optimal area. Excessive reliance on specific members of the population (such as the best member) in the process of updating the algorithm population position prevents the global search of the algorithm in the search space and reduces the algorithm’s ability to explore. This dependence in the update process can lead to early convergence of the algorithm to the local optimal and as a result the algorithm fails to identify the main optimal area in the search space. However, some population members, like the best member, have useful information that should not be overlooked. HLBO uses a hybrid leader to update members of the population. This hybrid leader is produced for each member of the population at each repetition. In constructing a random leader, three members of the population, including The participation coefficient of each of these three members in the production of the hybrid leader is based on the quality of that member in providing a better value for the objective function. The quality of each member of the population in presenting the candidate solution is calculated using Eq. (4).Then, using the results of Eq. (4), the participation coefficients for each member are calculated using Eq. (5).where , , is the quality of the ith candidate solution, is the value of the objective function of the worst candidate solution, , , are the participation coefficients of the ith member, the best member, and the kth member (k is an integer determined randomly from the set ), respectively, in producing the hybrid leader. the corresponding member (the same member to be led by this hybrid leader), the best member, a random member of the population is influential. After determining the participation coefficients, the hybrid leader is generated for each member of the population using Eq. (6).where is the hybrid leader for the ith member and is a randomly selected population member which the index k is the row number of this member in the population matrix. The new position for each member of the population in the search space under the guidance of the hybrid leader is calculated using Eq. (7). This new position is acceptable to the corresponding member if the value of the objective function is improved from the previous position, otherwise it remains in the previous position. These update conditions are modeled in Eq. (8).where is the new position of the ith member, is its jth dimension, is its objective function value based on the first phase of HLBO, r is a random real number from the interval [0, 1], I is an integer which is selected randomly from the set , and is the value of the objective function obtained from hybrid leader of the ith member.

Phase 2: Exploitation (local search)

Exploitation is an ability for members of the algorithm population that enables them to search locally for finding better solutions near the obtained solutions. Therefore, in HLBO a neighborhood around each member of the population is considered that allows that member to change position by searching locally in that area and finding a position with a better value for the objective function. This local search is modeled to improve and increase HLBO exploitation ability using Eq. (9). In this phase, the newly calculated position is also acceptable if it improves the value of the objective function, which is simulated in Eq. (10).where is the new position of the ith member, is its jth dimension, is its objective function value based on the second phase of HLBO, R is the constant equal to 0.2, t is the iteration counter, and T is the maximum number of iterations.

Repetition process, pseudo-code, and flowchart of HLBO

By implementing the first and second phases, all HLBO members are updated and an iteration of the algorithm is completed. The algorithm enters the next iteration and the HLBO population update process continues based on the exploration and exploitation phases according to Eqs. (4)–(10). This process continues until the end of the algorithm, and finally the best candidate solution experienced during the iterations is introduced as the solution to the problem. The HLBO pseudocode is presented in Algorithm 1 and its flowchart is presented in Fig. 1.
Figure 1

Convergence curves of the HLBO and competitor algorithms in optimizing objective functions to .

Convergence curves of the HLBO and competitor algorithms in optimizing objective functions to .

Computational complexity of HLBO

The HLBO initialization and preparation process has a computational complexity equal to , where N refers to the number of population members and m is the number of variables in the problem. In each iteration, for each member, a hybrid leader must be generated, resulting in the computational complexity of generating the hybrid leaders equal to , where T is the maximum number of iterations of the algorithm. The HLBO update process has two phases of exploration and exploitation, which in both phases the objective function is evaluated. As a result, the computational complexity of HLBO update process equals . Thus, the total computational complexity of HLBO is equal to .

Simulation studies and results

This section is devoted to simulation studies and evaluation of the proposed HLBO performance in optimization. HLBO has been implemented to provide optimal solutions of twenty-three standard benchmark functions of three main types (complete definitions, domains, and tables of suitable values of parameters of functions to can be found in the paper[54]) unimodal function (functions to ), high-dimensional multimodal functions (functions to ), and fixed-dimensional multimodal functions (functions to ). The optimization results obtained from HLBO are compared with the performance of ten well-known algorithms including PSO, MPA, HGS, SMA, GA, WOA, TLBO, TSA, GSA, and GWO. The HLBO and the ten mentioned algorithms in twenty independent implementations are employed in optimizing the benchmark functions while each iteration contains 1000 iterations. The optimization results are reported using four statistical indicators: mean, best, standard deviation, and median. Moreover, the rank of each algorithm in providing a better solution for each benchmark function as well as for each group of objective functions is specified. Table 1 lists the adjusted values of the control parameters of the ten competitor algorithms.
Table 1

Adjusted values of the control parameters of ten competitor algorithms.

AlgorithmParameterValue
HGSRanging controllerR is gradually reduced to 0
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$R=2\cdot shrink \cdot rand - shrink$$\end{document}R=2·shrink·rand-shrink
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$shrink=2(1-t/T)$$\end{document}shrink=2(1-t/T)
SMARandom parameter\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$Z = 0.03$$\end{document}Z=0.03
MPABinary vector\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$U = 0$$\end{document}U=0 or \documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$U = 1$$\end{document}U=1
Random vectorR is a vector of uniform random numbers in [0, 1]
Constant number\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$P = 0.5$$\end{document}P=0.5
Fish Aggregating Devices\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$FADs=2$$\end{document}FADs=2
TSA\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$c_1, c_2, c_3$$\end{document}c1,c2,c3Random numbers, which lie in the interval [0, 1]
Pmin1
Pmax4
WOAl is a random number in \documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$[-1,1]$$\end{document}[-1,1]
r is a random vector in [0, 1]
Convergence parameter aa: Linear reduction from 2 to 0
GWOConvergence parameter aa: Linear reduction from 2 to 0
TLBOrandom numberrand is a random number from the interval [0, 1]
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$T_F$$\end{document}TF: teaching factor\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$T_F = \text {round}\, (1+rand)$$\end{document}TF=round(1+rand)
GSAAlpha20
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$G_0$$\end{document}G0100
Rnorm2
Rnorm1
PSOVelocity limit10% of dimension range
TopologyFully connected
Inertia weightLinear reduction from 0.9 to 0.1
Cognitive and social constant\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$(C_1,C_2 )=(2,2)$$\end{document}(C1,C2)=(2,2)
GATypeReal coded
MutationGaussian (\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$Probability = 0.05$$\end{document}Probability=0.05)
CrossoverWhole arithmetic (\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$Probability = 0.8$$\end{document}Probability=0.8)
SelectionRoulette wheel (proportionate)
Adjusted values of the control parameters of ten competitor algorithms.

Evaluation of unimodal benchmark functions

The results of optimization of to benchmark functions using HLBO and competitor algorithms are released in Table 2. Experimental results show that HLBO provides the global optimal for and . HLBO is the best optimizer against competitor algorithms in optimizing , , and . HLBO ranks as the second in optimization and the third in optimization. What can be deduced from the analysis of the reported results is that HLBO is highly efficient in addressing unimodal optimization problems compared to ten competitor algorithms.
Table 2

Evaluation results of unimodal functions.

GAPSOGSATLBOGWOWOATSAMPASMAHGSHLBO
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_1$$\end{document}F1Mean13.3911.8E−52.0E−171.3E−591.1E−581.8E−648.2E−331.7E−18000
Best6.9052E−108.2E−189.4E−617.7E−611.3E−651.1E−623.4E−28000
Std5.5535.9E−57.1E−182.0E−594.1E−582.8E−642.5E−326.8E−18000
Med11.0459.92E−71.8E−174.7E−601.1E−596.3E−653.9E−381.3E−19000
Rank98734256111
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_2$$\end{document}F2Mean2.4800.34122.4E−85.6E−351.3E−341.6E−515.0E−392.8E−91.3E−1938.2E−1699.3E−222
Best1.5910.00171.59E−81.3E−351.55E−351.1E−578.3E−434.2E−18002.3E−223
Std0.64280.66964.0E−94.7E−352.2E−345.9E−511.7E−381.1E−8000
Med2.4640.13012.3E−84.4E−356.4E−351.9E−548.3E−413.2E−116.3E−19302.1E−222
Rank1110967458231
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_3$$\end{document}F3Mean1537.012589.508279.3587.0E−157.4E−157.6E−93.2E−190.377001.0E−1433.1E−167
Best1014.6891.61581.9121.2E−164.7E−203.4E−97.3E−300.0320006.4E−197
Std367.24291524.01112.2991.3E−141.9E−142.4E−99.9E−190.201804.5E−1430
Med1510.71554.154291.5321.9E−151.6E−167.2E−99.8E−210.3787001.9E−181
Rank1110956748132
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_4$$\end{document}F4Mean2.0943.9643.3E−91.6E−151.3E−140.00132.0E−223.7E−83.1E−2002.4E−1354.8E−206
Best1.3901.6052.1E−96.4E−163.4E−165.9E−51.9E−523.4E−17009.4E−208
Std0.3372.2047.5E−107.1E−162.3E−140.00066.0E−226.5E−801.1E−1340
Med2.0983.2623.4E−91.5E−157.3E−150.00143.1E−273.0E−82.2E−25408.7E−207
Rank1011756948231
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_5$$\end{document}F5Mean310.45250.26636.109145.67526.86327.17728.77042.5005.64918.01826.282
Best160.5013.647125.838120.79325.23026.45128.53841.5870.00027.2E−524.770
Std120.46736.52532.46219.7370.8820.6265740.3650.616911.10410.6730.956
Med279.51728.70326.075142.94426.71826.93528.54942.4910.234023.91826.538
Rank1197104568123
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_6$$\end{document}F6Mean14.55120.251800.450.6420.0713.8E−200.39090.00103.2E−70
Best6.0045001.6E−50.01466.7E−260.27460.00056.3E−100
Std5.83512.776000.51040.30120.07821.5E−190.08030.00034.1E−70
Med13.519000.62150.02936.7E−210.40660.00102.3E−70
Rank910178526431
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_7$$\end{document}F7Mean0.00570.11340.0206940.00310.0008190.00190.00030.00220.00010.00020.0001
Best0.00210.02960.010060.00140.0002484.2E−50.00010.00143.0E−52.2E−62.4E−5
Std0.00240.04590.0113630.00140.0005030.00330.00010.00059.2E−50.00037.4E−5
Med0.00540.10790.0169950.00290.0006290.00100.00040.00220.00618.7E−50.0001
Rank9111085647231
Sum rank7069504440383051131810
Mean rank109.85717.14296.28575.42865.42864.28577.28571.85712.57141.4286
Total rank118876549231

Evaluation of high-dimensional multimodal benchmark functions

The employment results of HLBO and ten competitor algorithms in optimizing to benchmark high-dimensional multimodal functions are reported in Table 3. HLBO has managed to find the global optimum by optimizing the functions and . HLBO is the first best optimizer for handling the function . In the case of the functions and the algorithm HGS is the first best optimizer, respectively, while HLBO is the fourth best optimizer for these functions. Analysis of simulation results shows HLBO capability in solving high-dimensional multimodal optimization problems.
Table 3

Evaluation results of high-dimensional multimodal functions.

GAPSOGSATLBOGWOWOATSAMPASMAHGSHLBO
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_8$$\end{document}F8Mean− 8184.3− 6908.6− 2849.0− 7803.5− 5885.1− 7687.5− 5669.6− 3652.1− 12569.3− 12569.1− 8246.4
Best− 9717.68− 8501.4− 3969.23− 9103.77− 7227.05− 8597.11− 5706.3− 4419.9− 12569.5− 12569.5− 8763.3
Std795.15836.7540.36986.61984.501105.1621.86474.580.39730.6999300.44
Med− 8117.25− 7098.95− 2671.33− 7735.22− 5774.63− 8290.68− 5669.63− 3632.65− 12569.4− 12569.4− 8306.6
Rank4711586910123
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_9$$\end{document}F9Mean62.41657.06516.26910.6788.5E−1500.0059152.703000
Best36.86627.8594.9759.874000.0048128.2306000
Std15.21616.5174.6600.3972.0E−1400.000715.1857000
Med61.67955.22515.42210.888000.0059154.621000
Rank76542138111
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{10}$$\end{document}F10Mean3.22202.1548113.6E−90.26321.7E−143.9E−156.4E−118.3E−105.1E−152.9E−151.9E−15
Best2.75721.1551512.6E−90.15641.5E−148.9E−168.1E−151.7E−188.9E−168.9E−168.9E−16
Std0.36170.5493895.3E−100.07283.2E−152.6E−152.6E−102.8E−91.5E−141.7E−141.7E−15
Med3.12032.1700833.6E−90.26151.5E−144.4E−151.1E−131.1E−118.9E−168.9E−168.9E−16
Rank1110895367421
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{11}$$\end{document}F11Mean1.23030.04633.73780.58770.00380.00301.6E−60000
Best1.14137.3E−91.51930.3101004.2E−150000
Std0.06280.05181.67030.16910.00730.01353.4E−60000
Med1.22720.02953.42430.5820008.8E−70000
Rank75864321111
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{12}$$\end{document}F12Mean0.04700.48070.03630.02060.03720.00780.05020.08260.00116.7E−90.0114
Best0.01840.00015.6E−200.00200.01930.00110.03540.07791.9E−54.7E−100.0036
Std0.02850.60270.06090.02860.01390.00900.00990.00240.00226.4E−90.0049
Med0.04180.15561.5E−190.01520.03300.00390.05090.08210.01100.00040.0110
Rank8116573910214
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{13}$$\end{document}F13Mean1.20860.50840.00210.32910.57640.19332.65890.56530.00071.3E−70.1840
Best0.498110.0E−71.2E−180.03830.29780.02972.63180.28039.8E−60.13620.1362
Std0.33371.25170.00550.19890.17040.15090.00990.18780.00042.0E−70.0256
Med1.21810.04402.1E−180.28300.57830.15202.66180.57990.00076.4E−80.1792
Rank1073695118214
Sum rank474641353521404411814
Mean rank7.83337.66676.83335.83335.83333.56.66677.33331.83331.33332.3333
Total rank109755468213

Evaluation of fixed-dimensional multimodal benchmark functions

The results of implementing HLBO and competitor algorithms on benchmark to benchmark functions are presented in Table 4. What is evident from the simulation results is that HLBO is the first best optimizer in solving to benchmark functions compared to competitor algorithms. The presented experimental results show that HLBO has a superior performance over similar algorithms in dealing with multimodal optimization problems.
Table 4

Evaluation results of fixed-dimensional multimodal functions.

GAPSOGSATLBOGWOWOATSAMPASMAHGSHLBO
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{14}$$\end{document}F14Mean0.99872.17373.59172.26443.74103.10621.79880.99880.99811.97460.998
Best0.99800.99800.99950.99840.99800.99800.99790.99810.99800.99800.998
Std0.00252.93652.77921.14963.96973.53360.52750.00030.00023.00560
Med0.99800.99802.98672.27522.982170.99841.91260.99890.99800.99800.998
Rank3710811954261
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{15}$$\end{document}F15Mean0.00540.00170.00240.00320.00640.00070.00040.00390.00050.00060.0003
Best0.00080.00030.00080.00220.00030.00030.00030.00030.00030.00030.0003
Std0.00810.00490.00120.00040.00940.00037.6E−50.00510.00030.00034.3E−13
Med0.00210.00030.00230.00320.00030.00050.00040.00270.00040.00070.0003
Rank1067811529341
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{16}$$\end{document}F16Mean− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316
Best− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316
Std4.4E−53.2E−53.2E−53.18E−53.2E−53.2E−53.5E−54.1E−53.2E−53.2E−52.2E−16
Med− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316− 1.0316
Rank32222244221
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{17}$$\end{document}F17Mean0.43700.78550.39790.39790.39790.39790.40010.40140.39790.39790.3979
Best0.39790.39790.39790.39790.39790.39790.39810.39890.39790.39790.3979
Std0.14070.72178.6E−58.6E−58.6E−58.6E−50.00450.00448.6E−58.6E−50
Med0.39790.39800.39790.39790.39790.39790.39910.39890.39790.39790.3979
Rank67223345221
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{18}$$\end{document}F18Mean4.36053.00023.00023.00023.00023.00023.00183.00023.00023.3.
Best3.3.3.3.3.3.3.3.3.3.3
Std6.03990.00060.00060.00060.00060.00060.00090.00060.00060.00061.1E−15
Med3.00113.3.3.3.00003.00003.00183.3.3.3.
Rank62224352221
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{19}$$\end{document}F19Mean− 3.8543− 3.8627− 3.8627− 3.8613− 3.8621− 3.8606− 3.8066− 3.8627− 3.8627− 3.8627− 3.8628
Best− 3.8628− 3.8628− 3.8628− 3.8625− 3.8628− 3.8628− 3.8366− 3.8627− 3.8627− 3.8627− 3.8628
Std0.01480.00010.00010.00140.00170.00290.01530.00020.00020.00021.4E−7
Med− 3.8624− 3.8628− 3.8628− 3.862− 3.8628− 3.8617− 3.8066− 3.8627− 3.8627− 3.8627− 3.8628
Rank72254683221
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{20}$$\end{document}F20Mean− 2.8239− 3.2619− 3.3220− 3.2011− 3.2523− 3.2229− 3.3195− 3.3211− 3.2386− 3.2804− 3.3220
Best− 3.3134− 3.322− 3.322− 3.2617− 3.3220− 3.3220− 3.3212− 3.3213− 3.3220− 3.3220− 3.3220
Std0.38600.07060.00010.03180.07660.09040.00310.00010.05600.05822.8E−5
Med− 2.9683− 3.3217− 3.322− 3.2076− 3.2623− 3.1952− 3.3206− 3.3211− 3.2031− 3.3220− 3.3220
Rank1162107943851
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{21}$$\end{document}F21Mean− 4.6040− 5.5392− 5.4486− 9.1901− 9.4451− 8.8763− 5.5020− 9.9043− 10.153− 9.8982− 10.1532
Best− 8.5213− 10.1532− 10.1532− 9.6639− 10.1532− 10.1531− 9.5021− 10.1532− 10.1532− 10.1532− 10.1532
Std1.92473.07633.09400.12071.73952.26351.25660.55920.00041.14004.2E−10
Med− 4.3747− 5.1008− 3.7693− 9.1532− 10.1525− 10.1512− 5.5021− 10.1532− 10.1531− 10.1532− 10.1532
Rank1181065793241
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{22}$$\end{document}F22Mean− 5.1174− 7.6322− 9.7664− 10.0486− 10.4024− 9.3372− 5.9134− 10.2858− 10.4027− 10.4028− 10.4029
Best− 9.1106− 10.4029− 10.4029− 10.4029− 10.4028− 10.4028− 9.0625− 10.4029− 10.4029− 10.4029− 10.4029
Std1.96963.54171.70840.39830.00042.18001.75490.24540.00030.00031.9E−5
Med− 5.0294− 10.4024− 10.4029− 10.1836− 10.4025− 10.4012− 5.0628− 10.4028− 10.4028− 10.4029− 10.4029
Rank1197648105321
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$F_{23}$$\end{document}F23Mean− 6.5621− 6.1647− 10.0188− 9.2642− 10.1302− 9.4522− 9.8098− 10.1408− 10.5362− 10.2659− 10.5364
Best− 10.2216− 10.5364− 10.5364− 10.534− 10.5363− 10.5363− 10.3683− 10.5364− 10.5364− 10.5364− 10.5364
Std2.61723.73491.59381.67651.81442.22191.60641.14010.00031.20945.4E−6
Med− 6.5629− 4.5055− 10.5364− 9.6717− 10.536− 10.535− 10.3613− 10.5364− 10.5363− 10.5364− 10.5364
Rank1011695874231
Sum rank7860505856605842283210
Mean rank7.8655.85.665.84.22.83.21
Total rank98576874231
The behavior of convergence curves of HLBO and competitor algorithms in achieving solutions for objective functions to is presented in Fig. 2.
Figure 2

Convergence curves of the HLBO and competitor algorithms in optimizing objective functions to .

Statistical analysis

In this subsection, by using statistical analysis on the obtained optimization results, the superiority of HLBO over competitor algorithms is examined from a statistical point of view to determine whether this superiority is significant or not. Wilcoxon sum rank test[55] is employed to address this goal. In this test, an index called p-value indicates and determines the superiority of the target algorithm over the competitor alternative algorithm. The Wilcoxon simulation results are released in Table 5. What can be deduced from the simulation findings is that HLBO has a significant statistical superiority over the competitor algorithm in cases with p-values less than 0.05.
Table 5

p-values results from Wilcoxon sum rank test.

Compared algorithmsTest function type
UnimodalHigh-multimodalFixed-multimodal
HLBO versus SMA0.0003300831.63352E−120.432505732
HLBO versus HGS3.09811E−61.63352E−129.93431E−30
HLBO versus MPA2.73907E−220.0003311724.9571E−34
HLBO versus TSA3.24444E−76.60407E−103.6466E−34
HLBO versus WOA2.03592E−70.0161395561.43615E−34
HLBO versus GWO0.0001210560.8449197731.43615E−34
HLBO versus TLBO4.24099E−220.0010918241.49171E−25
HLBO versus GSA3.3289E−86.10009E−80.001848864
WOA versus PSO9.54457E−198.18756E−105.68697E−9
WOA versus GA1.03289E−242.05428E−61.43615E−34

Evaluation of the effectiveness of HLBO in handling complex IEEE CEC 2017 objective functions

In the previous section, the performance of HLBO in handling the objective and multimodal target functions was examined, indicating the satisfactory results of the proposed approach. In this section, the effectiveness of HLBO in solving complex IEEE CEC 2017 benchmark functions[56] is evaluated. The implementation results of HLBO as well as ten competitor algorithms on objective functions to are presented in Tables 6 and 7. What emerges from the simulation results is that HLBO ranks first in optimizing , , , , to , , , , , and functions by providing the best performance compared to competitor algorithms. The general analysis of the simulation results of to functions shows that HLBO has an acceptable efficiency in handling IEEE CEC 2017 objective functions.
Table 6

Evaluation results of IEEE CEC 2017 objective functions to .

GAPSOGSATLBOGWOWOATSAMPASMAHGSHLBO
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{1}$$\end{document}C1Avg980039602962.0E+73.3E+58.5E+629634001562470100
Std621046602874.2E+61.1E+52.4E+728738403.8E+4277501
Rank763108935241
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{2}$$\end{document}C2Avg5610706079101.2E+4314461216219201201200
Std43602290226066607510738079770177.99910.9
Rank789105634221
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{3}$$\end{document}C3Avg87203001.1E+42.8E+415402.3E+41.1E+43003011510300
Std61702.0E−1016909240198039001670050.126.51.0E−10
Rank51684761231
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{4}$$\end{document}C4Avg4114064075484102390407406403404400
Std19.33.433.0515.97.894313.0510.6998.540.0596
Rank74586954231
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{5}$$\end{document}C5Avg516513557742514900557522530513510
Std7.246.838.7836.96.3783.18.791160.925.44.13
Rank42783975621
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{6}$$\end{document}C6Avg600600622665601691622610682600600
Std0.6981.029.4343.90.09211.49.438.63371.466.4E−4
Rank11452743611
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{7}$$\end{document}C7Avg72871971512807301860715741713713723
Std7.625.331.6244.18.9979.81.6317.31.74.494.09
Rank53286927114
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{8}$$\end{document}C8Avg8218118219528141070821823829809809
Std9.365.724.919.98.6346.54.910.455.48.373.26
Rank53584956712
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{9}$$\end{document}C9Avg91090090068009112.9E+49009444670910900
Std15.96.2E−146.2E−15136020.491300110215020.90.017
Rank21163714521
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{10}$$\end{document}C10Avg17201470269052901530747026901860259014101440
Std263225311674300142031130843336.6147
Rank538941086712
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{11}$$\end{document}C11Avg11301110113012701140192011301180111011101100
Std24.96.561141.656.519801162.526.511.71.33
Rank32364735221
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{12}$$\end{document}C12Avg3.7E+41.4E+47.0E+52.2E+76.3E+51.8E+87.1E+52.0E+616301.5E+41250
Std3.6E+41.2E+44.4E+42.2E+71.2E+61.8E+94.4E+52.0E+6207280056.7
Rank5371061189241
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{13}$$\end{document}C13Avg1.1E+486001.1E+44.2E+598401.9E+81.1E+41.61E+4132068201310
Std9330535022001.3E+558801.4E+822001.1E+481.844502.7
Rank647951078231
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{14}$$\end{document}C14Avg7050148071504.1E+534002.0E+671501510145014501400
Std853044.415602.4E+520407.3E+6156053.458.523.44.24
Rank63785974221
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{15}$$\end{document}C15Avg930017101.8E+44.8E+438101.4E+71.8E+42240151015801500
Std938029657501.6E+440302.1E+7575059717.11340.543
Rank748961085231
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{16}$$\end{document}C16Avg17901860215035001730300021501730182017301600
Std13513411123913012501111332401251.03
Rank35682762421
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{17}$$\end{document}C17Avg17501760186026301760434018601770183017301710
Std41.649.611319932.733111335.718436.19.86
Rank34784975621
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{18}$$\end{document}C18Avg1.6E+41.5E+487207.5E+52.6E+43.7E+787202.3E+4183074401800
Std1.3E+41.2E+452903.9E+51.6E+45.2E+752901.5E+414.147200.543
Rank654981047231
Table 7

Evaluation results of IEEE CEC 2017 objective functions to .

GAPSOGSATLBOGWOWOATSAMPASMAHGSHLBO
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{19}$$\end{document}C19Avg969026001.4E+46.1E+598702.3E+64.5E+42920192019501900
Std707022902.0E+45.7E+566601.7E+72.0E+419503057.80.47
Rank6481071195231
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{20}$$\end{document}C20Avg20602090227028702080379022702090249020202020
Std62.765.185.421354.346285.451.525426.410.1
Rank35684965721
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{21}$$\end{document}C21Avg23002280236025802320258023602250232022302200
Std45.856.429.564.57.3219229.563.170.945.521.3
Rank54786873621
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{22}$$\end{document}C22Avg230023102300718023101.4E+423002310353022802280
Std2.4969.10.0752134017.610800.073212.388613.939.8
Rank34364733512
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{23}$$\end{document}C23Avg26302620274031202620381027402620273026102610
Std149.6540.986.88.852294.99.082544.314.11
Rank43673863512
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{24}$$\end{document}C24Avg27602690274033302740348027402730270026202520
Std15.61315.771699.122295.867.376.783.240.1
Rank73686965421
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{25}$$\end{document}C25Avg29502920294029102940391029402920293029202900
Std20.226.116.118.424.7266162521.813.20.512
Rank63525753431
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{26}$$\end{document}C26Avg311029503.4E+478703230710034402900346031102850
Std350261657951297065738.262630223.4101
Rank431095862741
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{27}$$\end{document}C27Avg31203120326034103100481032603090314031103090
Std20.126.143.685.822.864243.62.9122.421.80.481
Rank55783972641
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{28}$$\end{document}C28Avg33203320346034003390509034603210340023003100
Std13212735.312410732935.31181371306.6E−5
Rank44765873612
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{29}$$\end{document}C29Avg32503200345045603190889034503210321032103150
Std85.784.717951644.814801795411559.1134
Rank53672864441
\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$C_{30}$$\end{document}C30Avg5.4E+53.5E+51.3E+64.0E+63.0E+51.9E+79.4E+54.2E+53.1E+53.0E+53410
Std6.7E+55.3E+53.8E+51.7E+65.5E+51.4E+83.8E+55.9E+54.7E+52.2E+428
Rank7591031186421
Sum rank1461081792361382581701341227038
Mean rank0.18560.14220.22890.31330.18440.33890.24110.17670.15780.09780.0489
Total rank7391061185421
Evaluation results of unimodal functions. Evaluation results of high-dimensional multimodal functions. Evaluation results of fixed-dimensional multimodal functions. Convergence curves of the HLBO and competitor algorithms in optimizing objective functions to .

Results and discussion

Optimization algorithms by utilizing exploration for global search and exploitation for local search, have the ability to handle optimization problems. To analyze the exploitation ability of HLBO in local search, the unimodal objective functions are favorable with only one main peak. In this type of optimization issues, the main challenge is the convergence towards the global optima. The optimization results of unimodal functions using HLBO indicate the exploitation ability of the proposed method in converging to the global optimal solution. In particular, HLBO has demonstrated its high local search ability by converging to the global optimal in handling the functions and . High-dimensional multimodal functions due to the existence of multiple local optimal solutions are a suitable option for measuring the exploratory ability of optimization algorithms for global search and finding the main optimal area. The main challenge in solving these problems is to accurately scan the search space and prevent the algorithm from getting stuck in some of the optimal local areas. The results of implementing HLBO on high-dimensional multimodal functions show that the proposed approach has an acceptable exploration ability in scanning the search space and finding the optimal area. The exploratory power of HLBO in identifying the optimal region, especially in the and functions, is evident that it has been able to provide the global optimal. In addition to having the right quality of exploration and exploitation, having the right balance between these two indicators is the key to the success of optimization algorithms. Fixed dimensional multimodal functions have been selected to evaluate the ability of HLBO to strike a balance between exploration and exploitation. In this type of problem, it is important to simultaneously find the main optimal area based on global search and converge as much as possible to the global optimal based on local search. The optimization results of this type of function using the proposed approach show the high capability of HLBO in balancing exploration and exploitation to discover the optimal area and converge towards the global optimal. p-values results from Wilcoxon sum rank test. Evaluation results of IEEE CEC 2017 objective functions to . Evaluation results of IEEE CEC 2017 objective functions to .

Conclusion and future works

In this paper, a new optimization algorithm called Hybrid Leader Optimization (HLBO) is introduced. The use of a hybrid leader generated by three different members was HLBO’s idea in updating the algorithm population in the search space. The HLBO implementation process was mathematically modeled in two phases of exploration and exploitation. Twenty-three objective functions were employed to evaluate the performance of HLBO in achieving optimal solutions for optimization problems. The results of the unimodal functions indicated the high exploitation ability of HLBO to search locally and converge towards global optima. The results of optimizing multimodal functions showed the high exploration ability of HLBO to search globally and discover the optimal area without getting caught up in local optimal. For further analysis of HLBO, its efficiency in handling complex IEEE CEC 2017 objective functions was studied. The results showed that HLBO is capable of solving such optimization problems. The results of HLBO compared with the performance of ten well-known algorithms showed that HLBO has a superior performance by providing appropriate solutions in most cases due to the appropriate balance between exploration and exploitation. The proposed HLBO opens up several research subjects for further work in the future. Specific research potentials are the development of binary and multimodal versions of HLBO. The HLBO employment on optimization topics in various sciences as well as real-world applications are other suggestions for future studies. Similar to any stochastic-based optimization algorithm, there are concerns and limitations for the use of the proposed HLBO approach. Of course, we do not claim that HLBO is generally the best optimizer because according to the NFL theorem, there is no presupposition for the effective performance of an algorithm in dealing with optimization issues. It is also possible that there may be other algorithms or that new algorithms may be developed by researchers in the future that work better in some concrete applications.
  2 in total

1.  Optimization by simulated annealing.

Authors:  S Kirkpatrick; C D Gelatt; M P Vecchi
Journal:  Science       Date:  1983-05-13       Impact factor: 47.728

2.  Pelican Optimization Algorithm: A Novel Nature-Inspired Algorithm for Engineering Applications.

Authors:  Pavel Trojovský; Mohammad Dehghani
Journal:  Sensors (Basel)       Date:  2022-01-23       Impact factor: 3.576

  2 in total

北京卡尤迪生物科技股份有限公司 © 2022-2023.