Literature DB >> 34956353

Improved Sparrow Search Algorithm Based on Iterative Local Search.

Shaoqiang Yan1, Ping Yang1, Donglin Zhu2, Wanli Zheng1, Fengxuan Wu1.   

Abstract

This paper addresses the shortcomings of the sparrow search algorithm, namely its poor utilization of the current individual and its lack of effective search; it improves the algorithm's search performance, achieves good results on 23 basic benchmark functions and the CEC 2017 test set, and effectively alleviates the algorithm's tendency to fall into local optima and its low search accuracy. This paper proposes an improved sparrow search algorithm based on iterative local search (ISSA). In the global search phase of the followers, a variable helix factor is introduced, which makes full use of the individual's opposite solution about the origin, reduces the number of individuals beyond the boundary, and gives the algorithm a detailed and flexible search ability. In the local search phase of the followers, an improved iterative local search strategy is adopted to increase the search accuracy and prevent the omission of the optimal solution. By adding a dimension by dimension lens learning strategy to the scouters, the search range becomes more flexible, and changing the focusing ability of the lens and the dynamic boundary of each dimension helps the algorithm jump out of local optima. Finally, boundary control is improved to effectively utilize individuals beyond the boundary while retaining their randomness. The ISSA is compared with PSO, SCA, GWO, WOA, MWOA, SSA, BSSA, CSSA, and LSSA on the 23 basic functions to verify its optimization performance. In addition, to further verify the optimization performance when the optimal solution is not 0, the above algorithms are compared on the CEC 2017 test functions. The simulation results show that the ISSA has good universality. Finally, this paper applies the ISSA to PID parameter tuning and robot path planning, and the results show that the algorithm is practical and effective.
Copyright © 2021 Shaoqiang Yan et al.


Year:  2021        PMID: 34956353      PMCID: PMC8695025          DOI: 10.1155/2021/6860503

Source DB:  PubMed          Journal:  Comput Intell Neurosci


1. Introduction

With the continuous emergence of various optimization problems, new algorithms and improved algorithms keep appearing [1-4]. The emergence of swarm intelligence algorithms provides new ideas for solving various optimization problems. A swarm intelligence optimization algorithm is a meta-heuristic optimization algorithm that imitates the behavior of biological populations, individual biological behaviors, or natural phenomena. As the optimization effect of swarm intelligence algorithms has been recognized, they have developed continuously, and more and more new swarm intelligence algorithms have been proposed, such as the firefly algorithm (FA) [5], ant lion optimizer (ALO) [6], whale optimization algorithm (WOA) [7], sine cosine algorithm (SCA) [8], crow search algorithm (CSA) [9], Harris hawks optimization algorithm (HHO) [10], slime mould algorithm (SMA) [11], hunger games search (HGS) [12], Runge–Kutta method (RUN) [13], and colony predation algorithm (CPA) [14]. The sparrow search algorithm (SSA) is a new swarm intelligence optimization algorithm proposed by Xue and Shen [15] in 2020. Inspired by sparrow foraging behavior, the algorithm has obvious advantages over traditional intelligent optimization algorithms such as the grey wolf optimizer (GWO) [16], particle swarm optimization (PSO) [17], and the genetic algorithm [18], with high stability, good search accuracy, and fast convergence [19]. Despite its fast convergence rate, the algorithm is prone to falling into local optima, and its optimization results are somewhat random. To overcome these shortcomings, many scholars have proposed improved sparrow search algorithms based on different strategies and successfully solved many engineering problems. Based on the principle and model of the sparrow search algorithm, Lv et al. proposed a fusion of the bird swarm algorithm and the sparrow search algorithm [20] and a chaotic sparrow search algorithm [21].
The former adds the search mechanism of the bird swarm algorithm to the discoverers and followers of the sparrow search algorithm, which changes the "full dimension reduction" update strategy, effectively breaks through the local restriction of the search, and strengthens the global search ability. The latter uses a tent chaotic map to initialize the population, which makes the population more uniform; after one iteration, a second iteration of chaotic disturbance and Gaussian mutation is carried out according to individual fitness and average fitness, which prevents local aggregation in the optimization process, enhances the ability to jump out of local optima, and achieves good results in an image segmentation application. A chaotic sparrow algorithm that uses cubic mapping and elite reverse learning to initialize the population is presented by Tang et al. [22]. A sinusoidal algorithm is also introduced, which balances the exploitation and exploration abilities of the algorithm; when the algorithm stagnates, a Gaussian walk strategy is used to escape the stagnation. Its optimization performance is verified on 15 benchmark functions, and a UAV track planning simulation is carried out under threat conditions: compared with other optimization algorithms, the algorithm obtains a safe and feasible track with the best cost while meeting the constraints. Ouyang et al. proposed a learning sparrow algorithm [23]: adding lens reverse learning during the discoverer search phase makes the search more flexible and increases the diversity of the population, and a spiral guidance mechanism is introduced to make the discoverer search more precise. A local search mechanism is then added to prevent the omission of high-quality solutions. Compared with other swarm intelligence algorithms on 12 basic test functions and the CEC 2017 test sets, the algorithm shows good optimization ability.
Finally, the improved sparrow search algorithm is validated on robot path planning, and a stable and safe optimal path is planned. The above algorithms make some improvements on the basis of the sparrow algorithm, but there are still some shortcomings. The improved population initialization methods retain some randomness and cannot guarantee absolute uniformity of the population at every initialization. The selected improved search strategies are subject to regional limitations, easily exceed the boundaries, and fail to perform an effective global search over the whole space, so a large number of individuals exceed the boundaries and remain trapped in local optima. By jumping directly to the discoverer, it is easy to miss the optimal solution. The improvement in local search accuracy is not significant, and there is still room for improvement. In terms of boundary control, individuals beyond the boundary are simply moved onto the boundary, which does not make good use of their locations and reduces the diversity of the population. To solve the above problems, a sparrow search algorithm based on iterative local search is proposed in this paper. Through the variable helix factor and the improved iterative local search, the effective utilization of and search around individuals are improved. By adding a dimension by dimension lens learning strategy and changing the focusing ability of the lens, the algorithm converges faster while being helped to jump out of local optima; by improving the boundary strategy, population diversity is increased. To verify the optimization performance of the algorithm, the ISSA is compared with PSO, SCA, GWO [16], WOA, MWOA [24], SSA, BSSA, CSSA, and LSSA on 23 basic functions. To further verify the universality of the algorithm, the above algorithms are tested and analyzed on the CEC 2017 test functions [25, 26]. Finally, the ISSA is applied to PID parameter tuning [27].
The accuracy and convergence speed of the tuned results are improved compared with the SSA, which shows that the algorithm has good practicability. The main contributions of this paper are as follows:
(1) To improve the effective use of and search around individuals, a variable helix factor strategy is proposed and the boundary control is improved.
(2) An improved iterative local search strategy is presented to address the low accuracy and the missing of better solutions during the search process.
(3) To improve the algorithm's ability to jump out of local optima, a dimension by dimension lens learning method that changes the lens focusing ability is proposed.
(4) The versatility and flexibility of the algorithm are validated on the benchmark functions and the CEC 2017 functions.
(5) The ISSA is used to optimize PID parameters, helping to complete PID parameter tuning quickly.
(6) The ISSA is applied to the robot path planning problem, helping to obtain fast and stable results.
The rest of this paper is arranged as follows: Section 2 introduces the basic sparrow algorithm. Section 3 introduces and analyzes the ISSA. Section 4 compares and analyzes the algorithm on the basic test functions. Section 5 compares and analyzes the algorithm on CEC 2017. Section 6 applies the algorithm to PID parameter tuning. Section 7 applies the algorithm to robot path planning. Section 8 discusses and provides future research directions.

2. Sparrow Search Algorithm

In the process of sparrow foraging, there are two behavioral roles: discoverer and follower. The individuals with better positions, generally 10%–20% of the total population, act as discoverers, while the remaining individuals act as followers. At the same time, 10%–20% of individuals are randomly assigned as scouters. The discoverer is responsible for leading the population's search direction and finding food, the follower follows the discoverer to obtain food, and the scouter stays alert to environmental threats and warns the sparrow population to move closer to a safe area. In order to describe the process of sparrow foraging through a mathematical model, it is necessary to formulate rules that simplify the various behaviors of sparrows. The specific rules are as follows: (1) The individual energy of the sparrow population depends on individual fitness evaluation, and the individual energy of a discoverer is higher than that of a follower. (2) Once the scouters in the sparrow population find a threat in the external environment, they begin to send out an alarm signal; when the alert value is greater than the security threshold, the discoverers direct the population to the security zone. (3) Sparrows have flexible individual behavior strategies and can switch between discoverer and follower: as long as its energy reaches a certain level, an individual can become a discoverer, but the proportion between discoverers and followers in the population remains unchanged. (4) Sparrows with low energy may fly elsewhere to feed in order to obtain higher energy. (5) When there is an external environmental threat, the sparrows at the edge of the population quickly move to the safe area, and the sparrows in the middle of the population immediately move to get close to other sparrows. (6) The discoverer is responsible for guiding the population to forage or to the location of the safe zone.
The discoverer's location update is described below:

X_{i,j}(t+1) = X_{i,j}(t) · exp(−i / (α·M)),   if R2 < ST
X_{i,j}(t+1) = X_{i,j}(t) + Q·L,               if R2 ≥ ST

Among them, t represents the current number of iterations, and M is the maximum number of iterations. X_{i,j} represents the current position of the i-th sparrow in the j-th dimension. α ∈ (0, 1] is a random number. R2 represents the early warning value and ST the security threshold, with R2 ∈ [0, 1] and ST ∈ [0.5, 1]. Q is a random number that follows a normal distribution, and L is a 1 × d matrix with all elements equal to 1. When R2 < ST, the population environment is safe, no predators have been found, and the discoverers can conduct extensive searches to guide the population to higher energy levels. When R2 ≥ ST, an individual within the population has discovered a predator and issued an alert, and the discoverer quickly adjusts the search strategy to flee the current location, leading the population to a safe location. In order to obtain high-quality food, followers follow the discoverer or forage alone, so the location of the followers is updated as follows:

X_{i,j}(t+1) = Q · exp((X_worst(t) − X_{i,j}(t)) / i²),        if i > n/2
X_{i,j}(t+1) = X_P(t+1) + |X_{i,j}(t) − X_P(t+1)| · A⁺ · L,    otherwise

Among them, X_P is the best position currently occupied by the discoverer, and X_worst represents the current worst position. A is a 1 × d matrix whose elements are randomly assigned 1 or −1, where A⁺ = Aᵀ(AAᵀ)⁻¹. When i > n/2, the less adaptable i-th follower is not getting food, is very hungry, and needs to fly elsewhere to get more energy; when i ≤ n/2, the follower monitors the discoverer and competes for food around the best discoverer position, thereby increasing its energy. When aware of danger, the sparrow population exhibits antipredation behavior, and its mathematical expression is as follows:

X_{i,j}(t+1) = X_best(t) + β · |X_{i,j}(t) − X_best(t)|,                         if f_i ≠ f_g
X_{i,j}(t+1) = X_{i,j}(t) + K · (|X_{i,j}(t) − X_worst(t)| / ((f_i − f_w) + ε)),  if f_i = f_g

where X_best represents the current global optimal position. β is the step-control parameter, a normally distributed random number with mean 0 and variance 1. K is a random number in [−1, 1], which controls both the direction and the step of the sparrow's movement. f_i represents the fitness value of the current sparrow individual, and f_g and f_w are the current best and worst fitness values, respectively. ε is a very small real number that prevents the denominator from being zero. When f_i ≠ f_g, the current sparrow is at the edge of the population and vulnerable to predators, and it needs to approach other individuals in the population center to reduce the risk of predation. When f_i = f_g, an individual in the center of the population is aware of the danger and needs to flee its current location to avoid it.
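The three update rules can be sketched in a short, self-contained illustration (a minimal sketch of the standard SSA equations as described above; the array shapes and random draws are illustrative choices):

```python
import numpy as np

def discoverer_update(x, i, M, alpha, R2, ST):
    """Discoverer update: broad search while safe (R2 < ST), flee otherwise."""
    if R2 < ST:
        return x * np.exp(-(i + 1) / (alpha * M))   # safe: exponential contraction
    Q = np.random.randn()                           # Gaussian step
    return x + Q * np.ones_like(x)                  # alarm raised: jump by Q * L

def follower_update(x, i, n, x_p, x_worst):
    """Follower update: hungry followers (i > n/2) fly elsewhere; the rest
    compete for food around the best discoverer position x_p."""
    d = x.size
    if (i + 1) > n / 2:
        Q = np.random.randn()
        return Q * np.exp((x_worst - x) / (i + 1) ** 2)
    A = np.random.choice([-1.0, 1.0], size=(1, d))
    A_plus = A.T @ np.linalg.inv(A @ A.T)           # A+ = A^T (A A^T)^-1
    step = (np.abs(x - x_p) @ A_plus).item()        # scalar |x - x_p| * A+
    return x_p + step * np.ones(d)                  # broadcast by L (all ones)

def scouter_update(x, x_best, x_worst, f_i, f_g, f_w, eps=1e-50):
    """Scouter (antipredation) update."""
    if f_i != f_g:                                  # edge of the population
        beta = np.random.randn()                    # N(0, 1) step control
        return x_best + beta * np.abs(x - x_best)
    K = np.random.uniform(-1, 1)                    # centre of the population
    return x + K * (np.abs(x - x_worst) / ((f_i - f_w) + eps))
```

Each function maps one sparrow's position vector to its next position, matching the case analysis in the equations above.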

3. Sparrow Search Algorithm Based on Iterative Local Search (ISSA)

3.1. Variable Helix Factor

Followers make up the majority of individuals in the population. When i < (N/2), they have a unique update mechanism that draws them quickly toward the discoverers, which results in a fast convergence rate. When i > (N/2), they have the ability to search globally. However, this global search ability is not strong: limited by the boundary of the search area, it tends to cause aggregation at the boundary in the early stage, resulting in a loss of population diversity, a tendency to fall into local extrema, and a poor ability to jump out of local optima. The follower's location update adopts a random coefficient that obeys the normal distribution. Without considering the boundary, this coefficient gives strong global search ability. However, when the boundary is considered, the mechanism is detrimental to individuals at or near the boundary, since many coefficients exceed 1 in absolute value. When the boundary is exceeded, population individuals aggregate at the boundary, the current location is not fully used, and population diversity and overall algorithm performance decrease significantly. Based on this, a variable helix factor is proposed: it reduces the number of individuals beyond the boundary, controls the search step and direction, and makes full use of the whole population space. The space for the early search is large, which maintains the diversity of the population and helps jump out of the trap of local optima; the later local search is more detailed, which greatly improves the search ability of the algorithm, as shown in Figure 1.
Figure 1

Coefficient model. (a) Original random coefficient. (b) Variable helix factor.

The formula for the variable helix factor works as follows: H is the variable helix factor; a is a parameter used to control the helix, held at 1 in the earlier period and decreasing with the number of iterations in the later period; k is a parameter representing the helix cycle, generally (M/10); and l is a parameter that decreases linearly from 1 to −1 with the number of iterations. In this paper, we select a high-dimensional test function, the step function, and a low-dimensional test function, the Shekel function, to test the original model and the improved model with respect to individuals beyond the boundary, as shown in Figure 2. We set the population size to 30 and the number of iterations to 500, run each case 10 times, and take the average value. The test results are shown in Table 1.
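Since the exact helix formula is not reproduced in this extract, the sketch below assumes a common spiral form, H = e^(a·l)·cos(2π·k·l), which is consistent with the textual description (a held at 1 early and decreased later, k ≈ M/10 helix cycles, l decreasing linearly from 1 to −1); the function name and the half-way switch for a are our own illustrative choices:

```python
import math

def helix_factor(t, t_max, k_cycles=None, split=0.5):
    """Hypothetical variable helix factor H, built only from the textual
    description: l runs linearly from 1 to -1 over the iterations, a stays
    at 1 early and decreases later, and k_cycles defaults to M/10."""
    if k_cycles is None:
        k_cycles = max(1, t_max // 10)
    l = 1.0 - 2.0 * t / t_max                    # decreases linearly 1 -> -1
    if t < split * t_max:
        a = 1.0                                  # wide early search
    else:                                        # shrink amplitude later
        a = 1.0 - (t - split * t_max) / ((1 - split) * t_max)
    return math.exp(a * l) * math.cos(2 * math.pi * k_cycles * l)
```

The oscillating sign gives the search both direction changes and a shrinking step, so followers sweep the space early and settle into fine search late.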
Figure 2

Parameter space. (a) Step function. (b) Shekel function.

Table 1

Statistics on the number of individuals beyond the boundary.

Function | Step | Shekel
Dimension | 30 | 4
Boundary | [−1.28, 1.28] | [0, 10]
Times the original algorithm's followers exceeded the boundary | 1505 | 3739
Times the followers exceeded the boundary after improvement | 0 | 0
Total times the original algorithm exceeded the boundary | 1696 | 4381
Total times the improved algorithm exceeded the boundary | 161 | 1095
According to Table 1, when the followers conduct an extensive search, their boundary violations account for the vast majority of the total boundary violations of the sparrow population, resulting in the loss of a large number of individuals. Therefore, it is necessary to restrict the scope of the followers' extensive search. With the variable helix factor, the number of times the followers exceed the boundary is reduced to 0, which fully retains the favorable position information of the current individual. This improvement to the followers' extensive search makes it possible for them to make full use of the whole search space, get rid of the attraction of the local optimal solution more easily, strengthen the search over the whole space, maintain the diversity of the population, enhance the algorithm's exploration ability in the early stage, and enhance its exploitation ability in the later stage. Based on this, the follower update formula is revised as follows:

3.2. Improved Iterative Local Search

When i < (N/2), the followers have a unique update mechanism that quickly moves them to the neighborhood of the discoverer's optimal solution, which results in fast convergence of the algorithm. The followers jump directly to the neighborhood of the discoverer's optimal solution. Although they have some exploitation ability near the current optimal solution, they do not make enough use of the current solution and have poor stability; they cannot guarantee the quality and accuracy of the solution and have poor local exploitation ability. Once trapped and unable to jump out of a local extreme state, the overall performance of the algorithm is limited. Inspired by [28-30], this paper presents an improved iterative local search. The local search algorithm [31] is a simple greedy search algorithm improved from the hill-climbing method: local search starts from an initial solution, searches the neighborhood of that solution, and updates the solution if a better one is found, or else keeps the current solution. Iterative local search is an exploratory method that adds a perturbation to the local optimal solution obtained by local search and then searches the perturbed solution again. The improved iterative local search first performs a local search near the initial solution, then disturbs the initial solution using the update that moves followers closer to the discoverer, and then searches the updated location again. It makes full use of the location information of the current individual and of the current optimal solution to make the search more flexible.
The first modification uses the mechanism by which the follower approaches the discoverer to disturb, so that the individual jumps near the current global optimal position to jump out of the local optimal position; the second modification uses the local search to obtain stable optimization results and improve the accuracy of the local search; and the third modification uses the local search result to obtain the optimal local search result to ensure the quality of the solution, as shown in Figure 3.
Figure 3

Main idea of improved iterative local search.

At the same time, two cases are illustrated in this paper. Figure 3(a) indicates that, after the individual is disturbed, a better solution can be found near the current global optimal solution; Figure 3(b) indicates that an individual can find a better solution using its own favorable position. Both cases help to jump out of the local optimal solution. In this paper, the individual is not directly replaced by the current global optimal solution, which maintains population diversity and prevents premature convergence.

3.2.1. The First Modification (SSA Method)

The improved iterative local search strategy is more suitable for this algorithm than the original iterative local search. It works by first disturbing the initial solution to get an intermediate solution (when i < n/2, the followers of the SSA are disturbed by the unique update mechanism that moves them quickly toward the discoverer's optimal solution). The initial and intermediate solutions are then searched again for a better solution. The first modification is used to solve the problem that the SSA easily falls into a local optimal solution. The algorithm flow is as follows:

Step 1 .

The initial solution X is perturbed by the current optimal solution Xbest to obtain the intermediate solution X′. The perturbation formula follows the original follower update (the i < n/2 case), in which Xbest is the best position the discoverer currently occupies, X is the current position, and X′ is the updated (perturbed) position.

3.2.2. The Second Modification (ILS Algorithm)

The second modification solves the problem of unstable and inaccurate optimization results of the SSA. The ILS algorithm here represents the local search stage in the ILS algorithm. It works by searching the initial and intermediate solutions locally to get a better solution and effectively utilizes the current position of the initial solution to prevent the individual from missing the better solution in the process of jumping directly to the current optimal solution. At the same time, local search near the current optimal solution helps to improve the accuracy of the solution and jump out of the local optimum in a small range. The algorithm flow is as follows:

Step 2 .

Initial solution X starts local search, and the formula is as follows:rand( ) is a random number between 0 and 1.

Step 3 .

Intermediate solution X starts local search, and the formula is as follows:

3.2.3. The Third Modification

The third modification is to optimize the local search results to ensure the quality of the solution. The working principle is to use greedy strategy to compare the local search results of the initial and intermediate solutions and select the better value as the final solution X. The algorithm flow is as follows:

Step 4 .

Calculate fitness f(X1) of X1.

Step 5 .

Calculate fitness f(X2) of X2.

Step 6 .

Compare f(X1) and f(X2) and select the better individual for the location update; that is,

if f(X1) ≤ f(X2)
    X = X1
else
    X = X2
end

Based on the SSA and the three modifications above, the formula of the follower (i < N/2) in the SSA is updated as follows:
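Steps 1–6 can be combined into one sketch. The exact perturbation and neighborhood formulas are not reproduced in this extract, so the jump toward Xbest (Step 1, modeled on the follower mechanism) and the small random neighborhood moves (Steps 2–3) are illustrative assumptions; Steps 4–6 are the greedy selection:

```python
import random

def improved_ils(x, x_best, fitness, radius=0.1):
    """One improved-ILS pass for a follower (hypothetical neighbourhood moves)."""
    # Step 1: perturb the initial solution X toward the current optimum Xbest
    # using the follower's jump mechanism, giving an intermediate solution X'.
    x_mid = [xb + abs(xi - xb) * random.choice([-1.0, 1.0])
             for xi, xb in zip(x, x_best)]
    # Step 2: local search in a small neighbourhood of the initial solution.
    x1 = [xi + radius * (2 * random.random() - 1) for xi in x]
    # Step 3: local search in a small neighbourhood of the intermediate solution.
    x2 = [xi + radius * (2 * random.random() - 1) for xi in x_mid]
    # Steps 4-6: greedy selection between the two local-search results.
    return x1 if fitness(x1) <= fitness(x2) else x2
```

Searching both the initial and the intermediate solution is what prevents the follower from discarding a favorable current position while it jumps toward the optimum.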

3.3. Dimension by Dimension Lens Imaging Learning

Swarm intelligence algorithms have the disadvantage of easily falling into local optima. In this regard, relevant scholars have proposed adding reverse learning to swarm intelligence algorithms [32-34]. The solution after reverse learning can be closer to the optimal solution. Generally, reverse learning can only search for the optimal solution in a certain space; it is monotonic and may still fall into a local optimum. Lens learning [35, 36] has better optimization ability than general reverse learning and can continuously converge to the optimal solution in a certain space. However, once there is no optimal solution in the selected space, it will still end in a local optimum. At the same time, the scouters' antipredator behavior helps the population jump out of local optima, but this ability is unstable, so the population sometimes cannot escape a local optimum. In view of this, this paper proposes a dimension by dimension lens imaging learning strategy that changes the focusing ability of the lens. It strengthens the scouters' ability to jump out of local optima, performs lens learning for each dimension, and reduces the mutual interference between dimensions. In the early stage, a lens with poor focusing ability is selected for reverse learning; the image is then divergent and far from the lens, which helps jump out of local optima, as shown in Figure 4(a). In the later stage, a lens with strong focusing ability is selected; the image is concentrated and close to the lens, which accelerates convergence, as shown in Figure 4(b). A comparison of the three kinds of reverse learning is shown in Table 2.
Figure 4

Main principles of lens imaging.

Table 2

Comparison of three kinds of reverse learning.

Strategy | Boundary | Focusing ability (k) | Reverse solution position | Effect
Reverse learning | Unchanged | 1 | Symmetric about the boundary midpoint | Accelerates convergence
Lens imaging learning | Changes dynamically with the maximum and minimum of the population's individual positions | A large constant greater than 1 | Reduced image about the boundary midpoint | Accelerates convergence
Dimension by dimension lens imaging learning | The maximum and minimum of each dimension change dynamically with the population's individual positions | Changes dynamically with the iteration count: less than 1 in the early and middle stages, greater than 1 in the later stage | Image enlarged about the boundary midpoint in the early and middle stages, reduced in the later stage | Jumps out of local optima in the early and middle stages and accelerates convergence in the later stage
The principle of lens imaging is as follows. As shown in Figure 4, taking one-dimensional space as an example, assume there is an individual of height h at position x; under the action of the lens, it forms an image of height h* at position x*. Here [a, b] is the boundary, and the lens is located at the midpoint of [a, b]. According to the principle of lens imaging and the triangle similarity principle:

((a + b)/2 − x) / (x* − (a + b)/2) = h / h* = k

where k indicates the focusing ability of the lens, that is, the imaging size. Transforming the above formula, we get

x* = (a + b)/2 + (a + b)/(2k) − x/k

When k = 1, this can be simplified to

x* = a + b − x

This is the general reverse learning strategy, which can thus be seen as a special case of lens imaging. In the general reverse learning strategy, k is fixed, so the obtained individuals are also fixed. Lens learning can change the position of individuals by adjusting k, further enhancing the diversity of the group; generally, k takes a constant not equal to 1. This paper proposes a strategy in which k increases linearly with the number of iterations, that is,

k = a + t/M

where a is a small constant that prevents the imaging of the earlier iterations from being too large, taken as 0.1 in this paper. Early on, k is small and the image is large; in the later stage, k is near 1 and the image is slightly smaller, which helps convergence. At the same time, lens imaging is extended to each dimension, and lens imaging reverse learning is performed dimension by dimension:

x*_j = (a_j + b_j)/2 + (a_j + b_j)/(2k) − x_j/k

where j is the current dimension, a_j is the lower bound of the j-th dimension, and b_j is the upper bound of the j-th dimension. At the same time, this paper adopts a dynamic boundary:

a_j = min(x_j),  b_j = max(x_j)

where min(x_j) is the minimum value of the j-th dimension over all individuals, and max(x_j) is the maximum value of the j-th dimension over all individuals.
Because a_j and b_j do not represent the boundary of the whole search space, an image that exceeds the boundary [a_j, b_j] may still lie inside the whole search space. Therefore, when k is small in the early period, the image will exceed the boundary of the current j-th dimension, which helps to expand the search range, reduces the possibility of premature stagnation in the early stage, and helps to jump out of local optima. Finally, the greedy strategy is adopted: if the fitness value of the reverse solution is smaller, i.e., better than that of the original solution, the solution is updated and applied in the algorithm as follows:
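The dimension by dimension update with dynamic bounds and greedy acceptance can be sketched as follows (a minimal illustration; the linear schedule k = a_const + t/M follows the description of k increasing linearly from a small constant of 0.1):

```python
import numpy as np

def lens_opposition(pop, x, t, M, a_const=0.1):
    """Dimension-by-dimension lens imaging reverse solution of individual x.
    Dynamic bounds a_j, b_j are the per-dimension min/max of the population;
    k = a_const + t/M increases linearly with the iteration count t."""
    k = a_const + t / M
    a = pop.min(axis=0)                  # dynamic lower bound per dimension
    b = pop.max(axis=0)                  # dynamic upper bound per dimension
    return (a + b) / 2 + (a + b) / (2 * k) - x / k

def greedy_update(x, x_star, fitness):
    """Keep the reverse solution only if it improves fitness."""
    return x_star if fitness(x_star) < fitness(x) else x
```

Early on k < 1, so the image is magnified beyond the current per-dimension bounds (escaping local optima); late in the run k is near 1 and the image contracts toward the population, aiding convergence.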

3.3.1. Verify the Ability to Jump Out of Local Optimization

In this paper, the Shekel function mentioned above is selected as an example, and the individual distribution diagrams of the improved algorithm and the original algorithm are drawn to verify the ability of the above strategy to jump out of local optima. The function surface is shown in Figure 2. The population size is 100, and the maximum number of iterations is 20. The individual distributions of the two algorithms are shown in Figures 5 and 6.
Figure 5

Individual distribution of SSA. (a) SSA individual initialization map. (b) Individual distribution of SSA in 20 generations.

Figure 6

Individual distribution of ISSA. (a) ISSA individual initialization map. (b) Individual distribution of ISSA in 20 generations.

As can be seen from Figure 5, most individuals in the original algorithm have local aggregation and fall into local optimization. As can be seen from Figure 6, the improved algorithm has a larger search space, and most individuals converge near the optimal solution, effectively jumping out of the local optimal solution.

3.3.2. Proof of Convergence of Dimension by Dimension Lens Imaging Learning

The proof of convergence of the general refraction reverse learning swarm intelligence algorithm is given in reference [37]. Here, its conclusion is used to prove the convergence of the ISSA with dimension by dimension lens imaging learning. It should be pointed out that a proof of convergence does not necessarily ensure that the algorithm converges to the global optimal solution. Since the SSA is also a swarm intelligence search algorithm, the following theorem holds:

Theorem 1 .

If the SSA algorithm based on general reverse learning converges, then the ISSA algorithm also converges.

Proof

Let x(t) and x*(t) be the current solution and reverse solution of generation t, let x_j(t) and x*_j(t) be their values in the j-th dimension, respectively, and let the global optimal solution be x_best. Extended to the j-th dimension, the dynamic boundary satisfies a_j(t) = min(x_j(t)) and b_j(t) = max(x_j(t)). In the t-th generation, the current reverse solution generated by the lens imaging strategy is

x*_j(t) = (a_j(t) + b_j(t))/2 + (a_j(t) + b_j(t))/(2k) − x_j(t)/k

When t ⟶ ∞, x_j(t) converges to x_best,j, and the dynamic bounds a_j(t) and b_j(t) also converge to x_best,j, so from the above formula we get

lim x*_j(t) = x_best,j + x_best,j/k − x_best,j/k = x_best,j

Returning to the whole dimension: when x(t) converges to x_best, the reverse solution x*(t) generated by lens learning also converges to x_best. Therefore, if the SSA based on general reverse learning converges, the ISSA also converges.

3.4. Improved Boundary Control

In the standard SSA, when an individual exceeds the boundary during optimization, boundary control is carried out as follows:

X_i = Ub, if X_i > Ub
X_i = Lb, if X_i < Lb

where Ub and Lb are the upper and lower bounds of the search space, respectively. This strategy of moving out-of-range individuals onto the boundary leads to aggregation of individuals at the boundary and reduces the diversity of the population. As can be seen from Table 1, although the improved strategy reduces the number of individuals beyond the boundary, some individuals still exceed it and gather at the boundary. Literature [38] adopts the strategy of replacing individuals beyond the boundary with the current optimal solution, which makes it difficult to get rid of local optima. Therefore, this paper proposes a simple boundary treatment method, namely,

X_i = Lb + rand · (Ub − Lb), if X_i > Ub or X_i < Lb

In this method, individuals beyond the boundary are randomly reassigned within the search space, which makes more effective use of population individuals and increases population diversity compared with the original algorithm.
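The two boundary strategies can be contrasted in a short sketch (function names are ours; the improved version re-samples only the out-of-range components uniformly inside [Lb, Ub]):

```python
import numpy as np

def clamp_boundary(x, lb, ub):
    """Standard SSA handling: clip out-of-range components to the boundary,
    which piles individuals up at Lb and Ub."""
    return np.clip(x, lb, ub)

def random_boundary(x, lb, ub, rng=None):
    """Improved handling: re-sample each out-of-range component uniformly
    inside [Lb, Ub], preserving randomness and population diversity."""
    rng = np.random.default_rng() if rng is None else rng
    x = x.copy()
    out = (x < lb) | (x > ub)
    x[out] = rng.uniform(lb, ub, size=int(out.sum()))
    return x
```

In-range components are left untouched in both cases; only the treatment of violations differs.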

3.5. Improved Sparrow Search Algorithm Flow

In this paper, an improved sparrow search algorithm based on iterative local search is proposed. First, the variable helix factor is used to improve the extensive search of the followers, which reduces the number of individuals beyond the boundary and speeds up convergence in the later stage. Second, the improved iterative local search is applied to the local search of the followers: the initial solution undergoes a local search and then an iterative local search after perturbation, which makes full use of the current position information, prevents premature convergence, and improves the quality and accuracy of the solution. Then, a lens with changing focusing ability is used to carry out dimension-by-dimension lens imaging learning for the scouters, which enlarges the search space and helps the population jump out of local optima. Finally, the boundary control strategy is improved to reduce the aggregation of individuals at the boundary and increase population diversity. Together, these strategies make the optimization more flexible and the population more diverse, strengthen the ability to escape local optima, and balance the global and local search abilities of the algorithm, which is conducive to finding reliable solutions. The specific algorithm flow is shown in Figure 7.
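The flow in Figure 7 can be summarized as a structural sketch. The update rules below (the contraction for producers, the step toward the best for followers, the Gaussian perturbation for the local search, and the simple opposition jump for scouters) are simplified stand-ins, not the paper's exact equations; only the overall producer/follower/scouter structure, the greedy acceptance, and the random-reset boundary control follow the description above:

```python
import random

def issa(f, dim, lb, ub, pop=30, iters=100, pd=0.2, sd=0.2):
    """Structural sketch of the ISSA main loop (simplified stand-in rules)."""
    X = [[lb + random.random() * (ub - lb) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=f)[:]
    for t in range(1, iters + 1):
        X.sort(key=f)
        n_prod = max(1, int(pd * pop))
        for i, x in enumerate(X):
            if i < n_prod:   # producers: contract the current position over time
                x[:] = [xi * (1.0 - random.random() * t / iters) for xi in x]
            else:            # followers: step toward the current best position
                x[:] = [bi + random.uniform(-1.0, 1.0) * (xi - bi)
                        for xi, bi in zip(x, best)]
        # iterative local search: perturb the best, keep only improvements (greedy)
        cand = [bi + random.gauss(0.0, 0.1) for bi in best]
        if f(cand) < f(best):
            best = cand
        # scouters: opposition-style jump for an sd-fraction of the population
        for x in random.sample(X, max(1, int(sd * pop))):
            x[:] = [(lb + ub) - xi for xi in x]
        # improved boundary control: random reset instead of clamping
        for x in X:
            x[:] = [xi if lb <= xi <= ub else lb + random.random() * (ub - lb)
                    for xi in x]
        best = min(X + [best], key=f)[:]
    return best, f(best)

pos, val = issa(lambda x: sum(xi * xi for xi in x), dim=5, lb=-100.0, ub=100.0)
```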
Figure 7

Algorithm flow chart.

3.6. Time Complexity Analysis

Time complexity is an important index for measuring the performance of an algorithm, as it bounds its running time. Assuming the population size is P, the maximum number of iterations is M, and the dimension is D, the time complexity of the sparrow search algorithm is O(P·M·D). From a macro point of view, the improved sparrow search algorithm does not change the structure or the number of loops of the algorithm, so its time complexity is also O(P·M·D), consistent with the original algorithm. From a micro point of view, the greedy strategy adopted in the iterative local search and the dimension-by-dimension lens learning increases the computation for some followers and all scouters to a certain extent, but the improvements do not raise the order of magnitude of the algorithm, so the time complexity remains O(P·M·D).

4. Benchmark Function Test

To verify the performance of ISSA, this paper selects 23 common basic test functions and compares ISSA with nine algorithms: PSO, SCA, GWO, WOA, MWOA, SSA, BSSA, CSSA, and LSSA. The parameter settings are shown in Table 3, and the test function information is shown in Table 4. F1–F7 are high-dimensional single-peak benchmark functions, F8–F13 are high-dimensional multipeak benchmark functions, and F14–F23 are low-dimensional multipeak benchmark functions. F1–F13 are also tested in 100 dimensions to verify the performance of the algorithm in higher dimensions. For fairness, the population size and maximum number of iterations of each algorithm are 30 and 500, respectively, and each algorithm is run independently 30 times to calculate its best value (Best), worst value (Worst), average value (Ave), and standard deviation (Std); the optimal value of each index is shown in bold. Each algorithm is then ranked according to its average value on each function; when the average values of two algorithms are equal, their standard deviations are compared. Simulations are performed in Matlab 2016a on Windows 10, on an AMD Ryzen 7 4800U with Radeon Graphics @ 1.80 GHz with 16 GB RAM.
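The reporting and ranking rule described above can be expressed compactly. A minimal sketch, assuming the population standard deviation (the paper does not state whether the sample or population form is used) and sharing a rank between algorithms whose (Ave, Std) pairs tie exactly:

```python
import statistics

def summarize(runs):
    """Best, worst, average, and standard deviation over independent runs,
    the four indexes reported for each algorithm on each function."""
    return {"best": min(runs), "worst": max(runs),
            "ave": statistics.mean(runs), "std": statistics.pstdev(runs)}

def rank_algorithms(results):
    """Rank algorithms on one function by average value, breaking ties by
    standard deviation; tied (ave, std) pairs share the same rank."""
    stats = {name: summarize(runs) for name, runs in results.items()}
    key = lambda n: (stats[n]["ave"], stats[n]["std"])
    order = sorted(stats, key=key)
    ranks = {}
    for i, name in enumerate(order, start=1):
        if i > 1 and key(name) == key(order[i - 2]):
            ranks[name] = ranks[order[i - 2]]   # tie: share the better rank
        else:
            ranks[name] = i
    return ranks
```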
Table 3

Parameter.

Algorithm | Parameters
PSO | c1 = 2, c2 = 2, Wmin = 0.2, Wmax = 0.9
SCA | a = 2
GWO | a = 2 ⟶ 0
WOA | b = 1
MWOA | b = 1, n = 12000
SSA | ST = 0.8, PD = 0.2, SD = 0.2
BSSA | ST = 0.8, PD = 0.2, SD = 0.2
CSSA | ST = 0.8, PD = 0.2, SD = 0.2
LSSA | ST = 0.8, PD = 0.2, SD = 0.2
ISSA | ST = 0.8, PD = 0.2, SD = 0.2, a = 0.1
Table 4

Test function.

Function | Dimensions | Interval | Min
F1(x) = Σ_{i=1}^{n} x_i² | 30/100 | [−100, 100] | 0
F2(x) = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i| | 30/100 | [−100, 100] | 0
F3(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)² | 30/100 | [−100, 100] | 0
F4(x) = max_i{|x_i|, 1 ≤ i ≤ n} | 30/100 | [−100, 100] | 0
F5(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i²)² + (x_i − 1)²] | 30/100 | [−30, 30] | 0
F6(x) = Σ_{i=1}^{n} ([x_i + 0.5])² | 30/100 | [−100, 100] | 0
F7(x) = Σ_{i=1}^{n} i·x_i⁴ + random[0, 1) | 30/100 | [−1.28, 1.28] | 0
F8(x) = Σ_{i=1}^{n} −x_i·sin(√|x_i|) | 30/100 | [−500, 500] | −418.98n
F9(x) = Σ_{i=1}^{n} [x_i² − 10cos(2πx_i) + 10] | 30/100 | [−5.12, 5.12] | 0
F10(x) = −20exp(−0.2√((1/n)Σ_{i=1}^{n} x_i²)) − exp((1/n)Σ_{i=1}^{n} cos(2πx_i)) + 20 + e | 30/100 | [−32, 32] | 0
F11(x) = (1/4000)Σ_{i=1}^{n} x_i² − Π_{i=1}^{n} cos(x_i/√i) + 1 | 30/100 | [−600, 600] | 0
F12(x) = (π/n){10sin²(πy_1) + Σ_{i=1}^{n−1} (y_i − 1)²[1 + 10sin²(πy_{i+1})] + (y_n − 1)²} + Σ_{i=1}^{n} u(x_i, 10, 100, 4) | 30/100 | [−50, 50] | 0
  with y_i = 1 + (x_i + 1)/4 and
  u(x_i, a, k, m) = k(x_i − a)^m if x_i > a; 0 if −a ≤ x_i ≤ a; k(−x_i − a)^m if x_i < −a
F13(x) = 0.1{sin²(3πx_1) + Σ_{i=1}^{n−1} (x_i − 1)²[1 + sin²(3πx_{i+1})] + (x_n − 1)²[1 + sin²(2πx_n)]} + Σ_{i=1}^{n} u(x_i, 5, 100, 4) | 30/100 | [−50, 50] | 0
F14(x) = ((1/500) + Σ_{j=1}^{25} 1/(j + Σ_{i=1}^{2} (x_i − a_{ij})⁶))⁻¹ | 2 | [−65.536, 65.536] | 0.998
F15(x) = Σ_{i=1}^{11} [a_i − x_1(b_i² + b_i·x_2)/(b_i² + b_i·x_3 + x_4)]² | 4 | [−5, 5] | 0.0003
F16(x) = 4x_1² − 2.1x_1⁴ + (1/3)x_1⁶ + x_1x_2 − 4x_2² + 4x_2⁴ | 2 | [−5, 5] | −1.032
F17(x) = (x_2 − (5.1/4π²)x_1² + (5/π)x_1 − 6)² + 10(1 − 1/8π)cos x_1 + 10 | 2 | [−5, 5] | 0.3979
F18(x) = [1 + (x_1 + x_2 + 1)²(19 − 14x_1 + 3x_1² − 14x_2 + 6x_1x_2 + 3x_2²)] × [30 + (2x_1 − 3x_2)²(18 − 32x_1 + 12x_1² + 48x_2 − 36x_1x_2 + 27x_2²)] | 2 | [−2, 2] | 3
F19(x) = −Σ_{i=1}^{4} c_i·exp(−Σ_{j=1}^{3} a_{ij}(x_j − p_{ij})²) | 3 | [0, 1] | −3.863
F20(x) = −Σ_{i=1}^{4} c_i·exp(−Σ_{j=1}^{6} a_{ij}(x_j − p_{ij})²) | 6 | [0, 1] | −3.32
F21(x) = −Σ_{i=1}^{5} [(X − a_i)(X − a_i)ᵀ + c_i]⁻¹ | 4 | [0, 10] | −10.1532
F22(x) = −Σ_{i=1}^{7} [(X − a_i)(X − a_i)ᵀ + c_i]⁻¹ | 4 | [0, 10] | −10.4029
F23(x) = −Σ_{i=1}^{10} [(X − a_i)(X − a_i)ᵀ + c_i]⁻¹ | 4 | [0, 10] | −10.5364
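As a concrete illustration of how these benchmarks are evaluated, here are three of the functions above in Python (F1, F9, and F10, using the standard formulas; each has a global minimum of 0 at the origin):

```python
import math

def f1_sphere(x):
    """F1 (sphere): sum of squares."""
    return sum(xi * xi for xi in x)

def f9_rastrigin(x):
    """F9 (Rastrigin): highly multimodal, with a regular grid of local minima."""
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def f10_ackley(x):
    """F10 (Ackley): nearly flat outer region with a deep central funnel."""
    n = len(x)
    sq = sum(xi * xi for xi in x) / n
    cs = sum(math.cos(2.0 * math.pi * xi) for xi in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(sq)) - math.exp(cs) + 20.0 + math.e
```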
As Tables 5 and 6 show, ISSA ranks first on most functions, and its average rank is better than that of the other algorithms. ISSA finds the theoretical optimal value on all functions except F7, F10, F12, F13, and F15, and on those five functions it still finds the best value among all the algorithms, demonstrating a strong ability to locate the optimal solution. On F5 only ISSA finds the theoretical optimal value, and on F6 only SSA and ISSA find the optimal value. On F12 and F13 the optimization accuracy of ISSA is many orders of magnitude higher than that of the other algorithms. When the dimension increases to 100, the performance of ISSA remains stable, while the optimization ability of PSO, SCA, WOA, GWO, and SSA is greatly affected. It is worth noting that on F6 and F12, ISSA surpasses SSA and its improved variants only in the best value, while on the remaining indicators it is still better than the other swarm intelligence algorithms. On F1–F13, ISSA finds solutions better than or equal to those of SSA, indicating that the improvements do not weaken the optimization ability of the algorithm. On the fixed-dimension functions F14–F23, ISSA shows good stability and the ability to jump out of local optima, finding a solution close to the theoretical optimum in almost every run.
Table 5

Comparison table of the optimization effect of each algorithm (30 dimensions and fixed dimensions).

F | Index | PSO | SCA | GWO | WOA | MWOA | SSA | BSSA | CSSA | LSSA | ISSA
F1Best3.039E − 082.140E − 045.309E − 403.217E − 1062.455E – 106 0 5.308E − 269 0 0 0
Worst2.030E − 061.076E + 019.250E − 383.331E − 941.392E − 95 0 9.853E − 249 0 0 0
Ave2.356E − 078.143E − 011.680E − 381.167E − 951.526E − 96 0 3.337E − 250 0 0 0
Std3.697E − 072.128E + 002.379E − 386.075E − 953.710E − 96 0 0 0 0 0
Rank | 9 | 10 | 8 | 7 | 6 | 1 | 5 | 1 | 1 | 1
F2Best5.760E − 054.632E − 053.354E − 232.808E − 631.441E − 62 0 1.827E − 136 0 2.123E − 263 0
Worst2.620E − 031.803E − 025.443E − 227.961E − 573.663E − 555.557E − 2311.955E − 126 0 2.174E − 169 0
Ave4.306E − 044.924E − 031.410E − 221.014E − 571.594E − 562.250E − 2321.332E − 127 0 7.253E − 171 0
Std4.897E − 044.613E − 031.008E − 222.073E − 576.709E − 56 0 4.462E − 127 0 0 0
Rank | 9 | 10 | 8 | 6 | 7 | 3 | 5 | 1 | 4 | 1
F3Best4.148E + 017.288E + 021.144E − 071.056E + 042.370E + 04 0 6.239E − 263 0 0 0
Worst1.465E + 021.962E + 043.338E − 035.602E + 046.525E + 046.949E − 593.651E − 232 0 1.751E − 244 0
Ave9.521E + 017.423E + 031.535E − 043.870E + 044.355E + 042.316E − 601.217E − 233 0 7.297E − 246 0
Std3.092E + 014.667E + 036.126E − 041.181E + 041.007E + 041.269E − 59 0 0 0 0
Rank | 7 | 8 | 6 | 9 | 10 | 5 | 4 | 1 | 3 | 1
F4Best6.129E − 011.282E + 019.960E − 083.426E + 004.515E − 022.102E − 2933.783E − 130 0 1.942E − 210 0
Worst1.677E + 006.014E + 015.466E − 068.940E + 018.833E + 018.277E − 366.088E − 1201.338E − 1821.789E − 142 0
Ave1.077E + 003.790E + 011.135E − 064.785E + 014.205E + 012.774E − 374.300E − 1214.720E − 1846.026E − 144 0
Std2.676E − 011.281E + 011.271E − 062.867E + 013.029E + 011.511E − 361.332E − 120 0 3.265E − 143 0
Rank | 7 | 8 | 6 | 10 | 9 | 5 | 4 | 2 | 3 | 1
F5Best2.564E + 014.001E + 012.581E + 012.707E + 012.701E + 012.124E − 142.502E − 081.176E − 098.738E − 10 0
Worst1.817E + 022.485E + 052.874E + 012.877E + 012.876E + 013.498E − 052.572E − 045.266E − 041.097E − 03 0
Ave8.783E + 012.463E + 042.704E + 012.794E + 012.795E + 014.175E − 065.466E − 052.864E − 055.689E − 05 0
Std4.634E + 014.827E + 046.734E − 014.754E − 014.112E − 019.187E − 067.765E − 051.005E − 042.064E − 04 0
Rank | 9 | 10 | 6 | 7 | 8 | 2 | 4 | 3 | 5 | 1
F6Best7.636E − 065.178E + 006.012E − 074.835E − 026.880E − 02 0 3.524E − 111.360E − 123.385E − 10 0
Worst1.739E − 035.039E + 011.255E + 009.709E − 011.047E + 007.534E − 061.103E − 07 5.188E − 08 3.010E − 077.059E − 04
Ave2.531E − 041.420E + 017.218E − 014.287E − 013.361E − 013.019E − 071.788E − 08 6.178E − 09 3.970E − 081.409E − 04
Std3.327E − 041.305E + 013.225E − 012.176E − 012.303E − 011.372E − 062.881E − 08 1.170E − 08 7.458E − 081.965E − 04
Rank | 6 | 10 | 9 | 8 | 7 | 4 | 2 | 1 | 3 | 5
F7Best7.982E − 021.117E − 023.244E − 045.252E − 055.025E − 051.287E − 052.264E − 051.056E − 051.083E − 05 8.231E − 06
Worst3.788E − 012.485E − 014.318E − 031.108E − 021.460E − 021.062E − 039.727E − 04 4.982E − 04 8.427E − 045.115E − 04
Ave1.778E − 018.566E − 021.795E − 032.821E − 033.241E − 032.440E − 042.722E − 041.632E − 042.355E − 04 1.506E − 04
Std6.799E − 026.210E − 021.129E − 032.905E − 033.662E − 032.364E − 042.438E − 041.211E − 041.908E − 04 1.196E − 04
Rank | 10 | 9 | 6 | 7 | 8 | 4 | 5 | 2 | 3 | 1
F8Best−8.128E + 03−4.963E + 03−7.948E + 03−1.257E + 04−1.257E + 04 −1.257E + 04 −1.030E + 04−1.257E + 04−1.257E + 04 −1.257E + 04
Worst−3.967E + 03−3.578E + 03−3.642E + 03−8.304E + 03−7.837E + 03−7.673E + 03−7.671E + 03−8.952E + 03−1.033E + 04 −1.083E + 04
Ave−6.667E + 03−4.023E + 03−6.361E + 03−1.153E + 04−1.142E + 04−1.060E + 04−8.891E + 03−1.128E + 04−1.182E + 04 −1.245E + 04
Std9.060E + 023.243E + 027.713E + 021.520E + 031.230E + 031.948E + 035.289E + 028.299E + 026.529E + 02 3.565E + 02
Rank | 8 | 10 | 9 | 3 | 4 | 6 | 7 | 5 | 2 | 1
F9Best2.609E + 018.825E − 01 0 0 0 0 0 0 0 0
Worst8.661E + 012.056E + 021.045E + 01 0 1.137E − 13 0 0 0 0 0
Ave5.757E + 013.868E + 011.057E + 00 0 3.790E − 15 0 0 0 0 0
Std1.410E + 014.056E + 012.499E + 00 0 2.076E − 14 0 0 0 0 0
Rank | 10 | 9 | 8 | 1 | 7 | 1 | 1 | 1 | 1 | 1
F10Best2.946E − 032.093E − 027.905E − 14 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16
Worst1.341E + 002.035E + 012.780E − 137.994E − 157.994E − 15 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16
Ave2.508E − 011.104E + 011.554E − 133.849E − 155.033E − 15 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16
Std4.770E − 019.347E + 004.636E − 142.483E − 152.653E − 15 0 0 0 0 0
Rank | 9 | 10 | 8 | 6 | 7 | 1 | 1 | 1 | 1 | 1
F11Best1.262E − 064.291E − 01 0 0 0 0 0 0 0 0
Worst3.694E − 022.192E + 002.243E − 022.284E − 01 0 0 0 0 0 0
Ave9.120E − 031.013E + 004.558E − 037.613E − 03 0 0 0 0 0 0
Std1.036E − 022.950E − 017.488E − 034.170E − 02 0 0 0 0 0 0
Rank | 9 | 10 | 7 | 8 | 1 | 1 | 1 | 1 | 1 | 1
F12Best4.722E − 081.373E + 006.907E − 034.668E − 032.905E − 035.172E − 193.481E − 134.456E − 132.676E − 09 1.571E − 32
Worst1.037E − 013.061E + 061.258E − 013.701E − 012.246E − 011.549E − 078.831E − 09 3.088E − 09 6.638E − 079.777E − 05
Ave6.914E − 031.092E + 054.088E − 023.888E − 022.754E − 027.364E − 098.349E − 10 2.263E − 10 1.373E − 072.558E − 05
Std2.630E − 025.587E + 052.534E − 026.678E − 023.888E − 022.905E − 081.843E − 09 5.697E − 10 1.673E − 073.011E − 05
Rank | 6 | 10 | 9 | 8 | 7 | 3 | 2 | 1 | 4 | 5
F13Best4.356E − 065.547E + 001.045E − 012.076E − 011.165E − 011.010E − 211.163E − 112.611E − 161.351E − 15 1.350E − 32
Worst1.152E − 023.410E + 069.183E − 011.069E + 001.303E + 002.333E − 072.900E − 071.083E − 103.592E − 11 1.350E − 32
Ave5.321E − 032.239E + 055.196E − 016.395E − 015.154E − 012.072E − 082.126E − 081.612E − 113.754E − 12 1.350E − 32
Std5.498E − 037.069E + 051.948E − 012.354E − 012.762E − 015.998E − 086.089E − 082.840E − 118.284E − 12 0
Rank | 6 | 10 | 8 | 9 | 7 | 4 | 5 | 3 | 2 | 1
F14Best 9.980E − 01 9.980E − 01 9.980E − 01 9.980E − 01 9.980E − 01 9.980E − 01 9.980E − 01 9.980E − 01 9.980E − 01 9.980E − 01
Worst1.172E + 011.076E + 011.267E + 011.076E + 011.076E + 011.267E + 011.267E + 01 9.980E − 01 1.267E + 011.267E + 01
Ave4.147E + 002.251E + 004.065E + 002.767E + 003.907E + 008.622E + 001.086E + 01 9.980E − 01 1.585E + 001.453E + 00
Std3.028E + 001.888E + 004.163E + 002.953E + 003.791E + 005.438E + 004.143E + 00 4.517E − 16 2.179E + 002.149E + 00
Rank | 8 | 4 | 7 | 5 | 6 | 9 | 10 | 1 | 3 | 2
F15Best3.084E − 044.873E − 043.075E − 043.081E − 043.217E − 04 3.075E − 04 3.075E − 04 3.075E − 04 3.075E − 04 3.075E − 04
Worst1.093E − 031.506E − 032.036E − 025.941E − 032.252E − 033.447E − 046.256E − 043.205E − 04 3.155E − 04 3.319E − 04
Ave8.461E − 041.002E − 034.462E − 031.216E − 036.588E − 043.098E − 043.204E − 04 3.079E − 04 3.080E − 043.100E − 04
Std1.588E − 043.287E − 048.091E − 031.271E − 034.499E − 047.828E − 065.824E − 052.380E − 06 1.552E − 06 6.408E − 06
Rank | 7 | 8 | 10 | 9 | 6 | 3 | 5 | 1 | 2 | 4
F16Best −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00
Worst −1.032E + 00 −1.031E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −2.155E − 01 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00
Ave −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.004E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00 −1.032E + 00
Std 0 7.279E − 05 0 0 0 1.490E − 01 0 0 0 0
Rank | 1 | 9 | 1 | 1 | 1 | 10 | 1 | 1 | 1 | 1
F17Best3.979E − 013.979E − 013.979E − 013.979E − 013.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01
Worst3.979E − 014.108E − 013.985E − 013.980E − 013.980E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01
Ave 3.979E − 01 3.994E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01 3.979E − 01
Std 0 2.363E − 031.095E − 041.569E − 052.801E − 05 0 0 0 0 0
Rank | 1 | 10 | 9 | 7 | 8 | 1 | 1 | 1 | 1 | 1
F18Best 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00
Worst 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 01 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00
Ave 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00 4.800E + 00 3.000E + 00 3.000E + 00 3.000E + 00 3.000E + 00
Std 0 0 0 0 0 6.850E + 00 0 0 0 0
Rank | 1 | 1 | 1 | 1 | 1 | 10 | 1 | 1 | 1 | 1
F19Best −3.863E + 00 −3.862E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00
Worst −3.863E + 00 −3.854E + 00−3.856E + 00−3.861E + 00−3.860E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00
Ave −3.863E + 00 −3.855E + 00−3.862E + 00−3.862E + 00−3.862E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00 −3.863E + 00
Std3.162E − 151.785E − 031.528E − 036.637E − 046.423E − 04 0 0 0 0 0
Rank | 6 | 10 | 7 | 8 | 9 | 1 | 1 | 1 | 1 | 1
F20Best −3.322E + 00 −3.224E + 00 −3.322E + 00 −3.322E + 00 −3.322E + 00 −3.322E + 00 −3.322E + 00 −3.322E + 00 −3.322E + 00 −3.322E + 00
Worst −3.203E + 00 −2.991E + 00−3.138E + 00−3.082E + 00−3.087E + 00 −3.203E + 00 −3.203E + 00 −3.203E + 00 −3.203E + 00 −3.203E + 00
Ave−3.270E + 00−3.078E + 00−3.247E + 00−3.258E + 00−3.243E + 00−3.263E + 00−3.286E + 00−3.263E + 00−3.247E + 00 −3.318E + 00
Std5.993E − 026.050E − 026.344E − 027.915E − 026.898E − 026.047E − 025.542E − 026.047E − 025.828E − 02 2.171E − 02
Rank | 3 | 10 | 7 | 6 | 9 | 4 | 2 | 5 | 8 | 1
F21Best−1.015E + 01−5.819E + 00−1.015E + 01−1.015E + 01−1.015E + 01−1.015E + 01 −1.015E + 01 −1.015E + 01 −1.015E + 01 −1.015E + 01
Worst−2.631E + 00−4.973E − 01−2.683E + 00−4.984E + 00−2.615E + 00−5.055E + 00−9.996E + 00 −1.015E + 01 −1.015E + 01 −1.015E + 01
Ave−7.630E + 00−2.164E + 00−9.088E + 00−8.265E + 00−8.515E + 00−9.303E + 00−1.013E + 01 −1.015E + 01 −1.015E + 01 −1.015E + 01
Std3.218E + 001.734E + 002.466E + 002.490E + 002.753E + 001.932E + 003.978E − 02 0 0 0
Rank | 9 | 10 | 6 | 8 | 7 | 5 | 4 | 1 | 1 | 1
F22Best −6.207E + 00 −1.040E + 01 −1.040E + 01 −1.040E + 01 −1.040E + 01 −1.040E + 01 −1.040E + 01 −1.040E + 01 −1.040E + 01 −1.040E + 01
Worst−9.028E − 01−5.129E + 00−1.838E + 00−1.835E + 00−5.088E + 00−5.088E + 00 −1.040E + 01 −1.040E + 01 −5.088E + 00 −1.040E + 01
Ave−2.986E + 00−1.023E + 01−7.664E + 00−7.782E + 00−8.985E + 00−9.850E + 00 −1.040E + 01 −1.040E + 01 −9.340E + 00 −1.040E + 01
Std1.774E + 009.629E − 013.007E + 003.125E + 002.391E + 001.615E + 00 0 0 2.162E + 00 0
Rank | 10 | 4 | 9 | 8 | 7 | 5 | 1 | 1 | 6 | 1
F23Best −1.054E + 01 −8.411E + 00 −1.054E + 01 −1.054E + 01 −1.054E + 01 −1.054E + 01 −1.054E + 01 −1.054E + 01 −1.054E + 01 −1.054E + 01
Worst−2.422E + 00−9.439E − 01−5.129E + 00−2.422E + 00−2.803E + 00−5.129E + 00−1.026E + 01 −1.054E + 01 −1.054E + 01 −1.054E + 01
Ave−9.395E + 00−3.822E + 00−1.018E + 01−7.221E + 00−8.252E + 00−9.815E + 00−1.051E + 01 −1.054E + 01 −1.054E + 01 −1.054E + 01
Std2.644E + 001.569E + 001.366E + 003.235E + 002.843E + 001.870E + 007.007E − 02 0 0 0
Rank | 7 | 10 | 5 | 9 | 8 | 6 | 4 | 1 | 1 | 1
Average rank | 4.60 | 5.93 | 4.76 | 4.45 | 4.38 | 2.93 | 2.14 | 1.17 | 1.72 | 1.10
Table 6

Comparison table of the optimization effect of each algorithm (100 dimensions).

F | Index | PSO | SCA | GWO | WOA | MWOA | SSA | BSSA | CSSA | LSSA | ISSA
F1Best9.872E + 009.308E + 022.693E − 124.175E − 832.894E − 84 0 2.714E − 255 0 0 0
Worst3.569E + 012.981E + 041.824E − 108.142E − 704.168E − 702.928E − 669.366E − 235 0 1.063E − 272 0
Ave2.206E + 011.131E + 044.086E − 114.068E − 711.464E − 711.010E − 673.248E − 236 0 3.935E − 274 0
Std5.407E + 007.239E + 034.038E − 111.574E − 707.601E − 715.438E − 67 0 0 0 0
Rank | 9 | 10 | 8 | 6 | 5 | 7 | 4 | 1 | 3 | 1
F2Best2.295E + 011.120E − 011.379E − 074.706E − 554.940E − 57 0 1.226E − 127 0 9.989E − 215 0
Worst5.920E + 012.205E + 015.637E − 073.020E − 471.360E − 488.218E − 351.609E − 115 0 4.528E − 140 0
Ave3.901E + 017.362E + 002.368E − 071.053E − 481.017E − 493.229E − 365.982E − 117 0 1.509E − 141 0
Std9.152E + 006.263E + 008.618E − 085.508E − 483.266E − 491.515E − 352.945E − 116 0 8.268E − 141 0
Rank | 10 | 9 | 8 | 6 | 5 | 7 | 4 | 1 | 3 | 1
F3Best1.196E + 041.540E + 052.370E + 014.152E + 055.261E + 05 0 9.631E − 256 0 0 0
Worst2.610E + 044.486E + 053.307E + 031.682E + 061.491E + 065.484E − 511.912E − 231 0 0 0
Ave1.660E + 042.370E + 051.037E + 031.007E + 061.040E + 061.906E − 526.898E − 233 0 0 0
Std3.793E + 036.543E + 048.578E + 023.146E + 052.624E + 051.018E − 51 0 0 0 0
Rank | 7 | 8 | 6 | 9 | 10 | 5 | 4 | 1 | 1 | 1
F4Best9.191E + 008.607E + 017.311E − 012.040E + 011.568E + 002.570E − 2821.681E – 130 0 3.573E − 261 0
Worst1.493E + 019.584E + 018.765E + 009.697E + 019.638E + 017.474E − 324.501E – 118 0 4.421E − 158 0
Ave1.220E + 019.061E + 012.931E + 007.732E + 017.614E + 012.491E − 332.897E – 119 0 1.474E − 159 0
Std1.550E + 002.355E + 001.931E + 002.188E + 012.359E + 011.365E − 329.208E − 119 0 0 0
Rank | 7 | 10 | 6 | 9 | 8 | 5 | 4 | 1 | 3 | 1
F5Best7.371E + 032.331E + 079.657E + 019.745E + 019.752E + 011.531E − 141.382E − 072.260E − 088.400E − 10 0
Worst3.114E + 042.554E + 089.849E + 019.858E + 019.846E + 013.712E − 032.409E − 033.776E − 041.007E − 03 0
Ave1.571E + 041.182E + 089.778E + 019.819E + 019.814E + 011.501E − 043.859E − 048.782E − 052.173E − 04 0
Std5.206E + 036.138E + 075.923E − 012.454E − 012.609E − 016.750E − 046.669E − 041.081E − 042.744E − 04 0
Rank | 9 | 10 | 6 | 8 | 7 | 3 | 5 | 2 | 4 | 1
F6Best1.156E + 012.016E + 037.714E + 002.371E + 001.979E + 00 0 7.437E − 101.845E − 098.530E − 10 0
Worst3.269E + 013.267E + 041.146E + 018.063E + 006.289E + 00 2.874E − 06 1.995E − 054.954E − 061.287E − 047.775E − 03
Ave2.134E + 011.089E + 049.632E + 004.301E + 003.952E + 00 1.845E − 07 1.796E − 066.391E − 077.234E − 061.427E − 03
Std5.340E + 006.941E + 031.016E + 001.423E + 001.118E + 00 5.339E − 07 3.757E − 061.034E − 062.359E − 052.389E − 03
Rank | 9 | 10 | 8 | 7 | 6 | 1 | 3 | 2 | 4 | 5
F7Best1.198E + 034.974E + 012.106E − 032.143E − 041.476E − 042.729E − 059.448E − 061.484E − 059.049E − 06 2.371E − 06
Worst2.001E + 034.241E + 021.629E − 021.567E − 022.330E − 021.299E − 031.096E − 033.918E − 041.621E − 03 3.207E − 04
Ave1.518E + 031.609E + 027.634E − 034.709E − 035.183E − 033.217E − 043.569E − 041.502E − 044.565E − 04 1.394E − 04
Std1.889E + 029.512E + 013.085E − 035.310E − 035.977E − 032.954E − 042.842E − 041.087E − 044.189E − 04 9.063E − 05
Rank | 10 | 9 | 8 | 6 | 7 | 3 | 4 | 2 | 5 | 1
F8Best−1.781E + 04−7.934E + 03−1.999E + 04−4.187E + 04−4.190E + 04 −4.190E + 04 −2.698E + 04 −4.190E + 04 −4.190E + 04 −4.190E + 04
Worst−5.016E + 03−6.076E + 03−5.790E + 03−2.378E + 04−2.746E + 04−2.346E + 04−1.761E + 04−2.955E + 04−2.184E + 04 −3.456E + 04
Ave−1.050E + 04−6.759E + 03−1.623E + 04−3.407E + 04−3.358E + 04−3.150E + 04−2.276E + 04−3.604E + 04−3.379E + 04 −4.050E + 04
Std3.860E + 03 4.827E + 02 2.907E + 035.642E + 035.637E + 036.075E + 032.130E + 034.310E + 037.876E + 032.173E + 03
Rank | 9 | 10 | 8 | 3 | 5 | 6 | 7 | 2 | 4 | 1
F9Best4.779E + 025.265E + 012.724E − 09 0 0 0 0 0 0 0
Worst7.998E + 024.466E + 024.533E + 01 0 1.137E − 13 0 0 0 0 0
Ave6.303E + 022.595E + 023.979E + 00 0 3.790E − 15 0 0 0 0 0
Std8.286E + 011.086E + 028.599E + 00 0 2.076E − 14 0 0 0 0 0
Rank | 10 | 9 | 8 | 1 | 7 | 1 | 1 | 1 | 1 | 1
F10Best3.132E + 009.221E + 003.430E − 07 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16
Worst4.305E + 002.067E + 011.375E − 067.994E − 157.994E − 15 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16 8.882E − 16
Ave3.726E + 001.899E + 016.452E − 074.322E − 154.204E − 15 0 0 0 0 0
Std2.946E − 013.752E + 002.326E − 072.873E − 152.273E − 15 4.012E − 31 4.012E − 31 4.012E − 31 4.012E − 31 4.012E − 31
Rank | 9 | 10 | 8 | 7 | 6 | 1 | 1 | 1 | 1 | 1
F11Best2.761E − 019.224E + 004.273E − 12 0 0 0 0 0 0 0
Worst6.430E − 011.889E + 022.807E − 023.137E − 01 0 0 0 0 0 0
Ave4.252E − 018.541E + 012.511E − 031.046E − 02 0 0 0 0 0 0
Std9.838E − 025.334E + 017.694E − 035.728E − 02 0 0 0 0 0 0
Rank | 9 | 10 | 7 | 8 | 1 | 1 | 1 | 1 | 1 | 1
F12Best9.845E − 017.795E + 071.679E − 011.867E − 021.876E − 022.330E − 201.123E − 122.531E − 123.574E − 13 4.712E − 33
Worst9.589E + 006.179E + 083.925E − 019.377E − 021.440E − 01 8.878E − 09 6.918E − 081.562E − 071.101E − 089.480E − 05
Ave4.297E + 002.889E + 082.624E − 014.903E − 024.792E − 02 9.451E − 10 1.021E − 081.261E − 082.028E − 092.722E − 05
Std1.884E + 001.390E + 086.522E − 021.946E − 022.491E − 02 1.909E − 09 1.627E − 082.920E − 082.960E − 093.211E − 05
Rank | 9 | 10 | 8 | 7 | 6 | 1 | 3 | 4 | 2 | 5
F13Best2.737E + 012.004E + 085.195E + 001.224E + 001.242E + 006.058E − 147.614E − 106.218E − 101.775E − 10 1.350E − 32
Worst9.778E + 011.593E + 096.920E + 005.151E + 005.166E + 004.220E − 073.989E − 061.629E − 066.166E − 06 1.350E − 32
Ave5.972E + 016.758E + 086.397E + 002.787E + 003.138E + 004.654E − 084.157E − 072.250E − 075.104E − 07 1.350E − 32
Std1.998E + 013.405E + 084.285E − 019.742E − 018.932E − 019.147E − 088.210E − 074.018E − 071.269E − 06 0
Rank | 9 | 10 | 8 | 6 | 7 | 2 | 4 | 3 | 5 | 1
Average rank | 8.92 | 9.62 | 7.46 | 6.38 | 6.15 | 3.31 | 3.46 | 1.70 | 2.85 | 1.61
To better describe the optimization ability, the differences, and the convergence speed of the algorithms, Table 7 gives the Wilcoxon rank sum test results of ISSA against the other algorithms, and Figure 8 gives the convergence curves of each algorithm. In addition, to assess the contribution of each of the four strategies, a basic function is selected to test the four components of the improved algorithm, as shown in the first panel of Figure 8. The above results are based on the 23 benchmark functions.
Table 7

Wilcoxon rank sum test results of each algorithm (30 dimensions).

F | PSO | SCA | GWO | WOA | MWOA | SSA | BSSA | CSSA | LSSA
F11.212E − 121.212E − 121.212E − 121.212E − 121.212E − 12NaN1.212E − 12NaNNaN
F21.212E − 121.212E − 121.212E − 121.212E − 121.212E − 124.057E − 031.212E − 12NaN1.212E − 12
F31.212E − 121.212E − 121.212E − 121.212E − 121.212E − 125.772E − 111.212E − 12NaN4.714E − 05
F41.212E − 121.212E − 121.212E − 121.212E − 121.212E − 121.212E − 121.212E − 124.574E − 121.212E − 12
F51.212E − 121.212E − 121.212E − 121.212E − 121.211E − 121.212E − 121.212E − 121.212E − 121.212E − 12
F61.167E − 022.954E − 112.814E − 102.954E − 112.954E − 111.957E − 043.962E − 043.540E − 043.962E − 04
F73.020E − 113.020E − 114.504E − 119.063E − 084.616E − 101.023E − 019.626E − 027.062E − 017.483E − 02
F82.392E − 112.392E − 112.392E − 112.831E − 073.102E − 081.783E − 062.392E − 113.327E − 094.347E − 08
F91.212E − 121.212E − 123.818E − 12NaN3.337E − 01NaNNaNNaNNaN
F101.211E − 121.212E − 121.199E − 129.318E − 081.317E − 09NaNNaNNaNNaN
F111.212E − 121.212E − 126.617E − 043.337E − 01NaNNaNNaNNaNNaN
F121.259E − 012.982E − 112.982E − 112.982E − 112.982E − 119.460E − 069.460E − 069.460E − 062.420E − 05
F131.211E − 121.212E − 121.212E − 121.212E − 121.212E − 121.212E − 121.212E − 121.212E − 121.212E − 12
F144.126E − 073.800E − 091.642E − 052.328E − 043.743E − 055.429E − 081.058E − 101.608E − 014.181E − 01
F155.992E − 112.982E − 118.620E − 091.081E − 104.027E − 119.080E − 031.598E − 013.241E − 063.455E − 03
F161.685E − 143.620E − 131.685E − 141.685E − 141.685E − 142.708E − 141.685E − 141.685E − 141.685E − 14
F17NaN4.566E − 123.337E − 016.519E − 046.500E − 05NaNNaNNaNNaN
F18NaNNaNNaNNaNNaN1.607E − 01NaNNaNNaN
F19NaN1.129E − 122.158E − 021.828E − 095.600E − 10NaNNaNNaNNaN
F206.138E − 016.738E − 112.457E − 021.021E − 014.245E − 041.000E + 001.189E − 015.271E − 053.055E − 01
F211.173E − 051.212E − 121.101E − 021.878E − 095.404E − 11NaNNaNNaNNaN
F224.721E − 031.720E − 121.000E + 001.704E − 121.715E − 122.416E − 041.039E − 103.337E − 013.337E − 01
F232.157E − 021.212E − 121.608E − 011.208E − 121.208E − 122.773E − 035.808E − 09NaNNaN
Figure 8

Convergence diagram of each algorithm.

The four individual strategies, the original SSA, and ISSA are tested on the F1 basic benchmark function, as shown in the first panel of Figure 8. ISSA1, ISSA2, ISSA3, and ISSA4 denote the algorithms obtained by applying each of the four improvement strategies separately. Among them, ISSA2, ISSA3, and ISSA find the theoretical optimal value. Compared with the original SSA, ISSA2 converges significantly faster, reflecting its excellent search ability; ISSA3 converges suddenly to the theoretical optimal value in the middle of the iteration, indicating an excellent ability to jump out of local optima. The results and convergence speed of ISSA1 and ISSA4 improve on the original SSA by dozens of orders of magnitude, a clear if smaller contribution. It can be seen that the improved iterative local search and the dimension-by-dimension lens imaging learning play the more critical roles in improving the algorithm. Combining the four strategies yields an ISSA with faster convergence, more accurate results, and greater stability. In the Wilcoxon rank sum test, a p value less than 0.05 indicates a significant difference between two algorithms. In Table 7, NaN indicates that the two results are equivalent and cannot be compared. Most values are below 0.05, showing that the optimization performance of ISSA differs significantly from the other algorithms; the difference between ISSA and CSSA is the smallest, followed by LSSA, BSSA, and SSA. As shown in Figure 8, ISSA exhibits excellent optimization speed and convergence accuracy and converges fastest on most functions: it converges faster on the single-peak benchmark functions and better resists the attraction of local optima on the multipeak benchmark functions.
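The significance rule used with Table 7 (p < 0.05) can be reproduced with a rank-sum test. This is a minimal self-contained sketch using the normal approximation without tie or continuity corrections, which is adequate for 30 runs per algorithm; in practice a library implementation such as scipy.stats.ranksums is preferable:

```python
import math

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum test via the normal approximation."""
    combined = sorted((v, 0 if i < len(x) else 1)
                      for i, v in enumerate(list(x) + list(y)))
    vals = [v for v, _ in combined]
    # assign average ranks to tied values
    ranks = {}
    i = 0
    while i < len(vals):
        j = i
        while j + 1 < len(vals) and vals[j + 1] == vals[i]:
            j += 1
        for k in range(i, j + 1):
            ranks[k] = (i + j) / 2 + 1
        i = j + 1
    w = sum(ranks[idx] for idx, (_, g) in enumerate(combined) if g == 0)
    n1, n2 = len(x), len(y)
    mu = n1 * (n1 + n2 + 1) / 2                      # mean of the rank sum
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)  # its standard deviation
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
```

With two clearly separated result samples the p-value is far below 0.05; with identical samples it is 1, corresponding to the NaN ("equivalent, cannot be compared") entries in Table 7.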

5. CEC 2017 Function Test

To better illustrate the generality and effectiveness of the algorithm, and to show that ISSA is not applicable only when the optimal value is 0, the algorithm is tested on the CEC 2017 test functions. The number of function evaluations is 10000 × dim, the population size is 30, the dimension is 30, SD is set to 0.6, and the other parameters remain unchanged. Each algorithm is run independently 30 times, and five indexes are calculated from the results: the best value (Best), the worst value (Worst), the median (Med), the average value (Ave), and the standard deviation (Std). Each algorithm is then ranked according to its average value on each function, and the optimal value of each index is shown in bold. Because of the known defects of the F2 function, it is not tested in this paper. The specific test results are shown in Table 8. In addition, six functions, F4, F7, F14, F17, F24, and F27, are selected to draw box plots of the results, as shown in Figure 9.
Table 8

Test results of each algorithm in CEC 2017.

F | Index | PSO | SCA | GWO | WOA | MWOA | SSA | BSSA | CSSA | LSSA | ISSA
F1Best1.52E + 029.06E + 095.10E + 084.33E + 064.28E + 091.09E + 021.65E + 071.23E + 021.06E + 021.00E + 02
Worst9.74E + 031.84E + 103.22E + 091.17E + 081.04E + 101.99E + 041.66E + 081.20E + 042.08E + 04 9.24E + 03
Med1.68E + 031.19E + 101.05E + 092.06E + 076.36E + 096.79E + 028.20E + 071.48E + 035.36E + 03 5.23E + 02
Ave2.53E + 031.20E + 101.35E + 092.99E + 076.43E + 093.47E + 039.87E + 073.18E + 038.01E + 03 1.78E + 03
Std2.61E + 032.06E + 097.53E + 082.74E + 071.21E + 095.82E + 034.14E + 073.29E + 037.96E + 03 2.32E + 03
Rank | 2 | 10 | 8 | 6 | 9 | 4 | 7 | 3 | 5 | 1
F3Best1.95E + 032.30E + 042.87E + 048.52E + 044.82E + 041.26E + 044.67E + 045.46E + 032.15E + 04 3.07E + 02
Worst1.19E + 044.89E + 046.27E + 043.02E + 056.49E + 042.98E + 046.87E + 041.44E + 044.99E + 04 7.66E + 02
Med5.12E + 033.46E + 045.06E + 041.89E + 055.56E + 041.67E + 045.67E + 041.02E + 043.30E + 04 4.90E + 02
Ave5.49E + 033.39E + 044.92E + 042.00E + 055.53E + 041.78E + 045.69E + 049.94E + 033.21E + 04 4.64E + 02
Std1.58E + 038.48E + 031.15E + 047.55E + 043.44E + 034.49E + 038.03E + 032.35E + 037.20E + 03 1.20E + 02
Rank | 2 | 6 | 7 | 10 | 8 | 4 | 9 | 3 | 5 | 1
F4Best4.18E + 028.86E + 025.20E + 025.03E + 021.23E + 034.59E + 025.10E + 024.59E + 024.08E + 02 4.04E + 02
Worst 4.82E + 02 1.89E + 036.49E + 026.48E + 022.73E + 035.38E + 026.93E + 025.35E + 025.40E + 026.09E + 02
Med4.79E + 021.53E + 036.18E + 025.58E + 022.13E + 035.10E + 025.80E + 024.88E + 02 4.72E + 02 4.77E + 02
Ave 4.63E + 02 1.48E + 036.09E + 025.61E + 022.08E + 035.08E + 025.78E + 024.83E + 024.82E + 024.66E + 02
Std2.67E + 012.49E + 023.67E + 014.26E + 014.82E + 021.79E + 015.62E + 01 1.56E + 01 2.91E + 014.15E + 01
Rank | 1 | 9 | 8 | 6 | 10 | 5 | 7 | 4 | 3 | 2
F5Best6.31E + 027.39E + 02 5.53E + 02 7.05E + 027.99E + 027.34E + 025.91E + 026.56E + 026.21E + 026.49E + 02
Worst7.30E + 028.27E + 027.47E + 029.26E + 029.37E + 028.20E + 02 6.70E + 02 8.07E + 028.21E + 028.16E + 02
Med7.00E + 027.62E + 02 5.94E + 02 7.57E + 028.49E + 028.15E + 026.45E + 027.40E + 027.18E + 027.80E + 02
Ave6.90E + 027.68E + 02 6.03E + 02 7.80E + 028.69E + 028.08E + 026.34E + 027.40E + 027.32E + 027.62E + 02
Std3.06E + 01 1.73E + 01 3.50E + 017.36E + 014.91E + 011.79E + 012.40E + 014.57E + 015.45E + 013.41E + 01
Rank | 3 | 7 | 1 | 8 | 10 | 9 | 2 | 5 | 4 | 6
F6Best6.34E + 026.42E + 026.38E + 026.61E + 026.65E + 026.57E + 026.08E + 026.21E + 026.22E + 02 6.02E + 02
Worst6.57E + 026.61E + 026.68E + 026.83E + 026.88E + 026.78E + 026.65E + 026.57E + 026.58E + 02 6.12E + 02
Med6.45E + 026.50E + 026.57E + 026.70E + 026.76E + 026.64E + 026.29E + 026.27E + 026.41E + 02 6.11E + 02
Ave6.46E + 026.49E + 026.56E + 026.72E + 026.74E + 026.64E + 026.38E + 026.33E + 026.41E + 02 6.08E + 02
Std7.80E + 005.44E + 006.90E + 008.86E + 005.46E + 006.39E + 002.16E + 011.08E + 011.02E + 01 4.37E + 00
Rank | 5 | 6 | 7 | 9 | 10 | 8 | 3 | 2 | 4 | 1
F7Best1.07E + 031.08E + 038.20E + 021.06E + 031.19E + 031.18E + 038.74E + 029.28E + 028.64E + 02 7.99E + 02
Worst1.37E + 031.18E + 039.91E + 021.44E + 031.41E + 031.35E + 031.33E + 031.34E + 031.33E + 03 8.76E + 02
Med1.32E + 031.10E + 038.91E + 021.24E + 031.27E + 031.33E + 039.82E + 021.17E + 031.14E + 03 8.51E + 02
Ave1.29E + 031.11E + 038.89E + 021.28E + 031.29E + 031.32E + 031.00E + 031.17E + 031.14E + 03 8.46E + 02
Std6.72E + 012.99E + 014.61E + 011.15E + 027.20E + 013.85E + 018.42E + 011.09E + 021.44E + 02 1.66E + 01
Rank84279103651
F8Best8.74E + 021.02E + 039.08E + 029.42E + 021.04E + 039.38E + 029.15E + 028.60E + 028.84E + 02 8.55E + 02
Worst9.62E + 021.09E + 031.01E + 031.08E + 031.11E + 031.05E + 039.79E + 029.82E + 021.01E + 03 9.14E + 02
Med9.15E + 021.04E + 039.52E + 029.72E + 021.09E + 039.92E + 029.59E + 029.43E + 029.66E + 02 8.80E + 02
Ave9.21E + 021.05E + 039.57E + 029.89E + 021.08E + 039.94E + 029.53E + 029.48E + 029.58E + 02 8.82E + 02
Std2.58E + 012.28E + 012.31E + 013.69E + 011.63E + 012.56E + 011.44E + 012.79E + 013.98E + 01 1.37E + 01
Rank29571084361
F9Best2.45E + 033.74E + 033.57E + 033.60E + 036.34E + 034.01E + 032.04E + 033.08E + 035.06E + 03 1.06E + 03
Worst5.01E + 037.85E + 035.79E + 031.35E + 041.15E + 045.58E + 039.94E + 035.44E + 035.56E + 03 1.85E + 03
Med3.99E + 034.43E + 034.97E + 036.12E + 038.81E + 035.41E + 035.85E + 035.35E + 035.39E + 03 1.44E + 03
Ave3.84E + 034.95E + 034.95E + 036.82E + 038.73E + 035.30E + 036.02E + 034.88E + 035.37E + 03 1.42E + 03
Std5.34E + 021.09E + 034.57E + 021.82E + 031.34E + 033.75E + 022.32E + 038.19E + 02 1.23E + 02 1.87E + 02
Rank24591068371
F10Best3.71E + 037.33E + 03 3.15E + 03 5.34E + 035.74E + 034.57E + 034.82E + 034.09E + 033.91E + 034.06E + 03
Worst5.45E + 038.48E + 03 5.06E + 03 8.13E + 038.59E + 039.10E + 039.80E + 036.33E + 036.53E + 038.89E + 03
Med4.27E + 038.17E + 03 4.20E + 03 7.29E + 037.34E + 035.95E + 036.57E + 035.39E + 034.58E + 036.24E + 03
Ave4.47E + 038.13E + 03 4.05E + 03 7.00E + 037.54E + 036.43E + 036.71E + 035.42E + 034.83E + 036.09E + 03
Std5.13E + 02 2.79E + 02 6.38E + 029.77E + 026.40E + 021.63E + 031.60E + 035.13E + 026.99E + 021.09E + 03
Rank21018967435
F11Best1.16E + 031.71E + 031.34E + 031.47E + 032.57E + 031.19E + 031.31E + 031.16E + 031.25E + 03 1.14E + 03
Worst 1.25E + 03 3.73E + 032.12E + 032.49E + 035.28E + 031.39E + 031.88E + 031.32E + 031.59E + 031.29E + 03
Med1.19E + 031.90E + 031.44E + 032.30E + 034.37E + 031.23E + 031.43E + 031.26E + 031.44E + 03 1.19E + 03
Ave 1.19E + 03 2.02E + 031.59E + 032.12E + 034.23E + 031.25E + 031.45E + 031.25E + 031.43E + 031.19E + 03
Std 2.46E + 01 4.01E + 022.58E + 023.84E + 026.64E + 024.68E + 011.36E + 024.06E + 019.29E + 013.12E + 01
Rank18791036452
F12Best 2.20E + 04 6.42E + 084.00E + 061.03E + 076.90E + 083.78E + 049.49E + 051.09E + 051.25E + 052.90E + 04
Worst1.25E + 061.99E + 093.22E + 081.08E + 082.59E + 091.13E + 071.22E + 07 6.57E + 05 9.09E + 051.91E + 07
Med3.54E + 058.82E + 083.93E + 076.62E + 079.19E + 082.07E + 062.91E + 06 2.08E + 05 4.45E + 051.54E + 06
Ave4.22E + 051.03E + 096.63E + 076.01E + 071.11E + 092.45E + 064.41E + 06 2.23E + 05 4.57E + 052.11E + 06
Std3.55E + 054.03E + 088.44E + 072.84E + 075.40E + 082.66E + 063.09E + 06 1.11E + 05 2.80E + 053.55E + 06
Rank29871056134
F13Best2.90E + 032.13E + 083.83E + 043.85E + 046.22E + 073.05E + 038.09E + 032.97E + 035.95E + 03 1.68E + 03
Worst5.62E + 046.13E + 081.40E + 082.82E + 054.32E + 087.25E + 044.51E + 067.12E + 045.45E + 04 1.96E + 04
Med 8.39E + 03 3.48E + 081.84E + 051.22E + 051.53E + 081.39E + 042.34E + 049.69E + 032.75E + 049.36E + 03
Ave1.21E + 043.69E + 083.90E + 071.19E + 051.63E + 082.56E + 045.41E + 051.38E + 042.92E + 04 9.11E + 03
Std1.07E + 049.87E + 075.71E + 075.05E + 048.14E + 072.50E + 041.36E + 061.58E + 042.09E + 04 4.37E + 03
Rank21086947351
F14Best4.96E + 034.08E + 042.93E + 031.37E + 041.41E + 065.09E + 037.46E + 031.67E + 032.50E + 03 1.66E + 03
Worst1.12E + 052.71E + 058.84E + 057.05E + 064.40E + 061.64E + 051.79E + 067.66E + 041.09E + 05 5.18E + 04
Med2.71E + 048.97E + 044.63E + 041.11E + 062.68E + 063.07E + 043.35E + 051.39E + 042.86E + 04 5.97E + 03
Ave3.93E + 041.25E + 052.31E + 051.86E + 062.54E + 063.37E + 044.24E + 051.69E + 043.86E + 04 1.52E + 04
Std2.72E + 046.22E + 042.90E + 051.72E + 069.81E + 052.81E + 044.18E + 051.72E + 043.18E + 04 1.71E + 04
Rank56791038241
F15Best1.65E + 036.16E + 051.39E + 042.15E + 043.22E + 062.08E + 032.30E + 031.85E + 031.86E + 03 1.57E + 03
Worst2.12E + 045.07E + 075.42E + 051.83E + 055.30E + 072.29E + 042.04E + 04 1.50E + 04 4.43E + 043.46E + 04
Med4.05E + 035.21E + 061.03E + 054.40E + 047.88E + 065.81E + 036.81E + 033.46E + 032.92E + 04 2.38E + 03
Ave5.39E + 031.11E + 071.12E + 056.36E + 041.36E + 077.88E + 036.92E + 03 4.75E + 03 2.35E + 044.92E + 03
Std5.43E + 031.08E + 071.23E + 054.40E + 041.44E + 076.35E + 033.96E + 03 3.14E + 03 1.80E + 046.67E + 03
Rank39871054162
F16Best2.24E + 033.37E + 031.98E + 032.93E + 033.86E + 032.83E + 032.19E + 032.42E + 032.33E + 03 1.88E + 03
Worst 3.06E + 03 4.09E + 033.12E + 034.56E + 035.69E + 038.55E + 033.60E + 033.30E + 033.33E + 035.57E + 03
Med2.85E + 033.66E + 03 2.77E + 03 3.33E + 034.15E + 033.79E + 032.81E + 033.10E + 032.89E + 033.23E + 03
Ave2.80E + 033.68E + 03 2.56E + 03 3.44E + 034.31E + 034.12E + 032.84E + 033.00E + 032.87E + 033.35E + 03
Std2.70E + 02 1.55E + 02 4.02E + 024.03E + 024.33E + 021.23E + 033.20E + 022.71E + 022.51E + 021.09E + 03
Rank28171093546
F17Best1.94E + 032.11E + 032.07E + 032.15E + 032.22E + 031.99E + 031.96E + 032.05E + 032.13E + 03 1.83E + 03
Worst2.75E + 032.69E + 033.81E + 033.22E + 033.15E + 033.36E + 032.92E + 033.01E + 033.02E + 03 2.25E + 03
Med2.32E + 032.30E + 033.08E + 032.56E + 032.95E + 032.68E + 032.40E + 032.44E + 032.65E + 03 1.99E + 03
Ave2.33E + 032.37E + 033.00E + 032.65E + 032.89E + 032.81E + 032.42E + 032.46E + 032.58E + 03 2.01E + 03
Std2.55E + 02 1.44E + 02 4.67E + 022.65E + 022.49E + 023.40E + 022.45E + 022.11E + 022.25E + 021.48E + 02
Rank23107984561
F18Best5.23E + 043.20E + 054.38E + 042.75E + 052.86E + 061.19E + 045.88E + 043.78E + 041.09E + 05 6.74E + 03
Worst5.93E + 056.66E + 061.06E + 078.14E + 062.41E + 071.54E + 067.28E + 064.14E + 051.47E + 06 1.74E + 05
Med3.46E + 051.48E + 064.59E + 052.96E + 068.73E + 067.18E + 045.67E + 058.42E + 042.53E + 05 4.00E + 04
Ave2.74E + 051.83E + 061.55E + 062.81E + 068.77E + 061.80E + 051.79E + 061.43E + 053.56E + 05 7.38E + 04
Std1.84E + 051.65E + 062.78E + 062.01E + 063.77E + 063.80E + 052.81E + 061.22E + 053.37E + 05 6.25E + 04
Rank48691037251
F19Best2.04E + 034.15E + 062.06E + 049.08E + 056.49E + 062.53E + 032.57E + 032.17E + 032.36E + 03 2.02E + 03
Worst2.18E + 043.99E + 074.22E + 061.30E + 076.12E + 071.63E + 042.10E + 04 9.56E + 03 5.64E + 042.64E + 04
Med 3.82E + 03 2.04E + 075.12E + 053.18E + 061.39E + 075.51E + 034.44E + 035.15E + 036.78E + 036.49E + 03
Ave 5.54E + 03 2.38E + 079.64E + 054.52E + 062.01E + 076.91E + 037.75E + 035.63E + 031.09E + 047.79E + 03
Std4.49E + 039.96E + 061.34E + 064.12E + 061.30E + 073.76E + 036.79E + 03 2.43E + 03 1.36E + 045.90E + 03
Rank11078934265
F20Best2.26E + 032.38E + 032.27E + 032.38E + 032.49E + 032.39E + 03 2.14E + 03 2.29E + 032.30E + 032.15E + 03
Worst2.89E + 03 2.73E + 03 3.39E + 033.15E + 032.99E + 033.43E + 033.22E + 032.95E + 032.91E + 032.77E + 03
Med2.44E + 032.58E + 033.04E + 032.80E + 032.78E + 032.75E + 032.66E + 032.54E + 032.71E + 03 2.27E + 03
Ave2.49E + 032.59E + 032.97E + 032.74E + 032.78E + 032.84E + 032.60E + 032.52E + 032.69E + 03 2.30E + 03
Std1.63E + 02 8.77E + 01 2.43E + 022.01E + 021.16E + 022.39E + 021.92E + 021.83E + 021.27E + 021.30E + 02
Rank24107895361
F21Best2.39E + 032.52E + 032.42E + 032.50E + 032.56E + 032.49E + 032.40E + 03 2.20E + 03 2.42E + 032.36E + 03
Worst2.51E + 032.60E + 032.68E + 032.68E + 032.74E + 032.73E + 032.49E + 032.58E + 032.58E + 03 2.46E + 03
Med2.45E + 032.53E + 032.54E + 032.57E + 032.63E + 032.62E + 032.43E + 032.49E + 032.46E + 03 2.39E + 03
Ave2.45E + 032.54E + 032.53E + 032.58E + 032.63E + 032.59E + 032.44E + 032.48E + 032.48E + 03 2.41E + 03
Std2.32E + 01 1.77E + 01 5.36E + 015.01E + 013.40E + 015.86E + 013.25E + 016.21E + 014.72E + 013.81E + 01
Rank37681092451
F22Best 2.30E + 03 3.96E + 032.50E + 032.33E + 033.59E + 032.38E + 032.34E + 03 2.30E + 03 2.30E + 03 2.38E + 03
Worst 6.81E + 03 9.97E + 031.00E + 048.80E + 031.03E + 041.04E + 048.79E + 037.87E + 037.58E + 039.42E + 03
Med5.86E + 039.60E + 034.99E + 037.17E + 034.34E + 037.44E + 032.38E + 03 2.30E + 03 5.73E + 037.31E + 03
Ave4.91E + 039.00E + 034.48E + 036.68E + 036.02E + 037.56E + 03 3.30E + 03 3.34E + 035.46E + 037.27E + 03
Std1.90E + 031.71E + 031.98E + 031.76E + 032.55E + 031.42E + 032.12E + 031.94E + 032.08E + 03 1.30E + 03
Rank41037691258
F23Best 2.70E + 03 2.95E + 032.71E + 032.93E + 032.99E + 032.99E + 032.76E + 032.83E + 032.78E + 032.85E + 03
Worst3.36E + 033.05E + 032.95E + 033.29E + 033.25E + 033.57E + 03 2.91E + 03 3.06E + 032.95E + 033.59E + 03
Med3.15E + 032.97E + 03 2.75E + 03 3.03E + 033.05E + 033.33E + 032.83E + 032.92E + 032.82E + 033.31E + 03
Ave3.16E + 032.98E + 03 2.77E + 03 3.02E + 033.10E + 033.33E + 032.82E + 032.93E + 032.83E + 033.26E + 03
Std1.45E + 02 3.02E + 01 5.08E + 019.83E + 019.88E + 011.35E + 024.96E + 016.17E + 014.42E + 011.70E + 02
Rank85167102439
F24Best3.07E + 033.13E + 033.11E + 033.00E + 033.13E + 033.19E + 032.91E + 032.98E + 032.93E + 03 2.87E + 03
Worst3.40E + 033.20E + 033.77E + 033.36E + 033.31E + 033.69E + 03 3.06E + 03 3.37E + 033.10E + 033.08E + 03
Med3.23E + 033.17E + 033.34E + 033.19E + 033.20E + 033.34E + 033.01E + 033.13E + 033.01E + 03 2.91E + 03
Ave3.24E + 033.17E + 033.40E + 033.17E + 033.21E + 033.33E + 032.99E + 033.14E + 033.01E + 03 2.93E + 03
Std1.33E + 02 1.72E + 01 1.94E + 027.64E + 015.73E + 011.05E + 024.18E + 011.03E + 024.47E + 015.27E + 01
Rank85106792431
F25Best2.88E + 033.11E + 032.92E + 032.91E + 033.30E + 032.88E + 032.90E + 032.88E + 032.88E + 03 2.88E + 03
Worst2.90E + 033.36E + 033.03E + 033.05E + 033.55E + 032.95E + 033.02E + 032.94E + 032.89E + 032.95E + 03
Med 2.88E + 03 3.22E + 032.98E + 032.93E + 033.39E + 032.90E + 032.96E + 032.89E + 032.89E + 032.89E + 03
Ave 2.88E + 03 3.23E + 032.98E + 032.95E + 033.38E + 032.91E + 032.97E + 032.90E + 032.89E + 032.90E + 03
Std3.67E + 006.72E + 013.00E + 013.94E + 015.73E + 012.12E + 013.70E + 011.78E + 011.51E + 002.86E + 01
Rank19861057324
F26Best5.76E + 036.49E + 034.29E + 036.03E + 037.32E + 033.01E + 033.55E + 032.90E + 032.80E + 03 2.80E + 03
Worst8.91E + 037.27E + 03 5.10E + 03 9.96E + 031.04E + 041.21E + 047.60E + 037.55E + 036.22E + 031.09E + 04
Med6.60E + 036.97E + 03 4.63E + 03 7.58E + 038.11E + 039.02E + 035.23E + 035.60E + 035.13E + 036.32E + 03
Ave7.08E + 036.92E + 03 4.64E + 03 7.58E + 038.29E + 038.76E + 035.23E + 035.65E + 034.67E + 036.69E + 03
Std1.06E + 031.75E + 02 1.51E + 02 6.53E + 026.15E + 021.80E + 037.03E + 021.07E + 031.35E + 032.08E + 03
Rank76189103425
F27Best 3.17E + 03 3.34E + 033.22E + 033.27E + 033.38E + 033.33E + 033.22E + 033.23E + 033.22E + 033.20E + 03
Worst3.42E + 033.49E + 033.33E + 033.68E + 034.00E + 034.14E + 033.40E + 033.41E + 033.35E + 03 3.20E + 03
Med 3.19E + 03 3.41E + 033.25E + 033.33E + 033.60E + 033.80E + 033.28E + 033.30E + 033.23E + 033.20E + 03
Ave3.21E + 033.41E + 033.26E + 033.41E + 033.61E + 033.71E + 033.29E + 033.32E + 033.25E + 03 3.20E + 03
Std6.02E + 013.97E + 012.94E + 011.39E + 021.26E + 022.21E + 026.50E + 016.10E + 014.13E + 01 1.09E − 04
Rank27489105631
F28Best3.10E + 033.65E + 033.30E + 033.27E + 033.66E + 033.20E + 033.26E + 033.10E + 033.12E + 03 3.10E + 03
Worst3.27E + 034.12E + 033.62E + 033.40E + 034.26E + 033.29E + 033.37E + 033.26E + 03 3.26E + 03 3.30E + 03
Med3.21E + 033.82E + 033.41E + 033.33E + 033.87E + 033.22E + 033.33E + 03 3.11E + 03 3.21E + 033.30E + 03
Ave3.21E + 033.85E + 033.40E + 033.33E + 033.95E + 033.22E + 033.33E + 03 3.15E + 03 3.22E + 033.28E + 03
Std4.84E + 011.55E + 027.88E + 012.67E + 011.72E + 02 2.45E + 01 3.22E + 016.51E + 013.33E + 014.67E + 01
Rank29861037145
F29Best3.60E + 034.24E + 033.94E + 034.17E + 034.73E + 034.27E + 033.59E + 03 3.42E + 03 3.52E + 033.58E + 03
Worst4.30E + 035.05E + 038.78E + 035.73E + 036.02E + 037.31E + 034.26E + 034.82E + 034.40E + 03 4.06E + 03
Med3.95E + 034.53E + 034.92E + 034.97E + 035.11E + 035.24E + 034.16E + 034.09E + 034.05E + 03 3.78E + 03
Ave3.94E + 034.59E + 035.22E + 034.88E + 035.22E + 035.27E + 034.06E + 034.27E + 033.97E + 03 3.79E + 03
Std1.63E + 022.42E + 029.10E + 023.98E + 023.54E + 026.08E + 021.77E + 024.26E + 022.14E + 02 1.27E + 02
Rank26879104531
F30Best5.40E + 034.89E + 071.44E + 064.12E + 062.32E + 071.21E + 041.11E + 045.82E + 036.80E + 03 3.61E + 03
Worst 1.07E + 04 1.15E + 083.20E + 074.34E + 073.98E + 086.06E + 063.05E + 051.88E + 042.42E + 041.29E + 05
Med 6.25E + 03 7.19E + 077.15E + 061.34E + 071.38E + 083.41E + 044.66E + 041.05E + 041.97E + 041.25E + 04
Ave 6.76E + 03 7.64E + 071.60E + 071.49E + 071.54E + 081.25E + 065.65E + 041.06E + 041.72E + 041.68E + 04
Std 1.69E + 03 1.76E + 071.35E + 071.22E + 079.75E + 072.45E + 065.64E + 043.59E + 036.03E + 032.37E + 04
Rank19871065243
Average rank3.077.346.007.419.206.664.903.314.352.80
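As a quick sanity check on the "Average rank" row, ISSA's per-function ranks (the last column of Table 8) can be averaged directly. The sketch below transcribes the ranks for F4-F30 from the table; F1 and F3 precede this excerpt, so their ranks are taken as 1 because the discussion states that every index of ISSA is the best on those functions.

```python
# Sanity check of the "Average rank" row for the ISSA column of Table 8.
# F1 and F3 ranks are an assumption (rank 1) based on the accompanying text;
# F4-F30 ranks are read off the table's Rank rows.
issa_ranks = [1, 1,                                        # F1, F3 (assumed)
              2, 6, 1, 1, 1, 1, 5, 2, 4, 1, 1, 2, 6, 1,    # F4-F17
              1, 5, 1, 1, 8, 9, 1, 4, 5, 1, 5, 1, 3]       # F18-F30
average_rank = sum(issa_ranks) / len(issa_ranks)
print(round(average_rank, 2))  # ≈ 2.79, close to the 2.80 reported in the table
```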
Figure 9

Box plots of each algorithm.

It can be seen from the data in Table 8 that, across the 29 functions, ISSA ranks first on most functions and its average rank is better than that of the other algorithms. ISSA has a good optimization effect and can approach the theoretical optimum of each function. On F1, F3, F6, F7, F8, F13, F14, F18, and F29, every index of ISSA is the best among the compared algorithms, and on 20 of the functions ISSA finds the best value of all algorithms, which shows that ISSA has strong optimization ability. When the optimal solution is not 0, ISSA shows better optimization performance than SSA, and PSO and CSSA also perform well. As can be seen from the box plots in Figure 9, ISSA has strong search ability and comes closest to the theoretical optimum among all algorithms; its boxes are also shorter than those of the other algorithms, indicating stronger stability. SSA performs poorly here because an individual in SSA jumps directly toward the neighborhood of the current optimal solution rather than moving gradually toward it as in PSO; this makes SSA converge quickly but also makes it easy to miss high-quality solutions and fall into local optima. ISSA uses the improved strategies to compensate for this disadvantage: it makes full use of the current solution, maintains the convergence speed, and increases the ability to jump out of local optima. Overall, ISSA has better optimization performance than the other algorithms, shows good universality and effectiveness, and can adapt to complex optimization problems.

6. PID Parameter Tuning

A PID controller is the most widely used controller in industry (accounting for about 90% of controllers in use). It combines three basic gain parameters to control the controlled object and is mainly applicable to systems that are basically linear and whose dynamic characteristics do not change with time. Its structure is shown in Figure 10. When the proportional gain Kp increases, the rise time and the steady-state error decrease. When the integral gain Ki increases, the rise time decreases, but the settling time and the overshoot increase. The negative effects of increasing Ki can be overcome by adjusting the derivative gain Kd. The relationship between the output and the input of the PID controller is

u(t) = Kp * e(t) + Ki * ∫[0,t] e(τ)dτ + Kd * de(t)/dt,

where e(t) is the control error and u(t) is the controller output.
Figure 10

PID controller structure diagram.

Manual PID parameter tuning is a time-consuming process: it is generally done by trial and error using the experience and skill of engineers, whereas an intelligent algorithm can complete the tuning in a short time. To verify the practicability of ISSA, this paper uses ISSA to optimize the PID parameters, simulates the unit step response and the sinusoidal input response, and compares against SSA. The objective function [39] is set as

J = ∫[0,∞) (w1 * |e(t)| + w2 * u(t)^2) dt,

where e(t) is the error between the input value and the output value (the integral of its absolute value is adopted to account for the dynamic characteristics of the iterative process), and u(t) is the control value, added to avoid an excessive control range; w1 and w2 are weights in the range [0, 1]. In addition, measures are taken to penalize overshoot: whenever overshoot occurs, an additional overshoot term is introduced into the objective function, which then becomes

J = ∫[0,∞) (w1 * |e(t)| + w2 * u(t)^2 + w3 * |e(t)|) dt,

where w3 is a weight with w3 ≫ w1. Generally, w1 = 0.999, w2 = 0.001, and w3 = 100. The goal of ISSA is therefore to find a set of PID parameters that minimizes the objective function. In this paper, the population size and the number of iterations are set to 30 and 50, respectively; other parameters are consistent with Table 2. The tests are carried out under the unit step input and the sinusoidal input, each run independently 10 times. The test results are shown in Figure 11, and the optimization results are shown in Table 9.
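The objective function above can be sketched in code as follows. This is a minimal illustration, not the paper's simulation model: the first-order plant (time constant tau), the step size dt, the horizon, and the test gains are all illustrative assumptions; only the weighted error/control/overshoot penalty structure and the weights w1, w2, w3 come from the text.

```python
# Sketch of the PID-tuning fitness: J = sum of w1*|e| + w2*u^2 per step,
# plus an extra w3*|e| whenever overshoot occurs (output above the reference).
# Plant, dt, and horizon are illustrative assumptions, not the paper's setup.
def pid_fitness(kp, ki, kd, dt=0.01, steps=2000, w1=0.999, w2=0.001, w3=100.0):
    y, integ, prev_e, j = 0.0, 0.0, 0.0, 0.0
    tau = 0.5                              # assumed first-order plant: tau*y' = -y + u
    for k in range(steps):
        e = 1.0 - y                        # unit-step reference
        integ += e * dt
        deriv = (e - prev_e) / dt if k else 0.0
        u = kp * e + ki * integ + kd * deriv
        prev_e = e
        y += dt * (-y + u) / tau           # forward-Euler plant update
        j += w1 * abs(e) + w2 * u * u
        if e < 0:                          # overshoot: add the extra penalty term
            j += w3 * abs(e)
    return j
```

With this kind of fitness, a well-damped gain set yields a much lower J than a sluggish one, which is exactly the ordering the optimizer exploits.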
Figure 11

Convergence curve and PID control output.

Table 9

PID parameter setting results.

            Unit step                                  Sinusoidal input
Algorithm   Fitness   Kp        Ki        Kd           Fitness   Kp        Ki        Kd
SSA         29.1777   10        0.221046  0.126894     53.0827   10        10        1.90441
ISSA        22.7045   41.4593   0.596364  0.402723     50.5554   286.6486  21.19153  0.898048
Since the results of the 10 independent runs are consistent, only one is shown here. It can be seen from Figure 11 and Table 9 that, in the unit step response, ISSA completes the parameter tuning in a very short time, and its convergence speed and accuracy are better than those of SSA. In the sinusoidal input response, the output under ISSA almost coincides with the reference input rin, and the tracking effect is significant, while the tracking effect of SSA is slightly inferior; the convergence speed and accuracy of ISSA are also slightly better than those of SSA. These results verify that ISSA has good algorithm performance and can quickly and accurately complete PID parameter tuning, helping the system achieve a shorter response time, higher control accuracy, and better robustness. This demonstrates its practicability.

7. Robot Path Planning

In PID parameter tuning, the dimension of the practical application is low. Therefore, this paper selects the more complex discrete problem of robot path planning to further verify the practicability of ISSA, again comparing with SSA. In path planning, each sparrow represents a feasible path. The environment is modeled with the grid method, and obstacles are mapped onto grid cells of equivalent position: a grid value of 0 denotes a feasible area and 1 denotes an obstacle area. The robot can then plan a path over the cells marked 0, and the dimension D is the number of columns of the grid map. The cost function of the path length of the i-th sparrow is shown in equation (31):

fit_i = Σ_{j=1}^{D-1} sqrt((x_{i,j+1} - x_{i,j})^2 + (y_{i,j+1} - y_{i,j})^2),  (31)

where j indexes the j-th dimension of a sparrow. Each algorithm is tested on a 15 × 15 grid map, and the best path is shown in Figure 12. To eliminate contingency, each algorithm is run 10 times, and the best value, worst value, average value, and standard deviation of the fitness are calculated; these four indexes measure the stability and feasibility of each algorithm. The optimization statistics are shown in Table 10.
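The path-length cost of equation (31) can be sketched as follows. This is an illustrative assumption about the encoding (one row index per column of the grid map, so consecutive waypoints are one column apart); the function name and encoding are hypothetical, not taken from the paper's code.

```python
# Illustrative grid-based path cost in the style of equation (31):
# rows[j] is the row chosen in column j, and the fitness is the total
# Euclidean length of the path. The encoding is an assumption.
import math

def path_length(rows):
    return sum(
        math.hypot(1.0, rows[j + 1] - rows[j])   # consecutive columns are 1 apart
        for j in range(len(rows) - 1)
    )
```

For reference, a fully diagonal path across a 15 × 15 map has length 14 * sqrt(2) ≈ 19.799, which is consistent with ISSA's best cost reported in Table 10.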
Figure 12

Convergence curve and optimal path.

Table 10

Robot path planning results.

            Fitness
Algorithm   Best      Worst     Ave       Std
SSA         22.6274   39.5980   29.9813   6.5115
ISSA        19.7990   28.2843   22.6274   3.4641
It can be seen from Table 10 and Figure 12 that the minimum path cost planned by ISSA is 19.7990, while that of SSA is 22.6274, which shows that ISSA has strong path-planning ability. The other indexes show that the paths planned by ISSA are also more stable. Therefore, ISSA performs well in the more complex robot path planning problem and can plan a more stable and safer path.

8. Conclusions

This paper proposes an improved sparrow search algorithm based on an iterative local search strategy, which introduces four strategies: a variable helix factor, an improved iterative local search, lens imaging learning with changing focusing ability, and improved boundary control. ISSA overcomes the shortcomings of poor utilization of the current individual and lack of effective search, and it alleviates the problems of falling into local optima and low optimization accuracy. The test function results show that ISSA has good optimization performance and universality, and the results on PID parameter tuning and robot path planning show that it has good practicability. However, ISSA also has some shortcomings: it finds the optimal value only on some functions, while on others its performance indexes are poor or unstable; the added strategies increase the amount of computation, so the algorithm takes longer to run; and the search scope of the discoverers was not improved. In view of these shortcomings, future work will address: first, how to improve the stability of the algorithm; second, how to improve the search ability of the followers; third, how to balance the running time and the optimization ability of the algorithm; and fourth, how to further improve the search ability of the discoverers.
  4 in total

1.  A Self-Adaptive Differential Evolution Algorithm for Scheduling a Single Batch-Processing Machine With Arbitrary Job Sizes and Release Times.

Authors:  Shengchao Zhou; Lining Xing; Xu Zheng; Ni Du; Ling Wang; Qingfu Zhang
Journal:  IEEE Trans Cybern       Date:  2021-02-17       Impact factor: 11.448


Review 1.  Improved Sparrow Algorithm Based on Game Predatory Mechanism and Suicide Mechanism.

Authors:  Ping Yang; Shaoqiang Yan; Donglin Zhu; Jiangpeng Wang; Fengxuan Wu; Zhe Yan; Song Yan
Journal:  Comput Intell Neurosci       Date:  2022-05-16

2.  A Multistrategy-Integrated Learning Sparrow Search Algorithm and Optimization of Engineering Problems.

Authors:  Zikai Wang; Xueyu Huang; Donglin Zhu
Journal:  Comput Intell Neurosci       Date:  2022-02-23

3.  Advances in Sparrow Search Algorithm: A Comprehensive Survey.

Authors:  Farhad Soleimanian Gharehchopogh; Mohammad Namazi; Laya Ebrahimi; Benyamin Abdollahzadeh
Journal:  Arch Comput Methods Eng       Date:  2022-08-22       Impact factor: 8.171

4.  Membrane Fouling Prediction Based on Tent-SSA-BP.

Authors:  Guobi Ling; Zhiwen Wang; Yaoke Shi; Jieying Wang; Yanrong Lu; Long Li
Journal:  Membranes (Basel)       Date:  2022-07-04
