
An Optimal WSN Node Coverage Based on Enhanced Archimedes Optimization Algorithm.

Thi-Kien Dao1, Shu-Chuan Chu2, Trong-The Nguyen1,3,4, Trinh-Dong Nguyen3,4, Vinh-Tiep Nguyen3,4.   

Abstract

Node coverage is one of the crucial metrics for wireless sensor networks' (WSNs') quality of service, directly affecting the target monitoring area's monitoring capacity. Pursuit of the optimal node coverage encounters increasing difficulties because of the limited computational power of individual nodes, the scale of the network, and the operating environment's complexity and constant change. This paper proposes a solution to the optimal node coverage of unbalanced WSN distribution during random deployment based on an enhanced Archimedes optimization algorithm (EAOA). The best findings for network coverage from several sub-areas are combined using the EAOA. In order to address the shortcomings of the original Archimedes optimization algorithm (AOA) in handling complicated scenarios, we suggest an EAOA based on the AOA by adapting its equations with reverse learning and multidirection techniques. The obtained results from testing the benchmark function and the optimal WSN node coverage of the EAOA are compared with the other algorithms in the literature. The results show that the EAOA algorithm performs effectively, increasing the feasible range and convergence speed.


Keywords:  coverage optimization; enhanced Archimedes optimization algorithm; optimization approach; wireless sensor network

Year:  2022        PMID: 35892997      PMCID: PMC9329719          DOI: 10.3390/e24081018

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.738


1. Introduction

Wireless sensor networks (WSNs) are mainly composed of autonomous devices called sensor nodes, implemented for specific purposes and scattered over wide areas [1,2]. As wireless communication technology has matured [3], WSNs have become increasingly common in the information field [4]. They are utilized in various crucial fields, including the military, intelligent transportation, urban planning, industrial and agricultural automation, and environmental monitoring [5]. A sensor node's job is to sense and collect ambient data, such as sound, vibration, pressure, temperature, and light intensity, and to send the captured information to the base station (BS) or a destination node [6]. Due to their ease of implementation, low maintenance costs, and high flexibility, WSNs have successfully replaced wired networks and been embraced in the industrial field in recent years [7]. However, owing to the nature of wireless communication, interference and conflict are invariably present during data transmission [8], and data packets may be lost or delayed past their planned deadlines [9]. Coverage is one of the most fundamental difficulties in WSNs and a critical metric for evaluating optimization efforts: because coverage determines the monitoring capability of the target monitoring area, it substantially impacts WSNs' quality of service [10]. Node coverage optimization techniques have been developed to increase the coverage of wireless sensor nodes in big-data environments, considering the characteristics of large WSNs with limited node computing capabilities [11,12]. However, the operating environment of WSNs is complex and changing, and sensor energy is limited and cannot be replenished [13]. Sensible and effective deployment of sensor nodes reduces network expenses and energy consumption [14].
All WSN coverage applications try to deploy a minimal number of sensor nodes to monitor a defined target region of interest and thereby improve coverage efficiency. Sensor nodes are typically placed randomly in the target monitoring region, resulting in an uneven distribution of nodes and limited coverage [15]. As a result, network coverage control is the central research problem in wireless sensor networks [16]. Adopting an effective and suitable network coverage control technique helps optimize sensor node deployment and thereby increase network performance [17]. Strategically positioning sensor nodes in the monitoring zone is crucial to increasing WSN node coverage. For large-scale sensor node deployment challenges, logical and efficient deployment of WSNs has been shown to be an NP-hard problem, and finding the best solution remains challenging [18]. Meeting the monitoring requirements may force many nodes to be deployed, leading to significant redundant coverage, the repeated transmission of vast amounts of data in the network, and a rise in the number of network nodes [19]. Metaheuristic algorithms are among the promising approaches examined for dealing with WSN node coverage in this scenario [20]. They can identify near-optimal solutions in a reasonable amount of time with limited nodes and computational resources, making them a convenient approach to the WSN coverage optimization problem [21]. Metaheuristic algorithms are approximation optimization techniques whose solutions can tackle high-dimensional optimization problems effectively [22]. They are frequently inspired by natural phenomena, such as human behaviors, physical processes, animal swarm behaviors, and evolutionary concepts [23].
Metaheuristic optimization algorithms are widely used in a variety of fields, including technology, health, society, and finance, and are especially good at meeting time deadlines [20]. They are usually fairly easy to implement, have few parameters, and are relatively simple to understand yet powerful, drawing on biological natural selection, social swarm behavior, and physical phenomena; examples include simulated annealing (SA) [24], genetic algorithms (GAs) [25,26], particle swarm optimization (PSO) [27], cat swarm optimization (CSO) [28], parallel PSO (PPSO) [29], ant colony optimization (ACO) [30], artificial bee colony (ABC) [31], bat algorithms (BA) [32,33], moth–flame optimization (MFO) [34,35], the whale optimization algorithm (WOA) [36], the flower pollination algorithm (FPA) [37,38], and the sine–cosine algorithm (SCA) [39,40]. The Archimedes optimization algorithm (AOA) [41] is a recent metaheuristic method inspired by physical laws: its position update technique mimics object collisions, and the optimization models Archimedes' principle of buoyancy, in which an object gradually assumes neutral buoyancy following a collision. The AOA has few parameters, making it easy to understand and program, and it shows potential for optimizing various engineering problems. However, the AOA has specific weaknesses on particular issues, such as slow convergence and poor solution quality. This paper suggests an enhanced Archimedes optimization algorithm (EAOA) for global optimization problems and for node coverage optimization in WSN deployment. The difficulties of uneven node distribution and low coverage in the random deployment of WSN monitoring applications are approached based on the EAOA.
The entire WSN monitoring area can be divided into multiple sub-areas, and node coverage can then be optimized in each sub-area by evaluating the objective function values. The objective function is modeled as the ratio of the coverage probability of all nodes to the deployed surface of the 2D WSN monitoring area. We implemented the EAOA by adapting the updating equations of the AOA with reverse learning and multidirection strategies to overcome the limitations of the original approach. The contributions of this paper are briefly highlighted as follows: (1) offering strategies for enhancing the AOA to avoid the original algorithm's drawbacks in dealing with complex situations, evaluating the proposed method's performance on the CEC2017 test suite, and comparing its results with other algorithms in the literature; (2) establishing the objective function of the optimal WSN node coverage problem and applying the EAOA and AOA to it for the first time, then analyzing and discussing the experimental results in comparison with swarm intelligence optimization algorithms. The remainder of the paper is organized as follows: Section 2 describes the WSN node coverage model as the problem statement and reviews the AOA as related work. Section 3 presents the proposed EAOA and evaluates its performance on the test suite. Section 4 applies the EAOA to the node coverage problem and analyzes the simulation results. The conclusions are presented in Section 5.

2. System Definition

This section presents the WSN node coverage model as the problem statement, and the original algorithm—called the Archimedes optimization algorithm (AOA)—as a recent metaheuristic optimization algorithm. The subsections are reviewed as follows.

2.1. WSN Node Coverage Model

The coverage optimization problem seeks the best location for each deployed node, with a fixed sensing radius for each sensor: each node senses and detects targets only within its sensing radius, and detection within this radius constitutes a feasible solution to the coverage problem. Assume that the WSN is deployed in a two-dimensional (2D) monitoring area of W × L m^2, with nodes set up randomly [15,42]. If S = {s_1, s_2, ..., s_N} is the set of nodes, the coordinates of each node s_i can be represented as (x_i, y_i). A sensor node's sensing range is a circle centered at the node with the sensing radius as its radius. The model of the 2D WSN monitoring area network is assumed as follows: the sensing radius of each sensor node is R_s and the communication radius is R_c, both measured in meters. The sensor nodes can communicate normally, have sufficient energy, and can access time and data information. The sensor nodes have the same parameters, structure, and communication capabilities, and they can move freely and update their location information in time. Let T = {t_1, t_2, ..., t_M} be a set of target monitoring points, where the coordinate of each point t_j is (x_j, y_j) in the 2D WSN monitoring area. If the distance between a target monitoring point t_j and any sensor node is less than or equal to the sensing radius R_s, then t_j is covered by the sensor nodes. For the sensor node s_i and target monitoring point t_j, the Euclidean distance is defined as follows:

d(s_i, t_j) = sqrt((x_i − x_j)^2 + (y_i − y_j)^2),

where d(s_i, t_j) is the distance from node s_i = (x_i, y_i) to point t_j = (x_j, y_j). The node sensing model is binary on the sensing radius: if the sensing radius R_s is greater than or equal to this distance, the probability that the target is sensed is set to 1; otherwise, it is set to 0. The probability formula is given as follows:

p(s_i, t_j) = 1 if d(s_i, t_j) ≤ R_s, and 0 otherwise,

where p(s_i, t_j) is the sensing probability between the sensor node s_i and target monitoring point t_j. The sensor nodes can work cooperatively by affecting the neighbor nodes of the deployed 2D WSN monitoring area.

Whenever a target monitoring point can be covered by more than one sensor simultaneously, the probability of monitoring the target point is given by the following formula:

p(S, t_j) = 1 − ∏_{i=1}^{N} (1 − p(s_i, t_j)).

The coverage rate is the ratio of the total area covered by all sensor nodes in the monitoring area to that area's overall size. Accordingly, the coverage ratio is calculated from the ratio of the summed coverage probabilities to the surface of the deployed 2D WSN monitoring area:

C_r = (Σ_{j=1}^{M} p(S, t_j)) / (W × L),

where C_r is the WSN nodes' coverage ratio over the target points of the area, p(S, t_j) is the probability that target point t_j is sensed by the node set, and W × L is the deployed area of the desired 2D network surface.
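As an illustrative sketch (not the authors' implementation), the coverage ratio under this binary-disc sensing model can be estimated by discretizing the monitoring area into target points; the grid resolution and all function names below are assumptions:

```python
import numpy as np

def coverage_ratio(nodes, W, L, Rs, grid=100):
    """Estimate the coverage ratio C_r of a 2-D WSN monitoring area.

    nodes : (N, 2) array of sensor coordinates (x_i, y_i)
    W, L  : width and length of the monitoring area in meters
    Rs    : sensing radius in meters (binary disc sensing model)
    grid  : number of target monitoring points per axis
    """
    # Discretize the W x L area into M = grid * grid target points t_j.
    xs = np.linspace(0.0, W, grid)
    ys = np.linspace(0.0, L, grid)
    px, py = np.meshgrid(xs, ys)
    pts = np.stack([px.ravel(), py.ravel()], axis=1)          # (M, 2)

    # Euclidean distance d(s_i, t_j) from every point to every node.
    d = np.linalg.norm(pts[:, None, :] - nodes[None, :, :], axis=2)

    # Binary sensing: p(s_i, t_j) = 1 iff d <= Rs; a point is covered
    # when at least one node senses it (the joint probability above).
    covered = (d <= Rs).any(axis=1)
    return covered.mean()                                     # C_r estimate
```

For instance, a single node at the center of a 10 × 10 m area with Rs = 5 m covers roughly π·25/100 ≈ 78% of the area, and the grid estimate approaches that value as the resolution grows.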

2.2. Archimedes Optimization Algorithm (AOA)

The AOA is a recent metaheuristic optimization algorithm based on the physics of Archimedes' principle of buoyancy [41]. The position of each object is updated by imitating the process of the object gradually reaching neutral buoyancy following a collision. The AOA represents each individual of the population as an immersed object with volume, density, and acceleration properties; from these attributes, the objects determine their positions in the fluid. The attributes and positions of the objects are randomly initialized at the start of the process. During optimization, the AOA updates each object's volume, density, and acceleration, and the object's position is updated based on these individual qualities. Initialization, updating the object attributes, updating the object's position, and evaluation are the main processing steps of the AOA. Initialization of the i-th object is conducted as follows:

x_i = lb_i + rand × (ub_i − lb_i), i = 1, 2, ..., N,

where x_i is the i-th candidate solution vector of the object population of size N; lb_i and ub_i are the lower and upper boundaries, respectively; and rand is a D-dimensional vector generated randomly in [0, 1]. The acceleration, volume, and density of the i-th object are denoted acc_i, vol_i, and den_i, respectively, and are likewise randomly initialized. The position and attributes of the optimal object, i.e., x_best, den_best, vol_best, and acc_best, are those of the object with the best fitness value according to the evaluation of each object. Updating phase: during iteration t, the volume and density of each object are updated according to the following formulas:

den_i^(t+1) = den_i^t + rand × (den_best − den_i^t),
vol_i^(t+1) = vol_i^t + rand × (vol_best − vol_i^t),

where vol_i^t and den_i^t denote the volume and density of the i-th object at iteration t, respectively. Collisions between objects are mimicked for the optimization process; as the iterations proceed, the algorithm gradually reaches equilibrium.

A transfer variable is used to simulate this process and realize the algorithm's transition from exploration to exploitation, as follows:

TF = exp((t − t_max) / t_max),

where TF is the transfer variable, while t_max and t are the maximum number of iterations and the current iteration, respectively. TF gradually increases to 1 over time; while TF ≤ 0.5, the iteration is in the exploration phase. The update of the objects' accelerations is related to collisions between objects. If TF ≤ 0.5, there is a collision between objects, and the acceleration of object i at iteration t + 1 is updated as

acc_i^(t+1) = (den_mr + vol_mr × acc_mr) / (den_i^(t+1) × vol_i^(t+1)),

where den_mr, vol_mr, and acc_mr are the density, volume, and acceleration of a random material (object) mr; otherwise, there is no collision between objects, and the acceleration is updated from the attributes of the best object:

acc_i^(t+1) = (den_best + vol_best × acc_best) / (den_i^(t+1) × vol_i^(t+1)).

The normalization strategy for the acceleration can be stated as follows:

acc-norm_i^(t+1) = u × (acc_i^(t+1) − min(acc)) / (max(acc) − min(acc)) + l,

where acc-norm_i^(t+1) represents the normalized acceleration of the i-th object at iteration t + 1, while u and l bound the normalized range and are set to 0.8 and 0.2, respectively. The position update is conducted as follows: if TF ≤ 0.5 (exploration phase), the position of object i at iteration t + 1 is updated as

x_i^(t+1) = x_i^t + C1 × rand × acc-norm_i^(t+1) × d × (x_rand − x_i^t),

which helps the search move from global to local and converge in the region where the optimal solution exists; otherwise, the positional update is in the exploitation phase. When the object is far from the best position, the acceleration value is large and the object is in the exploration phase; when the acceleration value is small, the object is close to the optimal solution. The exploitation phase can be described as follows:

x_i^(t+1) = x_best^t + F × C2 × rand × acc-norm_i^(t+1) × d × (T × x_best − x_i^t),

where C1 and C2 are constants, and d is the density factor that decreases over time, i.e., d^(t+1) = exp((t_max − t) / t_max) − t / t_max. The acceleration changes from large to small, indicating the algorithm's transition from exploration to exploitation, which helps the objects approach the optimal global solution.

Here T = C3 × TF is a variable proportional to the transfer operator, defining the percentage used to attain the best position, and F is the direction of motion, whose expression is as follows:

F = +1 if P ≤ 0.5, and F = −1 if P > 0.5, where P = 2 × rand − C4.

Evaluation with the objective function involves computing the fitness values after updating the objects' positions at each iteration. The objective function model is used for fitness evaluation: each object is evaluated, the best fitness value found at each position is recorded, and x_best, den_best, vol_best, and acc_best are updated for the next iterations (generations).
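Collecting the steps above, a minimal, illustrative reimplementation of the AOA loop for a minimization objective might look like this; the parameter defaults and helper names are assumptions, not the authors' reference code:

```python
import numpy as np

def aoa(obj, lb, ub, dim, n=40, t_max=1000,
        C1=2.0, C2=6.0, C3=2.0, C4=0.5, seed=0):
    """Minimal Archimedes optimization algorithm (AOA) for minimization."""
    rng = np.random.default_rng(seed)
    x = lb + rng.random((n, dim)) * (ub - lb)        # object positions
    den = rng.random((n, dim))                       # densities
    vol = rng.random((n, dim))                       # volumes
    acc = lb + rng.random((n, dim)) * (ub - lb)      # accelerations
    fit = np.array([obj(v) for v in x])
    b = int(fit.argmin())
    x_b, f_b = x[b].copy(), float(fit[b])
    den_b, vol_b, acc_b = den[b].copy(), vol[b].copy(), acc[b].copy()

    for t in range(1, t_max + 1):
        tf = np.exp((t - t_max) / t_max)             # transfer variable TF
        d = np.exp((t_max - t) / t_max) - t / t_max  # density factor
        den += rng.random((n, dim)) * (den_b - den)
        vol += rng.random((n, dim)) * (vol_b - vol)
        if tf <= 0.5:                                # exploration: collision
            mr = rng.integers(n, size=n)             # with a random material
            a_new = (den[mr] + vol[mr] * acc[mr]) / (den * vol)
        else:                                        # exploitation: best object
            a_new = (den_b + vol_b * acc_b) / (den * vol)
        # Normalize acceleration into [0.2, 1.0] using u = 0.8, l = 0.2.
        span = a_new.max() - a_new.min() + 1e-30
        acc = 0.8 * (a_new - a_new.min()) / span + 0.2
        r = rng.random((n, dim))
        if tf <= 0.5:                                # exploration position update
            x = x + C1 * r * acc * d * (x[rng.integers(n, size=n)] - x)
        else:                                        # exploitation position update
            p = 2.0 * rng.random((n, dim)) - C4
            F = np.where(p <= 0.5, 1.0, -1.0)        # direction of motion
            x = x_b + F * C2 * r * acc * d * (C3 * tf * x_b - x)
        x = np.clip(x, lb, ub)
        fit = np.array([obj(v) for v in x])
        b = int(fit.argmin())
        if fit[b] < f_b:                             # keep the best object found
            x_b, f_b = x[b].copy(), float(fit[b])
            den_b, vol_b, acc_b = den[b].copy(), vol[b].copy(), acc[b].copy()
    return x_b, f_b
```

The greedy bookkeeping of the best object makes the returned fitness monotonically non-increasing over iterations, which mirrors the equilibrium behavior described above.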

3. Enhanced Archimedes Optimization Algorithm

In order to enhance the diversity of the object population, an enhanced version of the Archimedes optimization algorithm (EAOA) based on opposite (reverse) learning and diversity-guiding techniques is presented in this section. The suggested processes are described first, followed by a detailed presentation and discussion of the evaluation findings.

3.1. Enhanced Archimedes Optimization Algorithm

The AOA is a new metaheuristic algorithm with several advantages, including ease of understanding and implementation, along with local search capability. Still, it has drawbacks, such as difficulty escaping local optima, slow convergence, and vulnerability to stagnation when dealing with complex problems such as the optimal WSN node coverage issue. In the original expression in Equation (13), the direction of motion F takes just two values. For complicated problems, the space may require motion on more scales, and this can be exploited to increase the number of search directions in complex spaces. A direction guiding factor is therefore used in place of the fixed direction value: in the alternative formula for the motion direction, the guiding factor is a random number in [0, 1], so that the search takes different direction values across dimensions and iterations. The original and reversed solutions are sorted by their fitness values on the objective function in order to convert objects in a forward-seeking, exploiting procedure in the optimization problem space. By identifying new objects with the best fitness ratings, through direct vetting or other optimization strategies for establishing new candidates in the solution space, the agents can swiftly converge toward the ideal solution. A new solution set can be generated by applying reverse learning at a specific rate and joining it to the original set for further optimization. Let x and x' be a forward solution and its corresponding inverse, where

x' = lb + ub − x,

so that the opposite solution set lies within the range [lb, ub]. The same idea of opposite learning is applied to generate a new solution set, scaled by an adjustment coefficient that controls how strongly the new object set is generated and affected.
A portion of the worst solutions, e.g., about 15% of the sorted evaluation values of the object positions, is eliminated and used for generating a new object set in dimension D of the solution space. The adjustment coefficient is computed from rand(a, b), a random value in the range from a to b (in the experiments, a is set to −0.5 and b to 0.5), from the dimension D of the problem space, and from the distance between the ideal solution and the one closest to optimal. The adjustment coefficient is applied in the exploiting search of the algorithm for generating and affecting a new solution object set merged into Equation (17). The reverse learning and multidirection strategies and equations are hybridized into the updated formulas for generating new solutions and updating the positions of the objects. Algorithm 1 depicts the pseudocode of the enhanced Archimedes optimization algorithm (EAOA).
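As an illustration of the reverse (opposition-based) learning step in its standard form, the worst portion of the population can be regenerated from the mirrored solutions; the function names, the selection of the worst fraction, and the omission of the paper's exact adjustment-coefficient formula are all simplifying assumptions:

```python
import numpy as np

def opposite_set(x, lb, ub):
    """Standard opposition-based ('reverse') learning: mirror every
    solution x in [lb, ub] to its opposite x' = lb + ub - x."""
    return lb + ub - x

def regenerate_worst(x, obj, lb, ub, drop_frac=0.15):
    """Replace roughly the worst `drop_frac` of the population
    (minimization) with the opposites of those solutions, keeping
    the population size unchanged."""
    fit = np.array([obj(v) for v in x])
    order = np.argsort(fit)                      # best (smallest) first
    k = max(1, int(drop_frac * len(x)))
    worst = order[-k:]
    x = x.copy()
    x[worst] = opposite_set(x[worst], lb, ub)    # reverse-learn the worst ~15%
    return x
```

In the EAOA the regenerated candidates are additionally scaled by the adjustment coefficient and merged with the multidirection position update, which is not reproduced in this sketch.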

3.2. Experimental Results for Global Optimization

The potential performance of the suggested algorithm needs to be tested and verified on benchmark functions. The CEC 2017 [43] test suite, with 29 different test functions, is used to evaluate the EAOA algorithm. The suite contains functions of various complexities and types, e.g., f1~f3: unimodal, f4~f10: multimodal, f11~f20: hybrid, and f21~f29: composition test functions. The achieved results of the EAOA are compared not only with the original AOA [41] but also with other popular algorithms selected from the literature, e.g., genetic algorithms (GAs) [25], simulated annealing (SA) [24], particle swarm optimization (PSO) [27], moth–flame optimization (MFO) [34], improved MFO (IMFO) [35], the flower pollination algorithm (FPA) [37], the sine–cosine algorithm (SCA) [39], enhanced SCA (ESCA) [40], parallel PSO (PPSO) [29], and the parallel bat algorithm (PBA) [33]. The reported error e_i = |f_i(obtained) − f_i(min)| is the difference between the obtained result and the known minimum value of the i-th function. The same fundamental conditions are set for all algorithms to ensure a fair experiment: the population size is set to 40; the maximum number of iterations to 1000; the number of dimensions to 30; the solution range of all test functions to [−100, 100]; and the number of runs to 25. Table 1 lists the basic parameters of each algorithm.
Table 1

Algorithm settings for parameters and variables.

Algorithms | Setting Parameters
EAOA | C1 = 2.1, C2 = 5.6, C3 = 1.95, C4 = 0.65
AOA [41] | C1 = 2.1, C2 = 5.6, C3 = 1.95, C4 = 0.65
GA [25] | Rmu = 0.1, Rcr = 0.9
SA [24] | P = 0.6, α = 0.8, τ = 0.05, SN = 14.41
PSO [27] | Vmax = 10, Vmin = −10, ω = 0.9 to 0.4, c1 = c2 = 1.49455
PPSO [29] | G = 2, R = 10, Vmax = 10, Vmin = −10, ω = 0.9 to 0.4, c1 = c2 = 1.49465
PBA [33] | G = 2, R = 10, A0 = 0.7, r0 = 0.15, α = 0.25, γ = 0.16
FPA [37] | Pswitch = 0.65, λ = 1.5, s0 = 0.1
MFO [34] | a = 1, b = 1
IMFO [35] | a = 1, b = 1, ω = 0.9 to 0.4
WOA [36] | a = 2 to 0, b = 1, l = [−1, 1]
SCA [39] | r1, r3 = rand(0, 2), r2 ∈ [0, 2π], r4 = rand(0, 1)
ESCA [40] | r1, r3 = rand(0, 2), r2 ∈ [0, 2π], r4 = rand(0, 1), ω = 0.9 to 0.5
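Under the evaluation protocol described above (25 independent runs per algorithm-function pair), the summary statistics reported in the result tables can be computed from the per-run errors; this small helper is illustrative only:

```python
import numpy as np

def summarize_runs(obtained, f_min=0.0):
    """Per-function summary statistics over repeated runs.

    obtained : sequence of best objective values, one per run
    f_min    : known minimum of the test function

    Returns (Mean, Best, Std.) of the error e_i = |obtained - f_min|,
    the three quantities used to assess search capability, solution
    quality, and robustness, respectively.
    """
    e = np.abs(np.asarray(obtained, dtype=float) - f_min)
    return e.mean(), e.min(), e.std()
```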
The outcomes of the proposed EAOA approach are verified in several cases, such as the effect of each strategy relative to the original algorithm, and compared with the other algorithms. First, the outcomes of the individual strategies are contrasted with those of the original AOA algorithm; the findings of the EAOA are then contrasted with those of the other algorithms. Table A1 compares the strategies applied in the EAOA with the original AOA algorithm, verifying the impact of each suggested technique. The data values are the means over 25 runs of the best obtained global optima, together with the CPU runtimes. In some cases, strategies 1 and 2 individually outperform the original algorithm; in most test function cases, the combination of strategies 1 and 2 in the proposed EAOA produces better results than the AOA, with a runtime not much longer than that of the AOA.
Table A1

Verifying the impact of the suggested techniques used in the EAOA in comparison with the original AOA algorithm.

Fun Test | AOA (Original): Mean | CPU (s) | Strategy 01 (Multidirection): Mean | CPU (s) | Strategy 02 (Opposite Learning): Mean | CPU (s) | Strategies 01 and 02 (EAOA): Mean | CPU (s)
f1 | 2.95 × 10^−1 | 37.93 | 1.86 × 10^−1 | 36.10 | 1.91 × 10^−1 | 34.30 | 1.71 × 10^−1 | 38.52
f2 | 2.71 × 10^+1 | 32.76 | 1.94 × 10^+1 | 34.02 | 1.61 × 10^+1 | 34.12 | 1.65 × 10^+1 | 38.32
f3 | 3.66 × 10^−1 | 45.34 | 2.58 × 10^−1 | 47.09 | 6.52 × 10^−2 | 47.23 | 2.44 × 10^−1 | 53.04
f4 | 3.02 × 10^−1 | 44.16 | 1.45 × 10^−1 | 45.86 | 4.59 × 10^−2 | 46.00 | 1.27 × 10^−2 | 52.12
f5 | 7.99 × 10^−2 | 40.32 | 7.84 × 10^−3 | 42.81 | 1.38 × 10^−2 | 42.00 | 5.38 × 10^−3 | 48.03
f6 | 5.58 × 10^−1 | 85.44 | 2.11 × 10^−1 | 88.73 | 6.21 × 10^−2 | 89.00 | 1.92 × 10^−1 | 98.89
f7 | 2.21 × 10^−1 | 203.52 | 1.07 × 10^−1 | 221.31 | 2.44 × 10^−1 | 212.10 | 1.26 × 10^−1 | 237.18
f8 | 6.32 × 10^0 | 117.12 | 6.61 × 10^−1 | 121.41 | 1.97 × 10^0 | 122.00 | 7.25 × 10^−1 | 136.23
f9 | 7.20 × 10^0 | 229.60 | 4.82 × 10^0 | 234.61 | 4.25 × 10^0 | 235.01 | 4.36 × 10^0 | 251.72
f10 | 2.25 × 10^0 | 224.61 | 2.63 × 10^−1 | 233.42 | 2.05 × 10^−1 | 234.10 | 2.10 × 10^−2 | 263.69
f11 | 4.95 × 10^+3 | 274.65 | 1.77 × 10^+3 | 275.31 | 8.09 × 10^+3 | 278.01 | 1.06 × 10^+3 | 278.59
f12 | 1.66 × 10^+2 | 229.44 | 3.65 × 10^+1 | 238.28 | 8.09 × 10^+1 | 239.00 | 2.29 × 10^+1 | 268.40
f13 | 3.58 × 10^+1 | 120.01 | 2.87 × 10^0 | 124.61 | 3.30 × 10^+1 | 125.10 | 1.53 × 10^0 | 140.28
f14 | 2.96 × 10^+1 | 96.26 | 1.62 × 10^0 | 100.71 | 1.09 × 10^+1 | 101.10 | 1.26 × 10^0 | 113.41
f15 | 2.05 × 10^0 | 221.76 | 7.88 × 10^−1 | 231.31 | 4.74 × 10^−1 | 231.10 | 7.27 × 10^−1 | 259.42
f16 | 4.73 × 10^−1 | 126.72 | 1.85 × 10^−1 | 131.61 | 2.59 × 10^−1 | 132.01 | 1.30 × 10^−1 | 148.34
f17 | 4.04 × 10^+2 | 223.69 | 5.63 × 10^+1 | 232.31 | 5.53 × 10^+2 | 233.10 | 7.90 × 10^+1 | 262.71
f18 | 2.49 × 10^+2 | 100.81 | 3.70 × 10^−1 | 104.35 | 1.46 × 10^+1 | 105.10 | 1.09 × 10^0 | 117.92
f19 | 4.06 × 10^−1 | 206.40 | 3.24 × 10^−1 | 214.36 | 3.79 × 10^−1 | 215.00 | 3.86 × 10^−1 | 241.45
f20 | 5.87 × 10^−1 | 298.56 | 4.11 × 10^−1 | 310.07 | 4.34 × 10^−2 | 311.00 | 3.98 × 10^−2 | 349.25
f21 | 6.51 × 10^−1 | 327.36 | 2.25 × 10^−1 | 339.98 | 8.29 × 10^−2 | 341.00 | 2.10 × 10^−1 | 384.15
f22 | 8.94 × 10^−1 | 312.96 | 6.22 × 10^−1 | 325.76 | 7.03 × 10^−1 | 326.00 | 6.34 × 10^−1 | 367.09
f23 | 1.02 × 10^0 | 303.36 | 7.63 × 10^−1 | 315.05 | 5.72 × 10^−2 | 316.00 | 7.59 × 10^−2 | 354.87
f24 | 7.38 × 10^−1 | 282.24 | 6.63 × 10^−1 | 294.32 | 4.75 × 10^−1 | 294.00 | 4.12 × 10^−1 | 331.25
f25 | 3.28 × 10^0 | 206.40 | 7.26 × 10^−1 | 215.36 | 1.51 × 10^0 | 215.00 | 7.74 × 10^−1 | 243.15
f26 | 8.53 × 10^−1 | 253.44 | 8.03 × 10^−1 | 263.22 | 2.78 × 10^−2 | 264.00 | 7.78 × 10^−1 | 297.17
f27 | 7.28 × 10^−1 | 273.44 | 7.74 × 10^−1 | 265.45 | 5.19 × 10^−1 | 284.00 | 7.41 × 10^−2 | 295.92
f28 | 2.37 × 10^0 | 225.60 | 1.09 × 10^0 | 234.30 | 3.34 × 10^−1 | 235.00 | 9.39 × 10^−1 | 263.91
f29 | 2.15 × 10^+3 | 221.76 | 8.37 × 10^+1 | 230.31 | 3.14 × 10^+2 | 231.12 | 4.67 × 10^+1 | 259.82
Avg. | 1.72 × 10^−1 | 198.01 | 6.88 × 10^−1 | 199.91 | 3.15 × 10^−1 | 199.47 | 4.45 × 10^−2 | 219.12

For each test function, the better value in each pairwise comparison with the EAOA approach is the smaller one in its row.

Moreover, the obtained EAOA results were further evaluated to verify the performance of the proposed approach. The findings of the EAOA compared with the other algorithms, e.g., the GA [25], PSO [27], BA [32], PPSO [29], MFO [34], and WOA [36], are presented in Table A2, Table A3, Table A4 and Table A5 and Figure A1. The data values in Table A2, Table A3 and Table A4 are the Mean, Best, and Std. (standard deviation), which assess the search capability, solution quality, and robustness of each algorithm, respectively. These tables compare the results of the proposed EAOA with the other popular metaheuristic algorithms in the literature for the test functions; in each row, the best value of each pairwise comparison with the EAOA indicates the stronger result. The Win, Loss, and Draw counts at the end of each table provide a brief statistical summary. It can be seen that the proposed EAOA algorithm has the highest number of wins, meaning that the EAOA produces better results than the other algorithms and has excellent optimization performance.
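The Win/Loss/Draw tallies at the bottom of the comparison tables can be reproduced by a simple pairwise count over the 29 test functions (minimization, so smaller is better); a sketch with assumed names:

```python
def win_loss_draw(eaoa, other, tol=1e-12):
    """Count, per test function, whether EAOA's error is smaller (win),
    larger (loss), or equal within `tol` (draw) versus another algorithm."""
    win = loss = draw = 0
    for a, b in zip(eaoa, other):
        if abs(a - b) <= tol:
            draw += 1
        elif a < b:
            win += 1
        else:
            loss += 1
    return win, loss, draw
```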
Table A2

Performance comparison of the EAOA, SA, and GA on the CEC 2017 test suite, with each paired comparison.

Funs | GA Mean | GA Best | GA Std. | SA Mean | SA Best | SA Std. | EAOA Mean | EAOA Best | EAOA Std.
f1 | 5.66 × 10^−5 | 1.46 × 10^−5 | 5.19 × 10^−5 | 7.16 × 10^−5 | 1.25 × 10^−5 | 2.38 × 10^−5 | 1.29 × 10^−5 | 2.71 × 10^−5 | 1.11 × 10^−5
f2 | 3.72 × 10^−1 | 1.54 × 10^−1 | 1.01 × 10^−1 | 3.78 × 10^+1 | 2.21 × 10^+1 | 9.86 × 10^+1 | 3.57 × 10^−1 | 1.85 × 10^−1 | 1.11 × 10^−1
f3 | 2.57 × 10^−1 | 1.58 × 10^−1 | 5.11 × 10^−2 | 4.92 × 10^−1 | 2.66 × 10^−1 | 1.28 × 10^−1 | 2.33 × 10^−1 | 1.44 × 10^−1 | 5.52 × 10^−2
f4 | 2.31 × 10^−1 | 1.45 × 10^−1 | 4.84 × 10^−2 | 4.58 × 10^−1 | 3.02 × 10^−1 | 4.34 × 10^−2 | 1.91 × 10^−1 | 1.11 × 10^−1 | 4.59 × 10^−2
f5 | 3.90 × 10^−2 | 7.86 × 10^−3 | 1.68 × 10^−2 | 1.12 × 10^−2 | 7.99 × 10^−2 | 1.63 × 10^−2 | 2.57 × 10^−2 | 5.38 × 10^−3 | 1.38 × 10^−2
f6 | 3.28 × 10^−1 | 2.11 × 10^−1 | 7.81 × 10^−2 | 8.30 × 10^−1 | 5.58 × 10^−1 | 1.26 × 10^−1 | 2.68 × 10^−1 | 1.92 × 10^−1 | 6.21 × 10^−2
f7 | 1.95 × 10^−1 | 1.07 × 10^−1 | 3.69 × 10^−2 | 3.59 × 10^−1 | 2.21 × 10^−1 | 8.33 × 10^−2 | 1.66 × 10^−1 | 1.26 × 10^−1 | 2.44 × 10^−2
f8 | 3.43 × 10^0 | 1.21 × 10^+1 | 1.64 × 10^0 | 1.32 × 10^0 | 6.32 × 10^0 | 4.29 × 10^0 | 3.23 × 10^0 | 7.17 × 10^−1 | 1.97 × 10^0
f9 | 6.43 × 10^0 | 4.81 × 10^0 | 1.19 × 10^0 | 8.79 × 10^0 | 7.20 × 10^0 | 1.09 × 10^0 | 6.53 × 10^0 | 4.36 × 10^0 | 1.25 × 10^0
f10 | 3.99 × 10^−1 | 2.03 × 10^−1 | 1.03 × 10^−1 | 5.33 × 10^0 | 1.20 × 10^0 | 4.24 × 10^0 | 3.81 × 10^−1 | 2.10 × 10^−1 | 1.01 × 10^−1
f11 | 1.02 × 10^+4 | 1.77 × 10^+3 | 9.36 × 10^+3 | 2.81 × 10^+5 | 4.45 × 10^+4 | 1.90 × 10^+5 | 7.58 × 10^+3 | 1.06 × 10^+3 | 8.09 × 10^+3
f12 | 9.53 × 10^+2 | 3.65 × 10^+1 | 1.07 × 10^+2 | 9.30 × 10^+2 | 1.66 × 10^+2 | 1.10 × 10^+3 | 9.47 × 10^+1 | 2.29 × 10^+1 | 8.09 × 10^+1
f13 | 3.62 × 10^+1 | 9.87 × 10^0 | 3.51 × 10^+1 | 9.11 × 10^+3 | 9.80 × 10^0 | 2.89 × 10^+2 | 9.23 × 10^+1 | 9.83 × 10^0 | 3.30 × 10^+1
f14 | 1.21 × 10^+1 | 1.62 × 10^0 | 7.68 × 10^0 | 1.66 × 10^+2 | 2.96 × 10^+1 | 1.15 × 10^+2 | 1.05 × 10^+1 | 1.26 × 10^0 | 1.09 × 10^+1
f15 | 1.57 × 10^0 | 7.88 × 10^−1 | 4.83 × 10^−1 | 3.69 × 10^0 | 2.05 × 10^0 | 4.63 × 10^−1 | 1.66 × 10^0 | 7.27 × 10^−1 | 4.74 × 10^−1
f16 | 5.97 × 10^−1 | 1.85 × 10^−1 | 2.70 × 10^−1 | 1.30 × 10^0 | 4.73 × 10^−1 | 3.87 × 10^−1 | 5.77 × 10^−1 | 1.30 × 10^−1 | 2.59 × 10^−1
f17 | 6.05 × 10^+2 | 5.63 × 10^+1 | 7.15 × 10^+2 | 1.01 × 10^+4 | 4.04 × 10^+2 | 1.60 × 10^+4 | 4.81 × 10^+2 | 7.90 × 10^+1 | 5.53 × 10^+2
f18 | 9.73 × 10^0 | 3.70 × 10^−1 | 1.74 × 10^+1 | 2.11 × 10^+4 | 2.49 × 10^+2 | 1.87 × 10^+4 | 1.10 × 10^+1 | 1.09 × 10^0 | 1.46 × 10^+1
f19 | 7.95 × 10^−1 | 3.24 × 10^−1 | 3.13 × 10^−1 | 1.29 × 10^−1 | 4.06 × 10^−1 | 3.70 × 10^−1 | 7.97 × 10^−1 | 3.86 × 10^−1 | 2.79 × 10^−1
f20 | 4.87 × 10^−1 | 4.11 × 10^−1 | 3.48 × 10^−2 | 7.39 × 10^−1 | 5.87 × 10^−1 | 8.77 × 10^−2 | 4.80 × 10^−1 | 3.98 × 10^−1 | 4.34 × 10^−2
f21 | 3.46 × 10^−1 | 2.25 × 10^−1 | 6.95 × 10^−2 | 8.38 × 10^0 | 6.51 × 10^−1 | 2.32 × 10^0 | 2.95 × 10^−1 | 2.10 × 10^−1 | 8.29 × 10^−2
f22 | 7.69 × 10^−1 | 6.22 × 10^−1 | 5.70 × 10^−2 | 1.21 × 10^0 | 9.94 × 10^−1 | 1.04 × 10^−1 | 7.46 × 10^−1 | 6.79 × 10^−1 | 4.63 × 10^−2
f23 | 8.64 × 10^−1 | 7.63 × 10^−1 | 6.44 × 10^−2 | 1.26 × 10^0 | 1.02 × 10^−1 | 1.34 × 10^−1 | 8.49 × 10^−1 | 7.59 × 10^−1 | 5.72 × 10^−2
f24 | 7.34 × 10^−1 | 6.63 × 10^−1 | 3.85 × 10^−2 | 8.77 × 10^−1 | 7.38 × 10^−1 | 7.48 × 10^−2 | 7.00 × 10^−1 | 6.12 × 10^−1 | 4.75 × 10^−2
f25 | 2.79 × 10^0 | 7.26 × 10^−1 | 1.62 × 10^0 | 8.04 × 10^0 | 3.28 × 10^0 | 1.63 × 10^0 | 3.45 × 10^0 | 7.74 × 10^−1 | 1.51 × 10^0
f26 | 8.44 × 10^−1 | 8.03 × 10^−1 | 2.31 × 10^−2 | 1.13 × 10^0 | 8.53 × 10^−1 | 1.82 × 10^−1 | 8.29 × 10^−1 | 7.78 × 10^−1 | 2.78 × 10^−2
f27 | 8.55 × 10^−1 | 7.74 × 10^−1 | 5.76 × 10^−2 | 1.02 × 10^0 | 8.28 × 10^−1 | 1.38 × 10^−1 | 8.17 × 10^−1 | 7.41 × 10^−1 | 5.19 × 10^−2
f28 | 1.74 × 10^0 | 1.09 × 10^0 | 3.76 × 10^−1 | 3.66 × 10^0 | 2.37 × 10^−1 | 6.31 × 10^−1 | 1.49 × 10^0 | 9.39 × 10^−1 | 3.34 × 10^−1
f29 | 1.12 × 10^+3 | 8.37 × 10^+1 | 1.16 × 10^+3 | 4.33 × 10^+4 | 4.15 × 10^+3 | 3.70 × 10^+4 | 3.55 × 10^+2 | 4.89 × 10^+1 | 3.14 × 10^+2
Win | 5 | 9 | 7 | 6 | 5 | 5 | 20 | 18 | 19
Lose | 21 | 18 | 20 | 22 | 22 | 22 | 9 | 11 | 10
Draw | 3 | 4 | 4 | 3 | 2 | 4 | 0 | 0 | 0

For each test function, the better value in each pairwise comparison with the EAOA approach is the smaller one in its row.

Table A3

Performance comparison of the EAOA, FPA, and PSO on the CEC 2017 test suite, with each paired comparison.

FunsFPAPSOEAOA
MeanBestStd.MeanBestStd.MeanBestStd.
| Funs | Mean | Best | Std. | Mean | Best | Std. | Mean | Best | Std. |
| f1 | 2.24 × 10−2 | 1.19 × 10−2 | 5.79 × 10−2 | 4.37 × 10−2 | 2.25 × 10−2 | 1.36 × 10−2 | 1.34 × 10−3 | 2.83 × 10−2 | 1.16 × 10−2 |
| f2 | 1.15 × 100 | 7.23 × 10−1 | 2.60 × 10−1 | 8.74 × 10−1 | 5.18 × 10−1 | 6.01 × 10−1 | 7.58 × 10−1 | 3.92 × 10−1 | 4.36 × 10−1 |
| f3 | 2.11 × 10−1 | 1.43 × 10−1 | 3.47 × 10−1 | 2.34 × 10−1 | 1.21 × 10−1 | 3.40 × 10−1 | 2.43 × 10−1 | 1.50 × 10−1 | 5.77 × 10−2 |
| f4 | 3.45 × 10−1 | 2.40 × 10−1 | 5.46 × 10−2 | 2.55 × 10−1 | 1.63 × 10−1 | 3.90 × 10−2 | 2.00 × 10−1 | 1.16 × 10−1 | 4.80 × 10−2 |
| f5 | 6.68 × 10−2 | 2.54 × 10−2 | 1.98 × 10−2 | 7.62 × 10−2 | 4.46 × 10−2 | 1.20 × 10−2 | 2.68 × 10−2 | 5.63 × 10−3 | 1.44 × 10−2 |
| f6 | 4.30 × 10−1 | 3.80 × 10−1 | 2.70 × 10−2 | 3.26 × 10−1 | 2.50 × 10−1 | 4.61 × 10−2 | 2.81 × 10−1 | 2.01 × 10−1 | 6.49 × 10−2 |
| f7 | 2.59 × 10−1 | 1.96 × 10−1 | 3.70 × 10−2 | 1.98 × 10−1 | 1.36 × 10−1 | 2.82 × 10−2 | 1.73 × 10−1 | 1.32 × 10−1 | 1.25 × 10−1 |
| f8 | 4.69 × 100 | 1.11 × 10−1 | 2.95 × 100 | 4.42 × 100 | 1.83 × 100 | 1.59 × 100 | 3.37 × 100 | 7.49 × 10−1 | 2.06 × 100 |
| f9 | 9.68 × 100 | 7.43 × 100 | 1.17 × 100 | 7.17 × 100 | 4.82 × 100 | 5.09 × 100 | 6.82 × 100 | 4.56 × 100 | 1.30 × 100 |
| f10 | 4.07 × 10−1 | 2.77 × 10−1 | 6.08 × 10−2 | 3.14 × 10−1 | 2.25 × 10−1 | 4.61 × 10−2 | 3.98 × 10−1 | 2.19 × 10−1 | 1.06 × 10−1 |
| f11 | 3.01 × 10+1 | 3.25 × 10+1 | 3.26 × 10+1 | 3.13 × 10+1 | 3.15 × 10+1 | 3.15 × 10+1 | 3.40 × 10+1 | 3.11 × 10+1 | 3.10 × 10+1 |
| f12 | 7.20 × 10+2 | 3.72 × 10+2 | 8.30 × 10+2 | 1.11 × 10+2 | 4.52 × 10+1 | 4.62 × 10+1 | 9.90 × 10+1 | 2.39 × 10+1 | 8.46 × 10+1 |
| f13 | 7.34 × 10+1 | 8.89 × 100 | 6.28 × 10+1 | 2.59 × 10+1 | 5.45 × 10−1 | 2.66 × 10+1 | 3.06 × 10+1 | 1.59 × 100 | 3.44 × 10+1 |
| f14 | 3.03 × 10+2 | 8.32 × 10+1 | 2.17 × 10+2 | 4.57 × 10+1 | 1.92 × 10+1 | 2.71 × 10+1 | 1.10 × 10+1 | 1.32 × 100 | 1.14 × 10+1 |
| f15 | 2.01 × 100 | 1.09 × 100 | 3.70 × 10−1 | 2.01 × 100 | 1.26 × 100 | 4.51 × 10−1 | 1.73 × 100 | 7.59 × 10−1 | 4.95 × 10−1 |
| f16 | 7.50 × 10−1 | 2.41 × 10−1 | 2.38 × 10−1 | 6.40 × 10−1 | 1.93 × 10−1 | 2.80 × 10−1 | 6.03 × 10−1 | 1.36 × 10−1 | 2.70 × 10−1 |
| f17 | 9.05 × 10+1 | 5.47 × 10+1 | 1.23 × 10+2 | 9.08 × 10+1 | 1.09 × 10+1 | 8.55 × 10+1 | 1.02 × 10+2 | 1.68 × 10+1 | 1.17 × 10+2 |
| f18 | 1.50 × 10+2 | 4.30 × 10+1 | 1.32 × 10+2 | 2.42 × 10+1 | 1.64 × 100 | 2.35 × 10+1 | 1.38 × 100 | 1.37 × 10−1 | 1.84 × 100 |
| f19 | 7.85 × 10−1 | 4.36 × 10−1 | 2.39 × 10−1 | 7.69 × 10−1 | 5.01 × 10−1 | 1.78 × 10−1 | 8.33 × 10−1 | 4.03 × 10−1 | 2.91 × 10−1 |
| f20 | 6.30 × 10−1 | 5.39 × 10−1 | 4.59 × 10−2 | 5.71 × 10−1 | 4.96 × 10−1 | 4.14 × 10−2 | 5.02 × 10−1 | 4.16 × 10−1 | 4.53 × 10−2 |
| f21 | 1.45 × 100 | 2.33 × 10−1 | 3.18 × 100 | 7.42 × 10−1 | 2.04 × 10−1 | 2.02 × 100 | 3.08 × 10−1 | 2.20 × 10−1 | 8.66 × 10−2 |
| f22 | 1.03 × 100 | 7.09 × 10−1 | 9.12 × 10−1 | 9.97 × 10−1 | 8.73 × 10−1 | 9.76 × 10−1 | 7.80 × 10−1 | 7.10 × 10−1 | 4.83 × 10−2 |
| f23 | 1.09 × 100 | 9.27 × 10−1 | 7.47 × 10−2 | 1.05 × 100 | 8.69 × 10−1 | 8.62 × 10−2 | 8.87 × 10−1 | 7.93 × 10−1 | 5.98 × 10−2 |
| f24 | 7.17 × 10−1 | 6.52 × 10−1 | 3.44 × 10−2 | 7.17 × 10−1 | 6.61 × 10−1 | 3.69 × 10−2 | 7.32 × 10−1 | 6.39 × 10−1 | 4.96 × 10−2 |
| f25 | 3.91 × 100 | 6.47 × 10−1 | 2.82 × 100 | 3.26 × 100 | 5.36 × 10−1 | 2.80 × 100 | 3.60 × 100 | 8.08 × 10−1 | 1.57 × 100 |
| f26 | 9.56 × 10−1 | 8.58 × 10−1 | 8.74 × 10−2 | 9.93 × 10−1 | 8.44 × 10−1 | 7.07 × 10−2 | 8.66 × 10−1 | 8.12 × 10−1 | 2.90 × 10−2 |
| f27 | 7.98 × 10−1 | 7.24 × 10−1 | 3.70 × 10−2 | 7.86 × 10−1 | 6.96 × 10−1 | 4.20 × 10−2 | 8.54 × 10−1 | 7.74 × 10−1 | 5.43 × 10−2 |
| f28 | 2.03 × 100 | 1.31 × 100 | 4.27 × 10−1 | 2.38 × 100 | 1.47 × 100 | 4.32 × 10−1 | 1.56 × 100 | 9.81 × 10−1 | 3.49 × 10−1 |
| f29 | 5.47 × 10+3 | 1.89 × 10+3 | 2.54 × 10+3 | 2.89 × 10+3 | 4.93 × 10+2 | 1.89 × 10+3 | 3.71 × 10+2 | 1.01 × 10+2 | 3.28 × 10+2 |
| Win | 5 | 5 | 6 | 7 | 7 | 10 | 18 | 18 | 13 |
| Lose | 23 | 23 | 21 | 21 | 21 | 12 | 11 | 10 | 16 |
| Draw | 3 | 3 | 2 | 1 | 1 | 1 | 0 | 1 | 0 |

The bold data values in each row of the Table are the best ones in each pair compared with the EAOA approach.

Table A4

The performance of the EAOA, MFO, and SCA algorithms on the CEC 2017 test suite, with each paired comparison.

| Funs | MFO Mean | MFO Best | MFO Std. | SCA Mean | SCA Best | SCA Std. | EAOA Mean | EAOA Best | EAOA Std. |
| f1 | 4.60 × 10−1 | 2.99 × 10−1 | 7.04 × 10−1 | 2.41 × 10−1 | 1.23 × 10−1 | 4.80 × 10−1 | 1.34 × 10−1 | 2.83 × 10−1 | 1.16 × 10−1 |
| f2 | 2.33 × 10+2 | 9.80 × 10+1 | 1.30 × 10+2 | 1.41 × 10+1 | 9.03 × 10+1 | 2.65 × 10+1 | 3.73 × 10+1 | 1.93 × 10+1 | 1.16 × 10+1 |
| f3 | 6.57 × 100 | 4.83 × 100 | 1.17 × 100 | 2.97 × 10−1 | 1.36 × 10−1 | 8.95 × 10−1 | 2.43 × 10−1 | 1.50 × 10−1 | 5.77 × 10−2 |
| f4 | 6.23 × 10−1 | 5.46 × 10−1 | 4.52 × 10−2 | 5.24 × 10−1 | 4.54 × 10−1 | 3.69 × 10−2 | 2.00 × 10−1 | 1.16 × 10−1 | 4.80 × 10−2 |
| f5 | 1.38 × 10−1 | 1.21 × 10−1 | 1.43 × 10−2 | 6.61 × 10−2 | 6.50 × 10−2 | 1.54 × 10−2 | 6.68 × 10−2 | 5.63 × 10−3 | 1.44 × 10−2 |
| f6 | 1.79 × 100 | 1.34 × 10−1 | 1.66 × 100 | 9.48 × 10−1 | 7.89 × 10−1 | 9.52 × 10−2 | 2.81 × 10−1 | 2.01 × 10−1 | 6.49 × 10−2 |
| f7 | 5.83 × 10−1 | 5.16 × 10−1 | 2.83 × 10−2 | 4.86 × 10−1 | 3.81 × 10−1 | 2.13 × 10−2 | 1.73 × 10−1 | 1.32 × 10−1 | 2.54 × 10−2 |
| f8 | 2.38 × 10−1 | 1.78 × 100 | 3.28 × 100 | 1.17 × 10+1 | 5.84 × 100 | 3.05 × 100 | 3.37 × 100 | 7.49 × 10−1 | 2.06 × 100 |
| f9 | 9.99 × 100 | 9.31 × 100 | 3.36 × 10−1 | 1.23 × 10+1 | 1.05 × 10+1 | 6.06 × 10−1 | 6.82 × 100 | 4.56 × 100 | 1.30 × 100 |
| f10 | 7.53 × 100 | 3.80 × 100 | 1.73 × 100 | 3.00 × 10−1 | 1.14 × 10−1 | 9.63 × 10−1 | 3.98 × 10−1 | 2.19 × 10−1 | 1.06 × 10−1 |
| f11 | 2.90 × 10+6 | 1.62 × 10+6 | 5.79 × 10+5 | 1.78 × 10+6 | 9.66 × 10+5 | 5.17 × 10+5 | 7.92 × 10+5 | 1.10 × 10+5 | 8.45 × 10+5 |
| f12 | 7.12 × 10+5 | 3.18 × 10+5 | 2.50 × 10+5 | 3.12 × 10+5 | 8.18 × 10+4 | 2.17 × 10+5 | 9.90 × 10+4 | 8.39 × 10+4 | 8.46 × 10+4 |
| f13 | 2.14 × 10+2 | 2.04 × 10+1 | 9.37 × 10+1 | 7.63 × 10+2 | 2.93 × 10+1 | 5.99 × 10+2 | 3.06 × 10+1 | 2.59 × 10+1 | 3.44 × 10+1 |
| f14 | 1.93 × 10+4 | 1.54 × 10+3 | 9.89 × 10+3 | 7.40 × 10+3 | 9.60 × 10+2 | 4.59 × 10+3 | 1.10 × 10+1 | 1.32 × 100 | 1.14 × 10+1 |
| f15 | 3.58 × 100 | 3.14 × 10−1 | 2.28 × 100 | 3.76 × 10−1 | 2.59 × 100 | 4.23 × 100 | 1.73 × 100 | 7.59 × 10−1 | 4.95 × 10−1 |
| f16 | 1.42 × 100 | 1.09 × 100 | 1.39 × 10−1 | 1.43 × 100 | 6.64 × 10−1 | 3.42 × 10−1 | 6.03 × 10−1 | 1.36 × 10−1 | 2.70 × 10−1 |
| f17 | 3.40 × 10+2 | 8.93 × 10+2 | 1.31 × 10+3 | 9.63 × 10+3 | 1.88 × 10+3 | 7.49 × 10+3 | 5.03 × 10+2 | 8.25 × 10+1 | 5.78 × 10+2 |
| f18 | 5.66 × 10+4 | 2.32 × 10+4 | 2.43 × 10+4 | 1.32 × 10+4 | 3.22 × 10+3 | 7.60 × 10+3 | 1.15 × 10+1 | 1.14 × 100 | 1.53 × 10+1 |
| f19 | 1.17 × 100 | 9.15 × 10−1 | 1.15 × 10−1 | 1.17 × 100 | 6.21 × 10−1 | 2.66 × 10−1 | 8.33 × 10−1 | 4.03 × 10−1 | 2.91 × 10−1 |
| f20 | 8.92 × 10−2 | 8.28 × 10−2 | 9.98 × 10−1 | 8.06 × 10−1 | 7.10 × 10−1 | 4.72 × 10−2 | 5.02 × 10−1 | 4.16 × 10−2 | 4.53 × 10−2 |
| f21 | 9.34 × 100 | 7.19 × 100 | 1.04 × 100 | 3.19 × 100 | 1.95 × 10−1 | 5.98 × 10−1 | 3.08 × 10−1 | 2.20 × 10−1 | 8.66 × 10−2 |
| f22 | 1.28 × 100 | 1.22 × 10−1 | 4.01 × 10−1 | 1.14 × 100 | 1.05 × 100 | 4.84 × 10−2 | 7.80 × 10−1 | 7.10 × 10−1 | 4.83 × 10−1 |
| f23 | 1.38 × 100 | 1.27 × 100 | 6.06 × 10−2 | 1.26 × 100 | 1.13 × 100 | 5.20 × 10−2 | 8.87 × 10−1 | 7.93 × 10−1 | 5.98 × 10−2 |
| f24 | 3.78 × 100 | 2.27 × 100 | 5.41 × 10−1 | 1.72 × 10−1 | 1.28 × 10−1 | 2.58 × 10−1 | 7.32 × 10−1 | 6.39 × 10−2 | 4.96 × 10−1 |
| f25 | 8.23 × 100 | 5.94 × 100 | 9.24 × 10−1 | 6.57 × 100 | 3.22 × 100 | 1.84 × 100 | 3.60 × 100 | 8.08 × 10−1 | 1.57 × 100 |
| f26 | 1.20 × 100 | 1.13 × 100 | 3.57 × 10−1 | 1.20 × 10−1 | 8.15 × 10−1 | 1.74 × 10−1 | 8.66 × 10−1 | 8.12 × 10−2 | 2.90 × 10−1 |
| f27 | 3.39 × 100 | 2.15 × 10−1 | 6.11 × 10−1 | 2.06 × 100 | 4.31 × 10−1 | 8.16 × 10−1 | 8.54 × 10−1 | 7.74 × 10−1 | 5.43 × 10−1 |
| f28 | 3.20 × 100 | 2.76 × 100 | 2.16 × 10−1 | 3.17 × 100 | 1.92 × 100 | 4.95 × 10−1 | 1.56 × 100 | 9.81 × 10−1 | 3.49 × 10−1 |
| f29 | 8.17 × 10+4 | 5.16 × 10+4 | 2.12 × 10+4 | 4.90 × 10+4 | 1.70 × 10+4 | 2.05 × 10+4 | 3.71 × 10+2 | 1.51 × 10+2 | 3.28 × 10+2 |
| Win | 4 | 5 | 6 | 7 | 6 | 6 | 21 | 20 | 17 |
| Lose | 23 | 21 | 21 | 21 | 22 | 23 | 8 | 9 | 14 |
| Draw | 2 | 3 | 2 | 1 | 1 | 0 | 0 | 0 | 0 |

The bold data values in each row of the Table are the best ones in each pair compared with the EAOA approach.

Table A5

Wilcoxon signed-rank test results for the pairwise comparisons between the EAOA and the other algorithms, i.e., PBA [33], WOA [36], PPSO [29], AOA [41], IFMO [35], and ESCA [40].

| Funs | PBA [33] | WOA [36] | PPSO [29] | AOA [41] | IFMO [35] | ESCA [40] | EAOA-Itself |
| f1 | 2.4018 × 10−7 | 1.4018 × 10−11 | 1.7018 × 10−11 | 1.1205 × 10−5 | 6.9641 × 10−8 | 2.5668 × 10−7 | ~N/A |
| f2 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 2.2080 × 10−7 | 7.1665 × 10−3 | 1.3749 × 10−2 | ~N/A |
| f3 | 1.4018 × 10−11 | 1.4018 × 10−11 | 8.5710 × 10−11 | 1.8717 × 10−2 | 1.8717 × 10−2 | 4.6578 × 10−3 | ~N/A |
| f4 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 4.8753 × 10−11 | 1.5456 × 10−2 | 2.7237 × 10−5 | ~N/A |
| f5 | 1.4018 × 10−11 | 1.5447 × 10−11 | 1.4018 × 10−11 | 1.8376 × 10−9 | 5.3326 × 10−5 | 4.0332 × 10−11 | ~N/A |
| f6 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 4.4659 × 10−10 | 6.5678 × 10−4 | 9.8637 × 10−4 | ~N/A |
| f7 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.7018 × 10−11 | 9.4096 × 10−11 | 1.6922 × 10−3 | 6.2370 × 10−4 | ~N/A |
| f8 | 1.4018 × 10−11 | 3.6674 × 10−11 | 1.8745 × 10−11 | 8.0856 × 10−2 | 1.0244 × 10−1 | 1.2706 × 10−2 | ~N/A |
| f9 | 4.8753 × 10−11 | 1.4018 × 10−11 | 1.2873 × 10−8 | 2.0041 × 10−9 | 3.9045 × 10−1 | 2.6004 × 10−1 | ~N/A |
| f10 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 5.8296 × 10−1 | 9.7754 × 10−1 | 1.5366 × 10−3 | ~N/A |
| f11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.5439 × 10−9 | 1.4703 × 10−1 | 3.1620 × 10−6 | ~N/A |
| f12 | 1.4018 × 10−11 | 1.4018 × 10−11 | 6.4699 × 10−11 | 1.5447 × 10−11 | 1.3749 × 10−2 | 4.1212 × 10−2 | ~N/A |
| f13 | 7.1071 × 10−11 | 7.8055 × 10−11 | 7.1071 × 10−11 | 1.2847 × 10−4 | 8.7693 × 10−1 | 8.2178 × 10−1 | ~N/A |
| f14 | 1.4018 × 10−11 | 1.4018 × 10−11 | 2.5021 × 10−11 | 1.4018 × 10−11 | 3.7194 × 10−2 | 3.6588 × 10−9 | ~N/A |
| f15 | 1.4018 × 10−11 | 1.4018 × 10−11 | 4.0332 × 10−11 | 2.9096 × 10−2 | 7.2487 × 10−1 | 7.3779 × 10−2 | ~N/A |
| f16 | 1.4018 × 10−11 | 3.7291 × 10−10 | 6.1854 × 10−1 | 2.7082 × 10−2 | 7.3779 × 10−2 | 6.3217 × 10−1 | ~N/A |
| f17 | 7.1071 × 10−11 | 1.8745 × 10−11 | 1.6408 × 10−10 | 1.4729 × 10−6 | 9.8877 × 10−1 | 7.2487 × 10−1 | ~N/A |
| f18 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 6.1228 × 10−1 | 6.4699 × 10−11 | ~N/A |
| f19 | 2.7567 × 10−6 | 5.3326 × 10−5 | 1.8183 × 10−6 | 4.8148 × 10−1 | 8.9917 × 10−1 | 4.6412 × 10−1 | ~N/A |
| f20 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.8745 × 10−11 | 1.0328 × 10−10 | 2.5189 × 10−2 | 4.3218 × 10−7 | ~N/A |
| f21 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 3.3134 × 10−1 | 7.4745 × 10−3 | 6.9641 × 10−8 | ~N/A |
| f22 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 2.7539 × 10−11 | 3.7194 × 10−2 | 2.5021 × 10−11 | ~N/A |
| f23 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 9.4096 × 10−11 | 2.2599 × 10−1 | 2.5970 × 10−9 | ~N/A |
| f24 | 1.4018 × 10−11 | 1.4018 × 10−11 | 7.8055 × 10−11 | 2.0014 × 10−1 | 1.8717 × 10−2 | 1.7206 × 10−1 | ~N/A |
| f25 | 2.2729 × 10−11 | 1.3989 × 10−7 | 1.3643 × 10−10 | 4.4711 × 10−1 | 8.2178 × 10−1 | 3.1751 × 10−1 | ~N/A |
| f26 | 1.4018 × 10−11 | 3.9843 × 10−9 | 3.0304 × 10−11 | 1.1106 × 10−7 | 2.8074 × 10−2 | 2.3679 × 10−10 | ~N/A |
| f27 | 1.4018 × 10−11 | 9.4096 × 10−11 | 5.8443 × 10−10 | 2.1213 × 10−5 | 5.9218 × 10−4 | 2.2408 × 10−6 | ~N/A |
| f28 | 1.4018 × 10−11 | 2.2729 × 10−11 | 1.4018 × 10−11 | 7.1825 × 10−5 | 2.0181 × 10−2 | 4.3379 × 10−9 | ~N/A |
| f29 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 1.4018 × 10−11 | 6.2370 × 10−4 | 7.8055 × 10−11 | ~N/A |
| Avg. | 6.5517 | 5.7241 | 5.6207 | 3.6207 | 2.5138 | 2.5483 | 2.25204 |
| Rank | 7 | 6 | 5 | 4 | 2 | 3 | 1 |

The bold data values in each row of the Table are the best ones in each pair compared with the EAOA approach.

Figure A1

The EAOA’s convergence curves represented graphically and compared with those of the GA, AOA, SA, FPA, PSO, MFO, and SCA algorithms for the selected functions. The green line marks the worst-performing method as a background reference; here, the green line corresponds to the GA.

Figure A1 compares the convergence curves of the EAOA with those of the ESCA [40], IFMO [35], AOA [41], PPSO [29], WOA [36], and PBA [33] algorithms for the selected functions. The Y-axis represents the average, over 25 runs, of the best output found by each algorithm so far; the X-axis shows the iteration (generation) of the search. The figure shows that the EAOA curve converges faster than those of the other algorithms. Furthermore, for another view of the evaluation of the proposed approach, we applied the Wilcoxon signed-rank test to rank the outcomes, comparing the EAOA's results pairwise against the other enhanced methods, i.e., the PBA, WOA, PPSO, AOA, IFMO, and ESCA algorithms. Table A5 lists the resulting p-values of these pairwise comparisons. The bold-highlighted results in Table A5 are the outcomes with p < 0.05. Most values are below 0.05, indicating that the optimization results of the EAOA differ significantly from those of the other algorithms. The EAOA's average ranking value of 2.25204 is the lowest, so its output is superior to that of the other algorithms. In general, the proposed EAOA can compete with some of the other popular algorithms.
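As an illustration of the test behind Table A5 (a generic sketch, not the authors' implementation), the two-sided Wilcoxon signed-rank statistic for a pair of per-function result lists can be computed in pure Python, with a normal approximation for the p-value; the per-function error values in the demo are made-up numbers:

```python
import math

def wilcoxon_signed_rank(a, b):
    """Two-sided Wilcoxon signed-rank test (normal approximation).

    Returns (W, p), where W is the smaller of the positive/negative
    rank sums over the nonzero paired differences. The normal
    approximation is adequate for roughly n >= 10 pairs.
    """
    diffs = [x - y for x, y in zip(a, b) if x != y]  # drop zero differences
    n = len(diffs)
    # Rank the absolute differences, averaging ranks over ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_plus, w_minus)
    # Normal approximation for the two-sided p-value.
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w - mean) / sd
    p = math.erfc(abs(z) / math.sqrt(2))
    return w, p

# Toy demo: per-function errors of two algorithms (illustrative numbers).
eaoa_err = [0.13, 0.76, 0.24, 0.20, 0.03, 0.28]
rival_err = [0.46, 1.15, 6.57, 0.62, 0.14, 1.79]
w, p = wilcoxon_signed_rank(eaoa_err, rival_err)
print(w, p)
```

Feeding it the EAOA's and a rival's per-function errors yields a p-value; values below 0.05 indicate a significant difference, as in Table A5.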

4. Optimal WSN Node Coverage Based on EAOA

This section demonstrates how the EAOA algorithm can be used to deploy a WSN with the best possible node coverage, followed by subsections covering the main processing stages and an analysis and discussion of the findings.

4.1. Optimal Node Coverage Strategy

A feasible solution to the optimal node coverage problem is a deployment of the nodes, each with a limited sensing radius: a sensor can only sense and detect targets within its sensing radius. Assuming that all nodes have the same sensing radius Rs, any point in the monitoring area is covered if it is located within the sensing radius of at least one sensor node. The monitoring area is thus divided into the coverage area and the blind spots: every point in the coverage area is covered by at least one sensor node, and the blind spots are the complement of the coverage area. Some applications need to monitor events with high accuracy; in that case, any point in the coverage area must lie within the sensing radius of at least two nodes simultaneously, or it is regarded as a blind spot, which we call double-coverage. The location-seeking process of the nodes is abstracted as the varied movement behaviors of the object group toward food or a specific site. The purpose of WSN coverage optimization using the EAOA approach is to maximize the coverage of the target monitoring area with a limited number of sensor nodes by optimizing their deployment locations. Let f be the objective function of the WSN nodes' coverage optimization; the coverage ratio, i.e., the ratio of the sensed portion of the network's deployed 2D monitoring area to the whole area, is used as the objective function and is maximized according to Equation (4), where Cov and p denote the coverage ratio of the WSN nodes and the probability that a target point is sensed by a node in the 2D monitoring network's deployed area, respectively.
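The coverage-ratio objective just described can be approximated numerically by probing the area on a grid; the following minimal Python sketch (not the paper's code; the grid step, area size, and sensing radius are assumed values) counts the fraction of probe points that lie within the sensing radius of at least one node:

```python
import math
import random

def coverage_ratio(nodes, width, length, r_sense, step=1.0):
    """Fraction of grid probe points covered by at least one sensor.

    A probe point is 'covered' when it lies within the sensing radius
    r_sense of some node (Boolean disk model); uncovered points are
    the blind spots.
    """
    covered = total = 0
    y = 0.0
    while y <= length:
        x = 0.0
        while x <= width:
            total += 1
            if any(math.hypot(x - nx, y - ny) <= r_sense for nx, ny in nodes):
                covered += 1
            x += step
        y += step
    return covered / total

# Random deployment of 20 nodes in a 40 m x 40 m area with Rs = 10 m
# (parameter values assumed for illustration).
random.seed(0)
nodes = [(random.uniform(0, 40), random.uniform(0, 40)) for _ in range(20)]
print(coverage_ratio(nodes, 40, 40, r_sense=10.0))
```

A double-coverage variant would simply require at least two nodes within range of each probe point instead of one.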
Each individual object in the algorithm represents a coverage distribution. The specific steps of the algorithm scheme for the coverage optimization are as follows:

Step 1: Input parameters such as the number of nodes, the perception radius, the area of the region, etc.
Step 2: Set the parameters of the population size N, the maximum number of iterations max_Iter, the density factor, and the prey attraction, and randomly initialize the objects' positions using Equations (5)–(7).
Step 3: Enhance the initialized population (the parameters of Equations (8)–(10), (14), and (15)) and calculate the objective function for the initial coverage according to Equation (18).
Step 4: Update the positions of the objects and the strategy according to Equation (17), then compare them to select the best fitness value according to the objective function value.
Step 5: Calculate the individual values of the objects and retain the optimal global best solution.
Step 6: Determine whether the end condition is reached; if yes, proceed to the next step; otherwise, return to Step 4.
Step 7: The program ends and outputs the optimal fitness value and the objects' best locations, representing the node's optimal coverage rate.
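The steps above can be sketched as a generic population loop. This is a simplified reading, not the paper's exact equations: the AOA density/volume transfer operators are replaced by a shrinking random walk toward the best object, the reverse-learning enhancement is rendered as an opposition-based candidate (lo + hi − x), and all parameter names are illustrative:

```python
import random

def eaoa_sketch(objective, dim, lo, hi, pop_size=20, max_iter=100):
    """Minimal population-based search with an opposition-based
    (reverse-learning) candidate per individual, in the spirit of the
    EAOA scheme; the true AOA update equations are simplified to a
    random walk toward the current best solution (minimization)."""
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=objective)
    for t in range(max_iter):
        w = 1.0 - t / max_iter  # step size shrinks over the iterations
        for i, x in enumerate(pop):
            # Simplified move toward the current best, clamped to bounds
            # (stand-in for the AOA position-update equations).
            cand = [min(max(xi + w * random.uniform(-1, 1) * (bi - xi), lo), hi)
                    for xi, bi in zip(x, best)]
            # Reverse learning: also evaluate the opposite point and
            # keep whichever of the two candidates is better.
            opp = [lo + hi - ci for ci in cand]
            cand = min(cand, opp, key=objective)
            if objective(cand) < objective(x):
                pop[i] = cand
        best = min(pop + [best], key=objective)
    return best

# Illustrative run on a sphere function as a stand-in objective
# (the coverage-ratio objective would be plugged in the same way).
random.seed(1)
def sphere(v):
    return sum(c * c for c in v)

best = eaoa_sketch(sphere, dim=5, lo=-5, hi=10)
print(sphere(best))
```

Plugging in the negated coverage ratio as `objective`, with each individual encoding the concatenated node coordinates, reproduces the scheme of Steps 1–7.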

4.2. Analysis and Discussion of Results

The scenarios assume that the WSN's sensor nodes are deployed in a square monitoring area of W × L, with scenario areas of, e.g., 40 m × 40 m, 80 m × 80 m, 100 m × 100 m, and 160 m × 160 m. Table 2 lists the experimental parameters of the WSN node deployment areas. The sensing radius of the sensor nodes is set to 10 m, and the communication radius to 20 m. The number of sensor nodes, denoted by M, is 20, 40, 50, or 60. Iter indicates the number of iterations, which may be set to 500, 1000, or 1500.
Table 2

The parameter settings for the desired WSN node deployment areas.

| Description | Parameters | Values |
| Desired deployment areas | W × L | 40 m × 40 m, 80 m × 80 m, 100 m × 100 m, 160 m × 160 m |
| Sensing radius | Rs | 15 m |
| Communication radius | Rc | 20 m |
| Number of sensor nodes | M | 20, 40, 50, 60 |
| Number of iterations | Iter | 500, 1000, 1500 |
The optimal results of the EAOA were compared with the other selected schemes—i.e., the SSA [44], PSO [45], GWO [46], SCA [47], and AOA [48]—for the coverage optimization of WSN node deployment to verify the adequate performance of the algorithm. Figure 1 displays a graphical diagram of the nodes’ initialization with the EAOA for the statistical coverage optimization scheme with different numbers of sensor nodes: (a) 20, (b) 40, (c) 50, and (d) 60.
Figure 1

The graphical initialization of the EAOA with the statistical node coverage optimization scheme for different numbers of sensor nodes: (a) 20, (b) 40, (c) 50, and (d) 60.

Table 3 compares the proposed EAOA approach with the other strategies, i.e., the SSA, PSO, GWO, SCA, and AOA algorithms, in terms of percentage coverage rate, running time, iterations to convergence, and monitoring area size. The EAOA scheme produces the best global solution in the coverage areas, with a high coverage rate over the whole monitored area and a faster runtime than the other approaches.
Table 3

Comparison of the proposed EAOA method with the other techniques used, i.e., the SSA, PSO, GWO, SCA, and AOA algorithms, in terms of percentage coverage rate, running time, iterations to convergence, and monitoring area size.

| Approach | Factor variables | 40 m × 40 m | 80 m × 80 m | 100 m × 100 m | 160 m × 160 m |
| SSA | Coverage rate (%) | 78 | 74 | 77 | 74 |
|  | Consumed execution time (s) | 3.09 | 6.91 | 7.38 | 9.34 |
|  | No. of iterations to convergence | 145 | 256 | 234 | 844 |
|  | WSN node numbers | 20 | 40 | 50 | 60 |
| PSO | Coverage rate (%) | 79 | 77 | 79 | 76 |
|  | Consumed execution time (s) | 2.78 | 6.22 | 6.65 | 8.41 |
|  | No. of iterations to convergence | 396 | 343 | 578 | 754 |
|  | WSN node numbers | 20 | 40 | 50 | 60 |
| GWO | Coverage rate (%) | 80 | 80 | 84 | 78 |
|  | Consumed execution time (s) | 3.06 | 6.84 | 7.31 | 9.25 |
|  | No. of iterations to convergence | 33 | 444 | 544 | 755 |
|  | WSN node numbers | 20 | 40 | 50 | 60 |
| SCA | Coverage rate (%) | 78 | 79 | 82 | 78 |
|  | Consumed execution time (s) | 2.92 | 6.29 | 7.23 | 9.22 |
|  | No. of iterations to convergence | 445 | 555 | 665 | 876 |
|  | WSN node numbers | 20 | 40 | 50 | 60 |
| AOA | Coverage rate (%) | 80 | 79 | 80 | 79 |
|  | Consumed execution time (s) | 3.12 | 6.98 | 7.46 | 9.44 |
|  | No. of iterations to convergence | 665 | 333 | 563 | 954 |
|  | WSN node numbers | 20 | 40 | 50 | 60 |
| EAOA | Coverage rate (%) | 80 | 82 | 87 | 80 |
|  | Consumed execution time (s) | 2.75 | 6.15 | 6.57 | 8.31 |
|  | No. of iterations to convergence | 135 | 503 | 556 | 765 |
|  | WSN node numbers | 20 | 40 | 50 | 60 |
Figure 2 shows the graphical coverage of six different metaheuristic algorithms, i.e., the AOA, SSA, PSO, GWO, SCA, and EAOA approaches, for the WSN node area deployment scenarios under the same density and environmental settings. Because the EAOA algorithm can avoid premature convergence, its coverage rate is reasonably high, with less overlap, and it adjusts the node configuration for the monitoring area's network coverage better than its competitors. The differences in the coverage distributions are small, so the graphics look very similar. Furthermore, Figure 3 and Figure 4 show that the convergence curves of the proposed EAOA approach reach higher percentages of statistical coverage than the other methods used.
Figure 2

The graphical coverage of six different metaheuristic algorithms for the WSN node area deployment. (a) AOA, (b) EAOA, (c) GWO, (d) PSO, (e) SSA, (f) SCA algorithms.

Figure 3

Comparison of the optimal coverage rates of the EAOA with the other schemes in different-sized WSN monitoring node area deployment scenarios. (a) 160 m × 160 m, (b) 100 m × 100 m, (c) 80 m × 80 m, and (d) 40 m × 40 m.

Figure 4

Comparison of the EAOA optimization coverage rates for various sensor node counts deployed in the 2D monitoring of a 100 m × 100 m area.

Figure 3 shows four different sizes of WSN monitoring node area deployment scenarios for the metaheuristic approaches' optimal coverage rates; the convergence curves of the proposed EAOA approach reach higher percentages of statistical coverage than the other methods used. Figure 4 compares the coverage rate of the EAOA against the SSA, PSO, GWO, SCA, and AOA algorithms for different sensor node counts deployed in the 2D monitoring area. Under the same test conditions, the EAOA provides a reasonably high coverage rate, with less overlap and better adjustment of the sensor nodes' configuration than the average of the other schemes.

5. Conclusions

This paper suggests an enhanced Archimedes optimization algorithm (EAOA) to solve the uneven distribution and low coverage of wireless sensor network (WSN) nodes in random deployment. Node coverage optimization based on the EAOA was applied to each divided sub-area of the entire WSN's monitoring area. The objective function of the optimal node coverage was modeled mathematically from the distances between nodes, each sensor node's sensing radius, and its communication capability in the deployed WSN. The optimization results of multiple sub-areas were fused, combining the sub-areas' coverage into the complete network's node coverage via a mapping mechanism. The update equations of the EAOA were modified with reverse-learning and multidirection strategies to avoid the original AOA's drawbacks, e.g., slow convergence and a tendency to fall into local extrema in complicated situations. The comparative results on the selected benchmark functions and the WSN node coverage problem show that the proposed EAOA produces effective optimal solutions for both the coverage and benchmark problems. In future works, the suggested algorithm will be applied to WSN node localization [49,50] and optimal WSN deployment [51,52].