
Computing the Entropy Measures for the Line Graphs of Some Chemical Networks.

Muhammad Farhan Hanif1, Hasan Mahmood1,2, Shazia Manzoor3, Fikre Bogale Petros4.   

Abstract

Chemical graph entropy plays a significant role in measuring the complexity of chemical structures, with explicit applications in chemistry, biology, and the information sciences. A molecular structure of a compound consists of many atoms; in particular, hydrocarbons are chemical compounds consisting of carbon and hydrogen atoms. In this article, we discuss the concept of the subdivision of chemical graphs and their corresponding line chemical graphs. More precisely, we discuss the properties of chemical graph entropies and then construct the chemical structures, namely the triangular benzenoid, the hexagonal parallelogram, and the zigzag-edge coronoid fused with starphene. We also estimate the degree-based entropies with the help of the line graphs of the subdivisions of the above-mentioned chemical graphs.
Copyright © 2022 Muhammad Farhan Hanif et al.

Year:  2022        PMID: 36248955      PMCID: PMC9560836          DOI: 10.1155/2022/2006574

Source DB:  PubMed          Journal:  Comput Intell Neurosci


1. Introduction

Mathematical chemistry is a field of theoretical chemistry that uses mathematical approaches to discuss molecular structure without necessarily referring to quantum mechanics [1]. Chemical Graph Theory is a branch of mathematical chemistry in which a chemical phenomenon is described theoretically using graph theory [2, 3]. The growth of the organic disciplines has been aided by Chemical Graph Theory [4, 5]. In mathematical chemistry, graph invariants or topological indices are numeric quantities that describe various essential features of organic components and are derived from an analogous molecular graph [6, 7]. Degree-based indices are among the topological indices used to predict bioactivity, boiling point, draining energy, stability, and physico-chemical properties of certain chemical compounds [8, 9]. Due to their chemical applications, these indices play a significant role in theoretical chemistry. Zhang et al. [10-12] discussed the topological indices of generalized bridge molecular graphs, carbon nanotubes, and products of chemical graphs. Zhang et al. [13-15] provided the physical analysis of heat of formation and entropy of ceria oxide. For further study about indices, see [16, 17].

Shannon [18] originated the conception of information entropy in communication theory. However, it was later discovered to be a quantity applicable to any object with a set nature [19, 20], including molecular graphs [21-23]. In chemistry, information entropy is now used in two modes. Firstly, it is a structural descriptor for assessing the complexity of chemical structures [24]. Information entropy is useful in this regard for connecting structural and physico-chemical features [25], numerically distinguishing isomers of organic molecules [26], and classifying natural products and synthetic chemicals [27, 28]. The physico-chemical sounding of information entropy is a different mode of application.
As a result, Terenteva and Kobozev demonstrated its utility in analyzing physico-chemical processes that simulate information transmission [29]. Zhdanov [30] used entropy values to study the chemical processes of organic compounds. The information entropy is defined as

ENT_Λ(ℱ) = −∑_{lm∈E(ℱ)} (Λ(lm)/∑_{l′m′∈E(ℱ)} Λ(l′m′)) log(Λ(lm)/∑_{l′m′∈E(ℱ)} Λ(l′m′)). (1)

Here, the logarithm is taken with base e, while V(ℱ), E(ℱ), and Λ(lm) represent the vertex set, the edge set, and the weight of the edge lm in ℱ, respectively. Many graph entropies have been calculated in the literature utilizing characteristic polynomials, vertex degrees, and graph order [31-34]. Graph entropies based on independent sets, matchings, and the degrees of vertices [35] have been estimated in recent years. Dehmer and Mowshowitz proposed several graph complexity and Hosoya entropy relationships [23, 32, 36, 37]. For further study, see [19, 21, 38–42, 59, 60].

The graph ℱ is structured as an ordered pair, with one object referred to as the vertex set V(ℱ) and the other as the edge set E(ℱ). Two vertices of ℱ are said to be adjacent when they share an edge. The degree of a vertex l is represented by d_l, and the sum of the degrees of all vertices adjacent to l is denoted by A_l. The subdivision graph S(ℱ) is formed by replacing each edge of ℱ with a path of length two. The line graph, denoted by L(ℱ), is the graph in which |V(L(ℱ))| = |E(ℱ)| and two vertices of L(ℱ) are adjacent iff their corresponding edges share a common end point in ℱ.
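The subdivision and line-graph constructions just described are easy to experiment with. The following sketch is ours (not from the paper); it uses plain Python with edge lists, and the function names are our own:

```python
from itertools import combinations

def subdivision(edges):
    """S(G): replace every edge (u, v) of G with a path u - w - v of
    length two, where w is a fresh subdivision vertex."""
    new_edges = []
    for i, (u, v) in enumerate(edges):
        w = ("s", i)                      # fresh vertex for edge i
        new_edges += [(u, w), (w, v)]
    return new_edges

def line_graph(edges):
    """L(G): vertices are the edges of G; two vertices are adjacent iff
    the corresponding edges of G share a common end point."""
    return [(e, f) for e, f in combinations(edges, 2) if set(e) & set(f)]

# Example: the 6-cycle C6 (benzene skeleton). S(C6) is the cycle C12,
# and the line graph of C12 is again a 12-cycle, so both have 12 edges.
c6 = [(i, (i + 1) % 6) for i in range(6)]
s_edges = subdivision(c6)
lg_edges = line_graph(s_edges)
print(len(s_edges), len(lg_edges))   # 12 12
```

Note that |V(L(S(G)))| = |E(S(G))| = 2|E(G)|, which matches the vertex counts used for the three families below.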

1.1. Randić Entropy [43, 44]

If Λ(lm) = (d_l d_m)^α, then the Randić index is

R_α(ℱ) = ∑_{lm∈E(ℱ)} (d_l d_m)^α. (2)

Substituting this weight into (1) gives the Randić entropy:

ENT_{R_α}(ℱ) = log R_α(ℱ) − (1/R_α(ℱ)) ∑_{lm∈E(ℱ)} (d_l d_m)^α log((d_l d_m)^α). (3)

1.2. Atom Bond Connectivity Entropy [45]

If Λ(lm) = √((d_l + d_m − 2)/(d_l d_m)), then the atom bond connectivity index is

ABC(ℱ) = ∑_{lm∈E(ℱ)} √((d_l + d_m − 2)/(d_l d_m)). (4)

Thus (1) is converted into the following form:

ENT_ABC(ℱ) = log ABC(ℱ) − (1/ABC(ℱ)) ∑_{lm∈E(ℱ)} Λ(lm) log Λ(lm). (5)

1.3. The Geometric Arithmetic Entropy [43, 44]

If Λ(lm) = 2√(d_l d_m)/(d_l + d_m), then the geometric arithmetic index is

GA(ℱ) = ∑_{lm∈E(ℱ)} 2√(d_l d_m)/(d_l + d_m). (6)

Now (1) takes the form given below:

ENT_GA(ℱ) = log GA(ℱ) − (1/GA(ℱ)) ∑_{lm∈E(ℱ)} Λ(lm) log Λ(lm). (7)

1.4. The Fourth Atom Bond Connectivity Entropy [35]

If Λ(lm) = √((A_l + A_m − 2)/(A_l A_m)), where A_l is the degree sum of the neighbors of l, then the fourth atom bond connectivity index is

ABC4(ℱ) = ∑_{lm∈E(ℱ)} √((A_l + A_m − 2)/(A_l A_m)). (8)

Now (1) is converted into the following form:

ENT_ABC4(ℱ) = log ABC4(ℱ) − (1/ABC4(ℱ)) ∑_{lm∈E(ℱ)} Λ(lm) log Λ(lm). (9)

1.5. The Fifth Geometric Arithmetic Entropy [35]

If Λ(lm) = 2√(A_l A_m)/(A_l + A_m), then the fifth geometric arithmetic index is

GA5(ℱ) = ∑_{lm∈E(ℱ)} 2√(A_l A_m)/(A_l + A_m). (10)

Equation (1) now changes to the following form, known as the fifth geometric arithmetic entropy:

ENT_GA5(ℱ) = log GA5(ℱ) − (1/GA5(ℱ)) ∑_{lm∈E(ℱ)} Λ(lm) log Λ(lm). (11)

See [35, 44] for further information on these entropy measures.
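All five measures specialize equation (1) with a different edge weight Λ. Below is a minimal sketch (ours, not the paper's code; the names are our own) that evaluates (1) for any list of edge weights:

```python
from math import log, sqrt

def graph_entropy(weights):
    """Equation (1): normalize the edge weights to a probability
    distribution and take its Shannon entropy (natural logarithm)."""
    total = sum(weights)
    return -sum(w / total * log(w / total) for w in weights)

# Edge weights for an edge whose ends have degrees (or degree sums) a, b:
randic = lambda a, b, alpha: (a * b) ** alpha       # Randic weight
abc    = lambda a, b: sqrt((a + b - 2) / (a * b))   # atom bond connectivity
ga     = lambda a, b: 2 * sqrt(a * b) / (a + b)     # geometric arithmetic

# When all weights coincide (e.g. on a cycle, where every degree is 2),
# each entropy collapses to log |E|.
ws = [ga(2, 2) for _ in range(12)]
print(round(graph_entropy(ws), 4))   # 2.4849 (= log 12)
```

This uniform-weight limit is the reason several table entries below equal log 12 ≈ 2.4849.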

2. Formation of Triangular Benzenoid Tx, ∀x ∈ ℕ

Triangular benzenoids are a group of benzenoid molecular graphs and are denoted by Tx, where x characterizes the number of hexagons at the bottom of the graph and (1/2)x(x + 1) represents the total number of hexagons in Tx. Triangular benzenoids are a generalization of the benzene molecule C6H6, with benzene rings forming a triangular shape. The benzene molecule is a common molecule in physics, chemistry, and the nanosciences, and it is quite fruitful for synthesizing aromatic chemicals [46]. Raut [47] calculated some topological indices for the triangular benzenoid system. Hussain et al. [48] discussed the irregularity determinants of some benzenoid systems. Kwun [49] calculated degree-based indices by using M-polynomials. For further details, see [50, 51]. The hexagons are placed in rows, with each row increasing by one hexagon. For T1, there is only one type of edge, e1 = (2,2), with |e1| = 6. Therefore, |V(T1)| = 6 and |E(T1)| = 6, while there are three kinds of edges in T2, namely e1 = (2,2), e2 = (2,3), and e3 = (3,3), with |e1| = 6, |e2| = 6, and |e3| = 3. Therefore, |V(T2)| = 13 and |E(T2)| = 15. Continuing in this way, |V(Tx)| = x² + 4x + 1 and |E(Tx)| = (3/2)x(x + 3). The subdivision graph of Tx and its line graph are demonstrated in Figure 1. It is to be noted that |V(L(S(Tx)))| = 3x(x + 3) and |E(L(S(Tx)))| = (3/2)(3x² + 7x − 2).
Figure 1

(a) Triangular benzenoid T5, (b) Subdivision of T5,  (c) The line graph of subdivision graph of T5.

Let ℱ = L(S(Tx)), i.e., ℱ is the line graph of the subdivision graph of the triangular benzenoid Tx. We will use the edge partition and vertex counting technique to compute the abstracted indices and entropies. The edge partition of ℱ is based on the degrees of the terminal vertices of each edge. It is easy to see that there are only three types of edges, as shown in Table 1.
Table 1

Edge partition of L(S(T)).

(dl, dm)    Ni    Set of edges
(2, 2)    3(x + 3)    E1
(2, 3)    6(x − 1)    E2
(3, 3)    (3/2)(3x² + x − 4)    E3
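As a quick consistency check (ours, not in the paper), the three class sizes must add up to |E(L(S(Tx)))| = (3/2)(3x² + 7x − 2); the totals balance with 3(x + 3) edges in the (2,2) class:

```python
# Edge classes of L(S(Tx)) and the total edge count they must reproduce.
def partition_total(x):
    e1 = 3 * (x + 3)                    # (2,2) edges
    e2 = 6 * (x - 1)                    # (2,3) edges
    e3 = 3 * (3 * x**2 + x - 4) // 2    # (3,3) edges
    return e1 + e2 + e3

for x in range(1, 11):
    assert partition_total(x) == 3 * (3 * x**2 + 7 * x - 2) // 2
print("Table 1 is consistent for x = 1..10")
```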

2.1. Entropy Measure for L(S(T))

We'll calculate the entropies of ℱ=L(S(T)) in this section.

2.1.1. Randić Entropy of L(S(T))

With the help of Table 1 and equation (3), the Randić index and entropy are computed for α = 1, −1, 1/2, −1/2. By putting α = 1, −1, 1/2, −1/2 in (3), we get the Randić entropies given below:

2.1.2. The ABC Entropy of L(S(T))

The ABC index and entropy measure, obtained with the help of Table 1 and equation (5), are:

2.1.3. The Geometric Arithmetic Entropy of L(S(T))

The GA index and entropy measure, obtained with the help of Table 1 and equation (7), are:

2.1.4. The ABC4 Entropy of L(S(T))

The edge partition of the graph L(S(T)) is based on the degree sums of the terminal vertices of each edge, as shown in Table 2.
Table 2

Edge partition of L(S(T)).

(Al, Am)    Ni    Set of edges
(4, 4)    9    E1
(4, 5)    6    E2
(5, 5)    3(x − 2)    E3
(5, 8)    6(x − 1)    E4
(8, 8)    3(x − 1)    E5
(8, 9)    6(x − 1)    E6
(9, 9)    (3/2)(3x² − 5x + 2)    E7
After simple calculations, using Table 2 subject to the condition that x ≠ 1, we get the ABC4 index; by using (9), the ABC4 entropy follows. If we consider x = 1, then the index and entropy reduce to constant values, with ENT(ℱ) = 2.1972.

2.1.5. The GA5 Entropy of L(S(T))

After some simple calculations, the GA5 index may be calculated using Table 2 under the constraint that x ≠ 1. Therefore, (11), with Table 2, is converted into the form:

3. Formation of Hexagonal Parallelogram Nanotubes H(x, y), ∀x, y ∈ ℕ

Hexagonal parallelogram nanotubes are formed by arranging hexagons in a parallelogram fashion. Baig et al. [52] computed counting polynomials of benzenoid carbon nanotubes; see also [53]. We denote this structure by H(x, y), ∀x, y ∈ ℕ, in which x and y represent the number of hexagons in any row and column, respectively. The order and size of H(x, y) are 2(x + y + xy) and 3xy + 2x + 2y − 1, respectively. The subdivision graph of H(x, y) and its line graph are shown in Figure 2; see [46]. Let ℱ = L(S(H(x, y))); then |V(ℱ)| = 2(3xy + 2x + 2y − 1) and |E(ℱ)| = 9xy + 4x + 4y − 5. To compute our results, we will use the edge partition technique, which is based on the degrees of the terminal vertices of each edge. It is to be noted that there are only three types of edges; see Figure 2. The edge partition of the chemical graph L(S(H(x, y))) depending on the degrees of the terminal vertices is presented in Table 3.
Figure 2

(a) Hexagonal parallelogram H(x, y), (b) Subdivision of H(x, y),  (c) The line graph of subdivision graph of H(x, y).

Table 3

Edge partition of L(S(H(x, y))).

(dl, dm)    Ni    Kinds of edges
(2, 2)    2(x + y + 4)    E1
(2, 3)    4(x + y − 2)    E2
(3, 3)    9xy − 2x − 2y − 5    E3
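A quick numerical check (ours) that the three edge classes add up to |E(L(S(H(x, y))))| = 9xy + 4x + 4y − 5:

```python
# Edge classes of L(S(H(x, y))) and the total they must reproduce.
def partition_total(x, y):
    e1 = 2 * (x + y + 4)                 # (2,2) edges
    e2 = 4 * (x + y - 2)                 # (2,3) edges
    e3 = 9 * x * y - 2 * x - 2 * y - 5   # (3,3) edges
    return e1 + e2 + e3

assert all(partition_total(x, y) == 9 * x * y + 4 * x + 4 * y - 5
           for x in range(1, 8) for y in range(1, 8))
print("Table 3 is consistent for 1 <= x, y <= 7")
```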

3.1. Entropy Measure for L(S(H(x, y)))

We will enumerate the entropies of ℱ=L(S(H(x, y))) in this section.

3.1.1. Randić Entropy of ℱ

Using Table 3, the Randić index for α = 1, −1, 1/2, −1/2 is obtained. So (3), with Table 3, gives the Randić entropy and is converted into the form below. Now, substituting α = 1, −1, 1/2, −1/2 in (20), we get the Randić entropies given below:

3.1.2. The ABC Entropy of ℱ

With the use of Table 3 and equation (5), we can calculate the ABC index and entropy measure as follows. Therefore, equation (5), with Table 3, becomes the following, called the atom bond connectivity entropy:

3.1.3. The Geometric Arithmetic Entropy of ℱ

We can calculate the GA index and entropy measure using Table 3 and equation (7) as follows:

3.1.4. The ABC4 Entropy of ℱ

Case 1.

When x > 1 and y ≠ 1, the edge partition of L(S(H(x, y))) is shown in Table 4.
Table 4

Edge partition of L(S(H(x, y))).

(Al, Am)    Ni    Kinds of edges
(4, 4)    8    E1
(4, 5)    8    E2
(5, 5)    2(x + y − 4)    E3
(5, 8)    4(x + y − 2)    E4
(8, 8)    2(x + y − 2)    E5
(8, 9)    4(x + y − 2)    E6
(9, 9)    9xy − 8x − 8y + 7    E7
Therefore, the ABC4 index and entropy measure are obtained with the help of Table 4 and equation (9). Since ℱ has seven kinds of edges, (9), using Table 4, is converted into the form:

Case 2.

When x = 1 and y ≠ 1, by using the same process, we get the closed expressions for the ABC4 index and ABC4 entropy as:

3.1.5. The Fifth Geometric Arithmetic Entropy of ℱ

Case 3.

When x > 1 and y ≠ 1, the fifth geometric arithmetic entropy can be estimated by using (11) and Table 4 in the following manner. So (11), with Table 4, can be written as:

Case 4.

When x = 1 and y ≠ 1, by using Table 5 and (11), we get the closed expressions for the GA5 index and GA5 entropy as:
Table 5

Edge partition of L(S(H(x, y))), for x=1.

(Al, Am)    Ni    Kinds of edges
(4, 4)    10    E1
(4, 5)    4    E2
(5, 5)    2(y − 2)    E3
(5, 8)    4(y − 1)    E4
(8, 8)    2(y − 1)    E5
(8, 9)    4(y − 1)    E6
(9, 9)    y − 1    E7

4. Formation from Fusion of Zigzag-Edge Coronoid with Starphene ZCS(x, y, z) Nanotubes

If a zigzag-edge coronoid ZC(x, y, z) is fused with a starphene St(x, y, z), then we obtain a composite benzenoid. It is to be noted that |V(ZCS(x, y, z))| = 12(x + y + z) − 54 and |E(ZCS(x, y, z))| = 15(x + y + z) − 63. The subdivision graph of ZCS(x, y, z) and its line graph are illustrated in Figure 3. We can see from the figures that the order and the size of the line graph of the subdivision graph of ZCS(x, y, z) are 30(x + y + z) − 126 and 39(x + y + z) − 153, respectively [46]. Let ℱ represent the line graph of the subdivision graph of ZCS(x, y, z). The edge partition is determined by the degrees of the terminal vertices of each edge; Table 6 illustrates this.
Figure 3

(a) ZCS(4,4,4), (b) subdivision of ZCS(4,4,4),  (c) L(S(ZCS(4,4,4))).

Table 6

Edge partition of L(S(ZCS)).

(dl, dm)    Ni    Kinds of edges
(2, 2)    6(x + y + z − 5)    E1
(2, 3)    12(x + y + z − 7)    E2
(3, 3)    21(x + y + z) − 39    E3

4.1. Entropy Measure for L(S(ZCS(x, y, z)))

We'll calculate the entropies of ℱ=L(S(ZCS(x, y, z))) in this section.

4.1.1. Randić Entropy of ℱ

For α = 1, −1, 1/2, −1/2, the Randić index is obtained with the help of Table 6. Using (3), the Randić entropy is: By putting α = 1, −1, 1/2, −1/2 in (32), we get the Randić entropies given below:

4.1.2. The ABC Entropy of ℱ

The ABC index and entropy measure with the help of Table 6 and equation (5) are:

4.1.3. The Geometric Arithmetic Entropy of ℱ

The GA index and corresponding entropy with the help of Table 6 and equation (7) are:

4.1.4. The ABC4 Entropy of ℱ

Table 7 shows the edge partition of the graph L(S(ZCS(x, y, z))), which is based on the degree sums of the terminal vertices of each edge.
Table 7

Edge partition of L(S(ZCS(x, y, z))) based on the degree sums of terminal vertices, for x = y = z ≥ 4.

(Al, Am)    Ni    Kinds of edges
(4, 4)    6    E1
(4, 5)    12    E2
(5, 5)    6(x + y + z − 8)    E3
(5, 8)    12(x + y + z − 7)    E4
(8, 8)    6(x + y + z − 9)    E5
(8, 9)    12(x + y + z − 5)    E6
(9, 9)    3(x + y + z + 25)    E7
After simple calculations, the ABC4 index and entropy measure are obtained with the help of Table 7 and equation (9), subject to the condition that x = y = z ≥ 4.
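As a sanity check (ours), the seven classes of Table 7 add up to |E(L(S(ZCS)))| = 39(x + y + z) − 153:

```python
# Edge classes of L(S(ZCS(x, y, z))) from Table 7, for x = y = z >= 4.
def partition_total(x, y, z):
    s = x + y + z
    sizes = [6, 12, 6 * (s - 8), 12 * (s - 7), 6 * (s - 9),
             12 * (s - 5), 3 * (s + 25)]
    return sum(sizes)

for t in range(4, 11):
    assert partition_total(t, t, t) == 39 * 3 * t - 153
print("Table 7 is consistent for x = y = z = 4..10")
```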

4.1.5. The GA5 Entropy of ℱ

After some simple calculations, the GA5 index and the corresponding entropy measure are obtained with the help of Table 7 and equation (11), subject to the condition that x = y = z ≥ 4.

5. Concluding Remarks for Computed Results

The applications of the information-theoretic framework in many disciplines of study, such as biology, physics, engineering, and the social sciences, have grown exponentially in the last two decades. This increase has been particularly impressive in the fields of soft computing, molecular biology, and information technology. As a result, scientists may find our numerical and graphical results useful [54, 55]. The entropy function is monotonic: as the size of a chemical structure increases, so does the entropy measure, and as the entropy of a system increases, so does the uncertainty regarding its reaction.

For L(S(T)), the numerical and graphical results are shown in Tables 8 and 9 and Figures 4–7. In Table 9, the fifth geometric arithmetic entropy is zero for x = 1, which shows that the process is deterministic in that case. When the chemical structure L(S(T)) expands, the Randić entropy for α = 1/2 grows more quickly than the other entropy measurements of L(S(T)), whereas the Randić entropy for α = −1/2 grows more slowly. This demonstrates that different topologies have varied entropy characteristics.

For L(S(H(x, y))), the numerical and graphical results are shown in Tables 10–13 and Figures 8–12. When the chemical structure L(S(H(x, y))) expands, the geometric arithmetic entropy grows more quickly than the other entropy measurements of L(S(H(x, y))), whereas the ABC4 entropy grows more slowly.

Finally, for L(S(ZCS(x, y, z))), the numerical and graphical results are shown in Table 14 and Figures 13–16. When the chemical structure L(S(ZCS(x, y, z))) expands, the geometric arithmetic entropy grows more quickly than the other entropy measurements of L(S(ZCS(x, y, z))), whereas the Randić entropy for α = −1 grows more slowly.
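The first row of Table 10 can be reproduced directly: for H(1, 1) (a single hexagon), L(S(H(1, 1))) is the cycle C12, where every vertex has degree 2, so all edge weights coincide and each entropy reduces to log |E|. A short check (ours):

```python
from math import log

# |E(L(S(H(1, 1))))| from the size formula 9xy + 4x + 4y - 5 at x = y = 1.
m = 9 * 1 * 1 + 4 * 1 + 4 * 1 - 5          # = 12
weights = [1.0] * m                         # all edge weights equal on C12
total = sum(weights)
ent = -sum(w / total * log(w / total) for w in weights)
print(round(ent, 4))   # 2.4849, the (1, 1) row of Table 10
```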
Table 8

Comparison of Randić entropies for L(S(T)).

x    ENT_R1    ENT_R−1    ENT_R1/2    ENT_R−1/2
1    0.4055    2.5590    2.4849    2.6263
2    3.1863    3.0463    3.5667    3.5970
3    4.0316    3.6767    4.2203    4.2280
4    4.5797    4.2928    4.6981    4.6991
5    4.9945    4.8714    5.0779    5.0764
6    5.3312    5.4107    5.3942    5.3918
7    5.6159    5.9131    5.6658    5.6631
8    5.8632    6.3820    5.9041    5.9013
9    6.0820    6.8208    6.1164    6.1136
10    6.2785    7.2325    6.3080    6.3053
Table 9

Comparison of ENT_ABC, ENT_GA, ENT_ABC4, and ENT_GA5 for L(S(T)).

x    ENT_ABC    ENT_GA    ENT_ABC4    ENT_GA5
1    2.3116    2.4849    2.1972    0
2    3.5239    3.5835    3.5749    3.5835
3    4.2025    4.2341    4.2263    4.2341
4    4.6897    4.7095    4.7028    4.7095
5    5.0739    5.0876    5.0817    5.0876
6    5.3926    5.4027    5.3975    5.4026
7    5.6655    5.6733    5.6687    5.6733
8    5.9046    5.9108    5.9066    5.9108
9    6.1174    6.1225    6.1187    6.1225
10    6.3093    6.3135    6.3100    6.3135
Figure 4

(a) R1 entropy, (b) R−1 entropy.

Figure 5

(a) R1/2 entropy, (b) R−1/2 entropy.

Figure 6

(a) The ABC entropy, (b) The GA entropy.

Figure 7

(a) The ABC4 entropy, (b) The GA5 entropy.

Table 10

Comparison of Randić entropies for L(S(H(x, y))).

(x, y)    ENT_R1    ENT_R−1    ENT_R1/2    ENT_R−1/2
(1, 1)    2.4849    2.4849    2.4849    2.4849
(2, 2)    3.7917    3.7830    3.8344    3.8332
(3, 3)    4.5635    4.5428    4.5933    4.5906
(4, 4)    5.1096    5.0872    5.1323    5.1294
(5, 5)    5.5345    5.5129    5.5530    5.5502
(6, 6)    5.8833    5.8630    5.8988    5.8962
(7, 7)    6.1794    6.1615    6.1928    6.1904
(8, 8)    6.4368    6.4194    6.4486    6.4464
(9, 9)    6.6646    6.6483    6.6751    6.6731
(10, 10)    6.8688    6.5370    6.8783    6.8822
Table 11

Comparison of ENT_ABC and ENT_GA entropies for L(S(H(x, y))).

(x, y)    ENT_ABC    ENT_GA
(1, 1)    2.4849    2.4849
(2, 2)    3.8497    3.8501
(3, 3)    4.6048    4.6051
(4, 4)    5.1413    5.1416
(5, 5)    5.5604    5.5607
(6, 6)    5.9051    5.9053
(7, 7)    6.1982    6.1985
(8, 8)    6.4534    6.4536
(9, 9)    6.6794    6.6796
(10, 10)    6.8822    6.8824
Table 12

Comparison of ENT_ABC4 and ENT_GA5 entropies for L(S(H(x, y))), x > 1 and y ≠ 1.

(x, y)    ENT_ABC4    ENT_GA5
(2, 2)    3.7879    3.4822
(3, 3)    4.5387    2.2596
(4, 4)    5.0783    4.8387
(5, 5)    5.5018    5.2952
(6, 6)    5.8509    5.6704
(7, 7)    6.1481    5.9882
(8, 8)    6.4068    6.2636
(9, 9)    6.6360    6.5064
(10, 10)    6.8417    6.7234
Table 13

Comparison of ENT_ABC4 and ENT_GA5 entropies for L(S(H(x, y))), x = 1 and y ≠ 1.

y    ENT_ABC4    ENT_GA5
2    3.1846    3.2958
3    3.5933    3.6888
4    3.8884    3.9702
5    4.1184    4.1896
6    4.3064    4.3694
7    4.4653    4.5217
8    4.6027    4.6539
9    4.7238    4.7706
10    4.8319    4.8751
Figure 8

(a) R1 entropy, (b) R−1 entropy.

Figure 9

(a) R1/2 entropy, (b) R−1/2 entropy.

Figure 10

(a) The ABC entropy, (b) The GA entropy.

Figure 11

(a) The ABC4 entropy, (b) The GA5 entropy, x > 1, y ≠ 1.

Figure 12

(a) The ABC4 entropy, (b) The GA5 entropy x=1, y ≠ 1.

Table 14

Comparison of Randić entropies for L(S(ZCS(x, y, z))).

(x, y, z)    ENT_R1    ENT_R−1    ENT_R1/2    ENT_R−1/2
(4, 4, 4)    5.7200    5.7006    5.7432    5.7407
(5, 5, 5)    6.0342    6.0165    6.0587    6.0565
(6, 6, 6)    6.2730    6.2564    6.2982    6.2961
(7, 7, 7)    6.4657    6.4497    6.4913    6.4893
(8, 8, 8)    6.6272    6.6117    6.6531    6.6511
(9, 9, 9)    6.7662    6.7511    6.7923    6.7904
(10, 10, 10)    6.8883    6.8734    6.9145    6.9126
Figure 13

(a) R1 entropy, (b) R−1 entropy.

Figure 14

(a) R1/2 entropy, (b) R−1/2 entropy.

Figure 15

(a) The ABC entropy, (b) The GA entropy.

Figure 16

(a) The ABC4 entropy, (b) The GA5 entropy.

The novelty of this article is that entropies are computed for three types of benzenoid systems. These entropy measures are useful in estimating the heat of formation and many physico-chemical properties. In the statistical analysis of benzene structures, entropy measures showed more significant results than topological indices. Therefore, we can say that the entropy measure is a newly introduced topological descriptor.

6. Conclusion

Using Shannon's entropy and the entropy definitions of Chen et al. [31], we generated graph entropies associated with a new information function in this research. A relationship is established between indices and information entropies. Using the line graphs of the subdivisions of these graphs, we estimated the entropies for the triangular benzenoid Tx, the hexagonal parallelogram nanotube H(x, y), and ZCS(x, y, z). The thermodynamic entropy of enzyme-substrate complexes [57, 58] and the configurational entropy of glass-forming liquids [56] are two examples of thermodynamic entropy employed in molecular dynamics studies of complex chemical systems. Similarly, using information entropy as a crucial structural criterion could be a new step in this direction.