D N V S L S Indira1, Rajendra Kumar Ganiya2, P Ashok Babu3, A Jasmine Xavier4, L Kavisankar5, S Hemalatha6, V Senthilkumar7, T Kavitha8, A Rajaram9, Karthik Annam3, Alazar Yeshitla10.
Abstract
Brain cancer is a disease of abnormal cell synthesis, and brain cancer cells are analyzed for patient diagnosis. Because of this composite cell structure, the conceptual classification differs from one brain cancer investigation to another. In gene testing, patient prognosis is identified from individual biocell expression. Classification with the proposed artificial neural network subtypes attains improved performance compared with the earlier enhanced artificial neural network (EANN) biocell subtype investigation. In this research, features are selected using improved gene expression programming (IGEP) with a modified brute-force algorithm, and long- and short-term survivors are then classified using principal component analysis (PCA) with an enhanced artificial neural network (EANN). Effective IGEP features are selected using remainder performance to improve prognosis efficiency. The system is evaluated on the Cancer Genome Atlas (TCGA) dataset. Simulation results show that IGEP with the modified brute-force algorithm achieves an accuracy of 96.37%, specificity of 96.37%, sensitivity of 98.37%, precision of 78.78%, F-measure of 80.22%, and recall of 64.29%, compared with the generalized regression neural network (GRNN), the improved extreme learning machine (IELM) with the minimum redundancy maximum relevance (MRMR) method, and the support vector machine (SVM).
Year: 2022 PMID: 35480141 PMCID: PMC9038414 DOI: 10.1155/2022/7799812
Source DB: PubMed Journal: Biomed Res Int Impact factor: 3.411
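The pipeline the abstract describes (feature selection, PCA dimensionality reduction, then neural-network classification) can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the data are random, the feature count is borrowed from the gene-expression table below, and the network weights are untrained placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the selected feature matrix:
# 276 patients x 75 gene-expression features (counts taken from the tables below).
X = rng.normal(size=(276, 75))

# --- PCA via SVD: project onto the top-k principal components ---
def pca_project(X, k):
    Xc = X - X.mean(axis=0)            # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T               # scores on the first k components

Z = pca_project(X, k=10)

# --- One-hidden-layer network forward pass (untrained placeholder weights) ---
def forward(Z, W1, b1, W2, b2):
    H = np.tanh(Z @ W1 + b1)                   # hidden layer
    logits = H @ W2 + b2
    return 1.0 / (1.0 + np.exp(-logits))       # P(long-term survivor)

W1 = rng.normal(scale=0.1, size=(10, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 1));  b2 = np.zeros(1)
p = forward(Z, W1, b1, W2, b2)
```

Training the weights (e.g., by gradient descent on cross-entropy against the survivor labels) is omitted; the sketch only shows how the reduced PCA scores feed the classifier.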
Comparison table for existing methods.
| Methods | Pros | Cons |
|---|---|---|
| Logistic regression | Performs well on small-scale datasets and outputs interpretable probabilities | Requires assumptions about the data and provides only linear solutions |
| KNN | Intuitive method | Number of neighbors must be user-defined |
| SVMs | Provides solutions for nonlinear applications | Requires knowledge to choose and employ a kernel |
| Decision tree | Handles categorical features, has few parameters to tune, and performs well with a large number of features | Interpretability is questionable in ensembles |
Figure 1. Static-order dataset predicting dynamic-order brain cancer cells.
Dataset.
| Attribute | Value |
|---|---|
| Total population of primary | |
| Valid population | |
| Gender | Males 164, females 112 |
| Long-term survivors | 171 |
| Short-term survivors | 40 |
| Median age at diagnosis | 59 |
| Median KPS | 80 |
| Median survival | 386 days |
Statistical dataset properties.
| Data category | Number | Feature number |
|---|---|---|
| Gene methylation | 12440 | 53 |
| Gene expression | 17931 | 75 |
| Copy number | 16133 | 7 |
| miRNA expression | 534 | 5 |
Figure 2. Confusion matrix for artificial binomial terms.
Figure 3. Enhanced artificial neural network.
Figure 4. Long- and short-term survivor class prediction using the Kaplan-Meier curve for brain cancer prognosis prediction.
Figure 5. Area under the curve performance.
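The confusion matrix of Figure 2 underlies the metrics reported in the comparison tables below. For reference, the standard definitions can be computed directly from the four cell counts; note that under these definitions sensitivity and recall coincide, although the paper reports them separately. The counts in this example are illustrative, not the paper's.

```python
# Metrics derived from a binary confusion matrix (tp, fp, fn, tn).
# The counts used in the example call are made up for illustration.
def metrics(tp, fp, fn, tn):
    accuracy    = (tp + tn) / (tp + fp + fn + tn)
    sensitivity = tp / (tp + fn)          # a.k.a. recall
    specificity = tn / (tn + fp)
    precision   = tp / (tp + fp)
    f_measure   = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, precision, f_measure

acc, sen, spe, pre, f1 = metrics(tp=80, fp=10, fn=20, tn=90)
```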
Evaluation of long-term and short-term survivors.
| Days | Long-term survivors | Short-term survivors |
|---|---|---|
| 0 | 5.5 | 2.4 |
| 1000 | 4.3 | 2 |
| 2000 | 4.4 | 1.8 |
| 3000 | 4.5 | 1.8 |
| 4000 | 5 | 1.8 |
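The Kaplan-Meier curves of Figure 4 can in principle be reproduced with the product-limit estimator. A minimal sketch follows, using made-up survival times and event flags rather than the study's data (in the study's convention, times would be in days and an event is a death).

```python
# Minimal Kaplan-Meier (product-limit) estimator.
# times  : survival times; events : 1 = death observed, 0 = censored.
# The sample data below are illustrative, not the paper's.
def kaplan_meier(times, events):
    """Return (time, S(t)) pairs at each observed event time."""
    # At tied times, process events before censorings (standard KM convention).
    order = sorted(range(len(times)), key=lambda i: (times[i], 1 - events[i]))
    at_risk = len(times)
    surv, curve = 1.0, []
    for i in order:
        if events[i]:                        # death: survival drops
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1                         # leave the risk set either way
    return curve

curve = kaplan_meier([5, 8, 8, 12, 15], [1, 1, 0, 1, 0])
```

Separate curves for the long- and short-term survivor groups would each be estimated this way and plotted together, as in Figure 4.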
Comparison between the presented scheme and the existing schemes.
| % | IGEP+EANN | FO+GRNN | LDA+IELM | MRMR+SVM |
|---|---|---|---|---|
| 0 | 3 | 2 | 1 | 0.4 |
| 0.2 | 3.5 | 2.5 | 2 | 1 |
| 0.4 | 4.5 | 3 | 2.5 | 1.5 |
| 0.6 | 5.5 | 3.5 | 2.9 | 1.9 |
| 0.8 | 7 | 4 | 3 | 2.2 |
Accuracy comparison among all feature selection and classification schemes.
| % | IGEP+EANN | FO+GRNN | LDA+IELM | MRMR+SVM |
|---|---|---|---|---|
| 20 | 91 | 89 | 85 | 81 |
| 40 | 92 | 90 | 87 | 82 |
| 60 | 93 | 91 | 89 | 83 |
| 80 | 94 | 92 | 90 | 84 |
| 100 | 95 | 93 | 91 | 85 |
| 120 | 96.37 | 94.36 | 92.35 | 88.22 |
Figure 6. Accuracy performance comparison.
Figure 7. Sensitivity performance comparison.
Sensitivity comparison among all feature selection and classification schemes.
| % | IGEP+EANN | FO+GRNN | LDA+IELM | MRMR+SVM |
|---|---|---|---|---|
| 20 | 65 | 70 | 86 | 91 |
| 40 | 70 | 75 | 89 | 94 |
| 60 | 79 | 85 | 90 | 95 |
| 80 | 81 | 88 | 92 | 96 |
| 100 | 85 | 90 | 95 | 97 |
| 120 | 98.37 | 96.36 | 93.35 | 96.23 |
Figure 8. Specificity performance comparison.
Specificity comparison among all feature selection and classification schemes.
| % | IGEP+EANN | FO+GRNN | LDA+IELM | MRMR+SVM |
|---|---|---|---|---|
| 20 | 91.2 | 82 | 75 | 60 |
| 40 | 92.5 | 85 | 79 | 65 |
| 60 | 93.2 | 88 | 80 | 75 |
| 80 | 94.3 | 90 | 85 | 80 |
| 100 | 95.41 | 92 | 90 | 85 |
| 120 | 96.37 | 94.36 | 92.35 | 89.95 |
Figure 9. Precision performance comparison.
Precision comparison among all feature selection and classification schemes.
| % | IGEP+EANN | FO+GRNN | LDA+IELM | MRMR+SVM |
|---|---|---|---|---|
| 20 | 73 | 69 | 30 | 20 |
| 40 | 72 | 70 | 40 | 35 |
| 60 | 74 | 72 | 50 | 45 |
| 80 | 75 | 74 | 60 | 55 |
| 100 | 77 | 75 | 74 | 72 |
| 120 | 78.78 | 77.78 | 76.77 | 70.85 |
Figure 10. Recall performance comparison.
Recall comparison among all feature selection and classification schemes.
| % | IGEP+EANN | FO+GRNN | LDA+IELM | MRMR+SVM |
|---|---|---|---|---|
| 20 | 59 | 50 | 40 | 20 |
| 40 | 60 | 53 | 43 | 30 |
| 60 | 61 | 55 | 53 | 45 |
| 80 | 62 | 59 | 55 | 50 |
| 100 | 63.3 | 60 | 59 | 55 |
| 120 | 64.29 | 62.27 | 60.25 | 56.89 |
Mathematical evaluation of overall performance for each detection scheme.
| Performance metrics | IGEP+EANN | FO+GRNN | LDA+IELM | MRMR+SVM |
|---|---|---|---|---|
| Accuracy | 96.37 | 94.36 | 92.35 | 88.22 |
| Sensitivity | 98.37 | 96.36 | 93.35 | 96.23 |
| Specificity | 96.37 | 94.36 | 92.35 | 89.95 |
| Precision | 78.78 | 77.78 | 76.77 | 70.85 |
| Recall | 64.29 | 62.27 | 60.25 | 56.89 |
| F-measure | 80.22 | 72.27 | 69.25 | 65.35 |
Figure 11. Overall performance prediction.