Abegail Santillan1,2, Rock Christian Tomas3, Ruth Bangaoil1,2,4, Rolando Lopez4,5, Maria Honolina Gomez4,5, Allan Fellizar1,2,6, Antonio Lim7, Lorenzo Abanilla7, Maria Cristina Ramos1,2,8, Leonardo Guevarra2,9, Pia Marie Albano10,11,12.
Abstract
The current gold standard in cancer diagnosis, the microscopic examination of hematoxylin and eosin (H&E)-stained biopsies, is prone to bias since it relies heavily on visual examination. Hence, there is a need for a more sensitive and specific method of diagnosing cancer. Here, Fourier transform infrared (FTIR) spectroscopy of thyroid tumors (n = 164; 76 malignant, 88 benign) was performed, and five (5) neural network (NN) models were designed to discriminate the obtained spectral data. PCA-LDA was used as a classical benchmark for comparison. Each NN model was evaluated using stratified 10-fold cross-validation to avoid overfitting, and the performance metrics, namely accuracy, area under the curve (AUC), positive predictive value (PPV), negative predictive value (NPV), specificity rate (SR), and recall rate (RR), were averaged for comparison. All NN models performed excellently as classifiers, and all surpassed the LDA model in terms of accuracy. Among the NN models, the RNN model performed best, with an AUC of 95.29% ± 6.08%, an accuracy of 98.06% ± 2.87%, a PPV of 98.57% ± 4.52%, an NPV of 93.18% ± 7.93%, an SR of 98.89% ± 3.51%, and an RR of 91.25% ± 10.29%. The RNN model outperformed the LDA model on all metrics except the AUC, NPV, and RR. In conclusion, NN-based tools predicted thyroid cancer from infrared spectroscopy of tissues with a high level of diagnostic performance relative to the gold standard.
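The evaluation protocol described above (stratified 10-fold cross-validation with per-fold diagnostic metrics averaged across folds) can be sketched as follows. This is a minimal illustration of the general technique, not the authors' code: the splitting and metric functions below are hypothetical helpers, and the 76/88 class sizes are taken from the abstract only to show how stratification preserves class proportions.

```python
import random

def stratified_kfold(labels, k=10, seed=0):
    """Yield (train_idx, test_idx) pairs, preserving per-class proportions in each fold."""
    rng = random.Random(seed)
    folds = [[] for _ in range(k)]
    for cls in set(labels):
        idx = [i for i, y in enumerate(labels) if y == cls]
        rng.shuffle(idx)
        # Deal each class's samples round-robin into the k folds.
        for j, i in enumerate(idx):
            folds[j % k].append(i)
    for f in range(k):
        test = folds[f]
        train = [i for g in range(k) if g != f for i in folds[g]]
        yield train, test

def confusion_metrics(y_true, y_pred):
    """Per-fold diagnostic metrics from a binary confusion matrix (1 = malignant)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy":    (tp + tn) / len(y_true),
        "recall":      tp / (tp + fn) if tp + fn else 0.0,  # RR (sensitivity)
        "specificity": tn / (tn + fp) if tn + fp else 0.0,  # SR
        "ppv":         tp / (tp + fp) if tp + fp else 0.0,
        "npv":         tn / (tn + fn) if tn + fn else 0.0,
    }

# Cohort from the abstract: 76 malignant (1), 88 benign (0).
labels = [1] * 76 + [0] * 88
splits = list(stratified_kfold(labels, k=10))
```

In the study itself, each fold's held-out metrics would be computed from a trained NN model's predictions and then averaged (mean ± standard deviation) across the 10 folds, yielding the summary figures reported in the abstract.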
Keywords: Diagnosis; Infrared spectroscopy; Neural networks; Pathologists; Scaled exponential linear units (SELU); Thyroid cancer
Year: 2021 PMID: 33569645 DOI: 10.1007/s00216-021-03183-0
Source DB: PubMed Journal: Anal Bioanal Chem ISSN: 1618-2642 Impact factor: 4.142