| Literature DB >> 34091096 |
Yizhe Li1, Cheng Chen2, Fangfang Chen1, Chen Chen1, Rui Gao1, Bo Yang1, Rumeng Si1, Xiaoyi Lv3.
Abstract
Hyperthyroidism and hypothyroidism can cause a range of clinical complications, have a high incidence, and benefit from early diagnosis. Based on Raman spectroscopy and deep learning algorithms, we propose a rapid screening method to distinguish serum samples from hyperthyroidism patients, hypothyroidism patients and control subjects. We collected 99 serum samples: 38 from hyperthyroidism patients, 32 from hypothyroidism patients and 29 from control subjects. By comparing the Raman spectra of the three groups, we found differences in peak intensity, indicating that Raman spectra can be used for subsequent disease identification. After collecting the spectral data, the Vancouver Raman algorithm (VRA) was used to remove the fluorescence background, and kernel principal component analysis (KPCA) was used to extract spectral features with a cumulative explained variance ratio of 0.9999. Then, five neural network models (the adjusted AlexNet, LSTM-CNN, IndRNNCNN, the adjusted GoogLeNet and the adjusted ResNet) were constructed for classification, achieving total accuracies of 91%, 84%, 82%, 75% and 71%, respectively. The results of our study show that it is feasible to use Raman spectroscopy combined with deep learning to distinguish hyperthyroidism, hypothyroidism and control subjects. Comparing the models, we found that as the neural network deepens and model complexity increases, the classification performance on Raman spectra gradually deteriorates, and we put forward three conjectures to explain this.
Keywords: Raman spectroscopy; deep learning; hyperthyroidism; hypothyroidism; kernel principal component analysis (KPCA); Vancouver Raman algorithm (VRA)
Year: 2021 PMID: 34091096 DOI: 10.1016/j.pdpdt.2021.102382
Source DB: PubMed Journal: Photodiagnosis Photodyn Ther ISSN: 1572-1000 Impact factor: 3.631