Lingyu Xu, Xiulin Geng, Xiaoyu He, Jun Li, Jie Yu.
Abstract
This study explores the feasibility of using a multilayer artificial neural network to classify children with autism spectrum disorder (ASD) versus typically developing (TD) children based on short-time spontaneous hemodynamic fluctuations. Spontaneous hemodynamic fluctuations were recorded with a functional near-infrared spectroscopy (fNIRS) setup from the bilateral inferior frontal gyrus and temporal cortex in 25 children with ASD and 22 TD children. Feature extraction and classification were performed by a multilayer neural network called CGRNN, which combines a convolutional neural network (CNN) and a gated recurrent unit (GRU); CGRNN is well suited to extracting characteristic features and capturing intrinsic relationships in time series. For training and prediction, short-time (7 s) raw fNIRS time series were used as the network input. To avoid over-fitting and to extract useful discriminative features from a very limited sample (25 ASD and 22 TD children), a sliding-window approach was used in which each initially recorded long-time (e.g., 480 s) time series was divided into many partially overlapping short-time (7 s) sequences. With this combined deep-learning network, highly accurate classification between ASD and TD was achieved even with a single optical channel: 92.2% accuracy, 85.0% sensitivity, and 99.4% specificity. These results imply, first, that CGRNN can identify ASD-related characteristic features even in a short-time spontaneous hemodynamic fluctuation from a single optical channel and, second, that CGRNN can provide highly accurate prediction of ASD.
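The sliding-window augmentation described in the abstract (dividing a long recording into many partially overlapping short sequences) can be sketched as follows. The paper's sampling rate and step size are not given in this record, so the window/step values below are purely illustrative.

```python
import numpy as np

def sliding_windows(signal, window, step):
    """Split a 1-D time series into partially overlapping segments.

    window and step are in samples; e.g., a 7 s window at a hypothetical
    10 Hz sampling rate would be window=70.
    """
    n = (len(signal) - window) // step + 1
    return np.stack([signal[i * step : i * step + window] for i in range(n)])

# Illustrative example: a 100-sample series, 70-sample windows, step of 10.
x = np.arange(100.0)
segs = sliding_windows(x, window=70, step=10)
# segs has shape (4, 70); consecutive windows overlap by 60 samples.
```

Each resulting segment becomes one training example, which multiplies the effective sample size well beyond the 47 original recordings.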
Keywords: ASD; CGRNN model; fNIRS; neural network; time series
Year: 2019 PMID: 31780879 PMCID: PMC6856557 DOI: 10.3389/fnins.2019.01120
Source DB: PubMed Journal: Front Neurosci ISSN: 1662-453X Impact factor: 4.677
FIGURE 1The source-detector configuration, where the yellow circles indicate the sources, the green circles indicate the detectors, and the white square between a source and a detector is a channel (A). Locations of the fNIRS measurement channels over the inferior frontal and temporal cortex (B).
FIGURE 2Processing of the time-series data.
FIGURE 3Setting of the sliding-window length (w) and step size (s).
FIGURE 4CGRNN flow.
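Figure 4 outlines the CGRNN flow: a CNN front end for local feature extraction followed by a GRU over the resulting feature sequence. As a minimal sketch only (written in PyTorch; the layer sizes and kernel widths here are hypothetical, since this record does not specify the exact architecture), the combination might look like:

```python
import torch
import torch.nn as nn

class CGRNN(nn.Module):
    """Minimal CNN + GRU classifier sketch (hypothetical layer sizes)."""

    def __init__(self, n_features=1, hidden=32, n_classes=2):
        super().__init__()
        # 1-D convolution over time extracts local features per window.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # GRU captures temporal dependencies across the conv features.
        self.gru = nn.GRU(input_size=16, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, features) -> Conv1d expects (batch, features, time)
        z = self.conv(x.transpose(1, 2))
        out, _ = self.gru(z.transpose(1, 2))
        # Classify from the GRU output at the final time step.
        return self.fc(out[:, -1])

model = CGRNN()
logits = model(torch.randn(4, 70, 1))  # 4 short windows, 70 samples each
```

Here each input row is one short-time segment (e.g., a 7 s window from a single optical channel), and the two output logits correspond to the ASD/TD classes.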
FIGURE 5(A–C) Predictions for the 44 channels based on the HbO2, Hb, and HbT attributes, respectively.
FIGURE 6The top 10 channel/attribute combinations with the best classification performance, sorted by accuracy in descending order.
FIGURE 7CGRNN classification performance. (A) The predictive distribution of the test data. (B) The accuracy of sequence diagnosis.
Accuracy of different classification models.
| Accuracy | 61.5% | 65.0% | 80.2% | 81.2% | 92.2% |
FIGURE 8The performance of 44 channels.
FIGURE 9ROC comparison for different models.
FIGURE 10The distribution of the channels with the best classification performance (i.e., accuracy >80.0%; blue numbers) based on the HbO2 (A) and Hb (B) attributes in the left and right brain regions. The yellow area indicates the frontal lobe; the rose-red area indicates the temporal lobe.