Yulia Shichkina, Elizaveta Stanevich, Yulia Irishina.
Abstract
Parkinson's disease (PD) is one of the most common chronic neurological diseases and a significant cause of disability in middle-aged and elderly people. Monitoring the patient's condition, and the patient's compliance with therapy, is key to successfully correcting the main clinical manifestations of PD, including the almost inevitable modification of the clinical picture of the disease against the background of prolonged dopaminergic therapy. In this article, we propose an approach to assessing the condition of patients with PD using deep recurrent neural networks trained on data measured with mobile phones. The data were collected in two modes: background (data from the phone's sensors) and interactive (data entered directly by the user). For the classification of the patient's condition, we built several neural network models. Testing of these models showed that the most efficient was a recurrent network with two layers. The results of the experiment show that, given a sufficiently large training sample, it is possible to build a neural network that determines the patient's condition from mobile phone sensor data with high accuracy.
Keywords: Parkinson’s disease; monitoring the condition of patients; motion sensor; recurrent neural network; smartphone
Year: 2020 PMID: 32290633 PMCID: PMC7235735 DOI: 10.3390/diagnostics10040214
Source DB: PubMed Journal: Diagnostics (Basel) ISSN: 2075-4418
Figure 1. Examples of some patient application windows.
Table 1. Parameters obtained using tests on mobile phones.
| Parameter | Parameter Description | Value Example | Unit |
|---|---|---|---|
| Text erased | Number of characters deleted | 13 | - |
| Text time | Time for writing text in a special test | 139,763 | ms |
| Levenstein Distance | Metric measuring the difference between two sequences of characters | 5 | - |
| Miss clicks | The number of misses when clicking buttons in the application | 4 | - |
| Miss clicks distance | The distance between the center of the nearest button and the point where the finger touches the phone screen | 2.357022 | dp |
| Azimuth | Rotation about the vertical axis of the coordinate system | 70.29761 | degree |
| Pitch | Rotation about the transverse axis of the coordinate system | −80.30805 | degree |
| Roll | Rotation about the longitudinal axis of the coordinate system | 13.761927 | degree |
| Tapping left count | The number of touches by the index finger of the left hand of the button on the screen in 1 min in a special test | 47 | - |
| Tapping right count | The number of touches with the index finger of the right hand of the button on the screen for 1 min in a special test | 52 | - |
| Dyskinesia | The presence of dyskinesia | 1 | - |
| Pill | The number of medications taken | 4 | - |
| State | Subjective assessment of the patient’s condition (0 - poor, 0.5 - uncertain, 1 - good) | 1 | - |
| Voice volume | Voice volume | 44 | decibels |
| Voice pause | The number of pauses between words | 3 | - |
| Voice count pause | The pause time between words | 4794 | ms |
| Velocity | The speed of the phone during its active use | 1.342 | m/s |
dp or dip (density-independent pixel) is an abstract unit of measurement that allows applications to look the same on screens of different sizes and resolutions; ms is milliseconds.
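The "Levenstein Distance" parameter in Table 1 is the standard Levenshtein edit distance between two character sequences (e.g., the target text of a typing test and what the patient actually typed). As an illustration, not the authors' implementation, it can be computed with the classic dynamic-programming recurrence:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character insertions, deletions,
    and substitutions needed to turn string a into string b."""
    if len(a) < len(b):          # keep b as the shorter string
        a, b = b, a
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]
```

A distance of 5, as in the example row, means five single-character edits separate the expected and entered text.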
Figure 2. The process of diagnosing the severity of symptoms of Parkinson’s disease (PD).
Figure 3. Scheme of a simple neural network. Green indicates input neurons, blue indicates hidden neurons, and yellow indicates the output neuron.
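The simple network of Figure 3 amounts to a single forward pass from input neurons through a hidden layer to one output neuron. A minimal sketch in plain Python follows; the sigmoid activation and the specific weights are illustrative assumptions, not values from the paper:

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """Input -> hidden layer (sigmoid) -> single sigmoid output in (0, 1)."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(w_hidden, b_hidden)]
    return sigmoid(sum(w * hi for w, hi in zip(w_out, h)) + b_out)
```

An output close to 1 would correspond to a "good" patient state and close to 0 to a "poor" one, matching the binary target used for classification.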
Table 2. An example of input data.
| Azimuth | Pitch | Roll |
|---|---|---|
| 70.29761 | −80.30805 | 13.761927 |
| 80.7381 | 88.84665 | −0.4941308 |
| 103.91639 | 89.57664 | −23.674112 |
| 120.00478 | 89.701294 | −39.76287 |
| 136.86708 | 89.748215 | 59.555275 |
The data in Table 2 are sorted by azimuth, but the azimuth values in the first and last rows are not necessarily the minimum and maximum values. These numbers were obtained in this study; in other tests they may differ.
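One way such orientation records can be prepared as sequences for a recurrent network is to sort them by azimuth, as in Table 2, and slice overlapping fixed-length windows. The sketch below is an assumption about the preprocessing, not the paper's pipeline; the window length of 3 and the dict-based record format are illustrative:

```python
def make_windows(records, key="azimuth", length=3):
    """Sort sensor records by the given key and cut overlapping
    fixed-length windows to serve as RNN input sequences."""
    ordered = sorted(records, key=lambda r: r[key])
    return [ordered[i:i + length] for i in range(len(ordered) - length + 1)]
```

Each window is then a short time-ordered sequence of (azimuth, pitch, roll) vectors of the kind the recurrent layers consume.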
Figure 4. A fully recurrent version of a neural network. Blue and red are the first and second recurrent layers; green is the result.
Figure 5. The results of training a neural network with one recurrent layer.
Figure 6. The results of training a neural network with two recurrent layers.
Figure 7. The results of training a neural network with a fully recurrent architecture (gated recurrent unit; GRU).
Figure 8. The results of training a neural network with a fully recurrent architecture (long short-term memory; LSTM).
Table 3. The results of training neural networks of various architectures.
| Architecture | Number of Epochs before Overfitting | Loss Function (BCE) | Accuracy |
|---|---|---|---|
| 1 GRU + 2 fully connected | 29 | 0.031161 | 0.817 |
| 2 GRU + 2 fully connected | 32 | 0.029795 | 0.88951 |
| 2 LSTM | 26 | 0.03809 | 0.8528 |
LSTM is Long Short-Term Memory, GRU is Gated Recurrent Unit, BCE is binary cross-entropy.
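The loss values in Table 3 are binary cross-entropy, the standard loss for a two-class (good/poor state) output. As a reference for how those numbers are computed, here is a minimal plain-Python sketch; the batch averaging and the clamping constant are conventional choices, not details taken from the paper:

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean binary cross-entropy over a batch of scalar predictions in (0, 1)."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)  # clamp to avoid log(0)
        total += -(t * math.log(p) + (1.0 - t) * math.log(1.0 - p))
    return total / len(y_true)
```

Losses around 0.03, as reported for all three architectures, indicate predictions that are on average very close to the true 0/1 labels on the training data.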