Alexey Kashevnik, Andrew Ponomarev, Nikolay Shilov, Andrey Chechulin.
Abstract
This paper presents an approach and a case study for threat detection during human-computer interaction, using driver-vehicle interaction as an example. We analyzed a driver monitoring system and identified two types of users: the driver and the operator. The proposed approach detects possible threats to the driver. We present a method for threat detection during human-system interaction that generalizes potential threats, as well as approaches for detecting them. The originality of the method lies in framing the threat detection problem holistically: we build on an analysis of the driver-ITS system and generalize existing methods for driver state analysis into a threat detection method covering the identified threats. The developed reference model of the operator-computer interaction interface shows how the driver monitoring process is organized, which information can be processed automatically, and which information related to driver behavior has to be processed manually. In addition, the interface reference model includes mechanisms for monitoring operator behavior. As a case study, we present experiments involving 14 drivers, which illustrate how the operator monitors and processes the information from the driver monitoring system. The case study showed that when the driver monitoring system detected threats in the cabin and notified drivers about them, the number of threats decreased significantly.
Keywords: intelligent transportation systems; smartphone sensors; threats detection
Year: 2022 PMID: 35336551 PMCID: PMC8949224 DOI: 10.3390/s22062380
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. General scheme of the driver and operator interaction in the driver monitoring system.
Figure 2. Proposed threat detection method.
Technologies for the stages of the threat detection method.

| Threat | Capturing: What | Capturing: How | Pre-Processing | Parameter Computing | Driver State Identifying |
|---|---|---|---|---|---|
| Fatigue | Heart rate | ECG (electrocardiogram) | - | Heart rate variability (HRV), evaluated as the standard deviation of instantaneous heart rate intervals (SDNN) | Fatigue is detected if SDNN < 141 ± 39 ms |
| Fatigue | | fNIRS (functional near-infrared spectroscopy) | - | Oxygenated hemoglobin (HbO2) | Fatigue is detected if HbO2 > 2 |
| Fatigue | Muscle fatigue | EMG (electromyography) | - | Peak coefficient of the EMG signal (Fc) | Fatigue is detected if Fc > 0.15 |
| Fatigue | Macroscopic activity of the surface layer of the brain | EEG (electroencephalography) | - | Specific bursts in the alpha rhythm | Fatigue is detected if specific bursts are present |
| Fatigue | Eyes | Camera | Neural networks / Haar cascades | Blinking frequency (Vb) | Fatigue is detected if Vb > 13 times/min |
| Fatigue | Eyes | Camera | Neural networks / Haar cascades | PERCLOS (proportion of time the eyelids are closed by more than 80%) | Fatigue is detected if PERCLOS > 28% of the time within one minute |
| Fatigue | Eyes | Camera | Neural networks | ELDC (distance between the eyelids) | Fatigue is detected if ELDC > 0.5 |
| Fatigue | Eyes | EOG (electrooculography) sensor | - | Voltage U | Fatigue is detected if U > 50 µV |
| Fatigue | Mouth | Camera | Neural networks | Mouth PERCLOS (proportion of time the mouth is closed by more than 50%) | Fatigue is detected if mouth PERCLOS < 30% |
| Fatigue | Face | Camera | Neural networks | Skin temperature | Fatigue is detected if skin temperature drops by 0.1 °C |
| Fatigue | Body | IR thermometer | - | Skin temperature | Fatigue is detected if skin temperature drops by 0.1 °C |
| Fatigue | Body | Camera | Neural networks | Breath rate (Tbr) | Fatigue is detected if Tbr < 16 times/min |
| Fatigue | Car dynamics | GPS | CatBoost | No specific parameter or threshold | A machine learning classification model identifies the presence/absence of the threat |
| Inattention | Face/head | Camera | Neural networks / Haar cascades | Driver head's Euler angles (yaw, pitch, roll) (RMAX) | |
| Inattention | Eyes | Camera | Neural networks | View direction (RMAX) | |
| Inattention | Driver | Camera | Neural networks | Presence of pre-defined objects (food/drink, mobile phone, cigarette) | Inattention is detected if a pre-defined object is present for X seconds |
| Irritation | Noise | Microphone | To be researched | Noise level | To be researched |
| Irritation | Talking | Microphone | To be researched | Time of talking | To be researched |
| Irritation | Irritating sounds | Microphone | To be researched | e.g., repeating noise | To be researched |
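Most of the fatigue rows above reduce to fixed-threshold checks on parameters computed over a one-minute window. The Python sketch below shows how outputs of the "parameter computing" stage could feed the "driver state identifying" stage. It is illustrative only: the DriverParameters fields are hypothetical names, and reading "SDNN < 141 ± 39 ms" as flagging values below the lower bound (102 ms) is our assumption, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class DriverParameters:
    """Parameters computed over a one-minute window (hypothetical names)."""
    sdnn_ms: float      # heart rate variability (SDNN), ms, from ECG
    blink_rate: float   # blinks per minute (Vb), from camera
    perclos: float      # fraction of the window with eyelids >80% closed
    breath_rate: float  # breaths per minute (Tbr), from camera

def identify_fatigue(p: DriverParameters) -> list[str]:
    """Apply the per-parameter thresholds from the table; each rule fires
    independently, so the caller can fuse or report the triggered rules."""
    reasons = []
    if p.sdnn_ms < 141 - 39:   # assumed reading of "SDNN < 141 +/- 39 ms"
        reasons.append("low HRV (SDNN)")
    if p.blink_rate > 13:      # Vb > 13 times/min
        reasons.append("elevated blink frequency")
    if p.perclos > 0.28:       # PERCLOS > 28% of the one-minute window
        reasons.append("high PERCLOS")
    if p.breath_rate < 16:     # Tbr < 16 times/min
        reasons.append("low breath rate")
    return reasons

window = DriverParameters(sdnn_ms=95.0, blink_rate=17.0, perclos=0.31, breath_rate=14.0)
print(identify_fatigue(window) or "no fatigue indicators")
```

A production system would likely fuse these signals (e.g., weighting by sensor reliability) rather than simply listing the rules that fired.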
Figure 3. Reference model of the operator–computer interaction interface.
Figure 4. Screenshot example: vehicle route, threat lists, and in-cabin and outside videos.
Figure 5. Screenshot example: choosing images for dataset retraining.
Figure 6. Screenshot example: rejection of a detected threat by the dispatcher.
In-the-wild experiments.
| Driver | Trip Distance, Passive Mode (km) | Trip Distance, Active Mode (km) | Threats, Passive Mode | Threats, Active Mode | Threat Frequency, Passive Mode (per 10 km) | Threat Frequency, Active Mode (per 10 km) | Frequency Change (per 10 km) | Frequency Change (%) |
|---|---|---|---|---|---|---|---|---|
| Driver 1 | 64 | 49 | 86 | 25 | 13.4 | 5.1 | −8.3 | −62.0 |
| Driver 6 | 562 | 440 | 438 | 190 | 7.8 | 4.3 | −3.5 | −44.6 |
| Driver 7 | 605 | 253 | 424 | 168 | 7.0 | 6.6 | −0.4 | −5.3 |
| Driver 13 | 243 | 147 | 292 | 20 | 12.0 | 1.4 | −10.7 | −88.7 |
| Driver 14 | 220 | 250 | 109 | 59 | 5.0 | 2.4 | −2.6 | −52.4 |
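The derived columns follow directly from the raw counts: frequency is the number of threats per 10 km of driving, and the change columns compare active mode (threat notifications on) against passive mode. A quick check against Driver 1's row, as a sketch rather than the paper's tooling:

```python
def per_10_km(threats: int, distance_km: float) -> float:
    """Threat frequency in pieces per 10 km."""
    return threats / distance_km * 10

# Driver 1: 86 threats over 64 km (passive), 25 threats over 49 km (active)
passive = per_10_km(86, 64)          # 13.4
active = per_10_km(25, 49)           # 5.1
delta = active - passive             # -8.3 pieces per 10 km
delta_pct = delta / passive * 100    # -62.0%
print(f"{passive:.1f} -> {active:.1f}; change {delta:+.1f} ({delta_pct:+.1f}%)")
```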