| Literature DB >> 35046646 |
Grégory Ben-Sadoun, Emeline Michel, Cédric Annweiler, Guillaume Sacco.
Abstract
Systems using low-resolution passive infrared sensors were recently proposed to resolve the dilemma between effectiveness and ethical considerations in human fall detection by Information and Communication Technologies (ICTs) in older adults. How effective is this type of system? We performed a systematic review to identify studies that investigated the metrological qualities of passive infrared sensors with a maximum resolution of 16×16 pixels for fall detection. The search was conducted on PubMed, ScienceDirect, SpringerLink, IEEE Xplore Digital Library, and MDPI until November 26-28, 2020. We focused on studies testing only these types of sensors. Thirteen articles were conference papers, five were original articles, and one was found on arXiv.org (an open-access repository of scientific research). Since four authors duplicated their study in two different journals, our review finally analyzed 15 studies. The studies were very heterogeneous with regard to experimental procedures and detection methods, which made it difficult to draw formal conclusions. All studies tested their systems in controlled conditions, mostly in empty rooms. Except for two studies, the overall performance reported for fall detection exceeded 85-90% in accuracy, precision, sensitivity, or specificity. Systems using two or more sensors and particular detection methods (eg, 3D CNN, CNN with 10-fold cross-validation, LSTM with CNN, LSTM and voting algorithms) seemed to give the highest levels of performance (>90%). Future studies should further test this type of system in real-life conditions.
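The performance measures compared across the reviewed studies (accuracy, precision, sensitivity, specificity, F1) all derive from a standard confusion matrix. A minimal illustrative sketch, not taken from any reviewed study (the counts in the example are made up):

```python
def detection_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics used to compare fall-detection systems."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)   # recall / true-positive rate
    specificity = tn / (tn + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"Ac": accuracy, "Pr": precision, "Se": sensitivity,
            "Sp": specificity, "F1": f1}

# Hypothetical example: 90 detected falls, 5 false alarms,
# 95 correctly ignored non-fall events, 10 missed falls.
m = detection_metrics(tp=90, fp=5, tn=95, fn=10)
print({k: round(v, 3) for k, v in m.items()})
# -> {'Ac': 0.925, 'Pr': 0.947, 'Se': 0.9, 'Sp': 0.95, 'F1': 0.923}
```

Note that a system can score highly on accuracy while still missing falls when non-fall events dominate the test set, which is why the review reports several metrics side by side.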
Keywords: fall detection; older adults; passive infrared sensor; thermal sensor; thermopile
Year: 2022 PMID: 35046646 PMCID: PMC8763199 DOI: 10.2147/CIA.S329668
Source DB: PubMed Journal: Clin Interv Aging ISSN: 1176-9092 Impact factor: 4.458
Figure 1. PRISMA flow diagram.
Study Characteristics Regarding the Sensors Used, Experimental Procedures, Detection Methods, and Detection Performance
| Authors | Sensors (Name; Pixels; Number of Sensors; Positions) | Environment (Number of Participants; Organization of the Room; Ambient Temperature) | Activities | Raw Data Post-Treatment | Background Generation | Background Subtraction | Features | Classifiers | Ac (%) | Pr (%) | Se or ESe (%) | Sp or ESp (%) | F1 (%) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Adolf et al., 2018 [27] | Panasonic’s Grid-EYE® AMG88xx | 4 | Postures: | – | – | – | – | CNN Inception V3 [47] | – | – | Postures: 48 | Postures: 89 | – |
| Chen & Ma, 2015 [28] | Melexis | 5 | Actions: | White noise signal reduction with multi-frame averaging | Yes | – | 7 | k-NN | Global | – | Global | Global | – |
| Chen & Wang, 2018 [29] | Panasonic’s Grid-EYE® AMG8853 | 3 | Actions: | – | Yes | – | 5 | SVM | Actions: | – | – | – | – |
| Fan et al., 2017 [30] | Panasonic’s Grid-EYE® AMG8832 | 1 | Actions: | (nf) no filter | – | – | – | GRU | – | Classifiers: (a)(nf) 75.0 | Classifiers: (a)(nf) 91.6 | – | Classifiers: (a)(nf) 82.5 |
| Gochoo et al., 2018 [32] | Panasonic’s Grid-EYE® AMG8833 | 4 (young adults) | Postures: | Heatmap increased by 2 | – | – | – | DCNN | Postures: 99.83 | Postures: 99.38 | Postures: 99.17 | Postures: 99.92 | Postures: 99.27 |
| Hayashida et al., 2017 [33,34] | Panasonic’s Grid-EYE® AMG8831 | 7 | Scenario: | – | Yes | – | 5 | Their own classifier based on several thresholds of the five features | (a) Ac Fall from 18–28°C | – | – | – | – |
| Liu et al., 2020 [35] | Panasonic’s Grid-EYE® AMG8853 | 8 | Actions: | Bicubic interpolation | Yes | Yes | 8 | RandFor | – | Global | Global | – | Global |
| Mashiyama et al., 2014 [22] | Panasonic’s Grid-EYE® AMG8831 | 6 | Actions: | – | Yes | – | 4 | k-NN | Global | – | – | – | – |
| Mashiyama et al., 2015 [23] | Panasonic’s Grid-EYE® AMG8831 | 6 | Actions: | – | Yes | – | 4 | SVM | Actions: 100 | – | – | – | – |
| Ogawa & Naito, 2020 [36] | Melexis MLX90621 | 10 | Actions: | – | Yes | – | 3 | LDA | – | – | – | – | |
| Shelke & Aksanli, 2019 [37] | Melexis MLX90621 | 10 | Postures: | – | Yes | Yes | – | Logistic | Classifiers: 99.94–99.97 | – | – | – | Classifiers: 99.88–99.95 |
| Sixsmith & Johnson, 2004 [21] | Irisys | 1 | Actions: | – | – | – | – | MLP | Non-Fall: | | | | |
| Taniguchi et al., 2014 [38] | Omron Corporation, D6T-1616-L-06 | 2 | Actions: | – | – | – | 9 | Their own classifier based on the logic | Mean Ac | – | Mean ESe | Mean ESp | – |
| Tao et al., 2018 [40] | Panasonic’s Grid-EYE® AMG88xx | 8 | Actions: | – | Yes | Yes | 2 | SVM | Global with SVM 40 | – | – | – | – |
| Taramasco et al., 2018 [41] | Omron Corporation, D6T-8L-06 | 4 (young adults) | Scenarios of falls: | – | – | – | – | CNN + LSTM | – | | | | |
Abbreviations: °C, degree Celsius; %, per cent; Ac, accuracy; AdaBoost, adaptive boosting; Bagging, bootstrap aggregating; BiLSTM, bidirectional long short-term memory; CNN, convolutional neural network; DecTree, decision tree; ESe, error of sensitivity; ESp, error of specificity; FP, false positive; FN, false negative; fNN, feed-forward neural network; GRU, gated recurrent unit; GRU-ATT, gated recurrent unit with attention link; LDA, linear discriminant analysis; Logistic, logistic regression; LSTM, long short-term memory; LSTM-ATT, long short-term memory with attention link; lux, luminous flux per unit area; k-NN, k-nearest neighbors; m, meter; MLP, multilayer perceptron; NaiveB, naive Bayes; Pr, precision; s, second; Se, sensitivity; Sp, specificity; RandFor, random forest; RNN, recurrent neural network; SVM, support vector machine; TN, true negative; TP, true positive.
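As an illustration of the generic pipeline the table columns describe (background generation → background subtraction → features → classifier), here is a minimal toy sketch on a simulated 8×8 thermal frame. The running-average background model, the 1.5 °C foreground threshold, and the width-versus-height rule are our own illustrative assumptions, not the method of any reviewed study:

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average background model over low-resolution thermal frames."""
    return (1 - alpha) * background + alpha * frame

def extract_features(frame, background, temp_thresh=1.5):
    """Foreground = pixels warmer than the background by temp_thresh (deg C)."""
    fg = (frame - background) > temp_thresh
    ys, xs = np.nonzero(fg)
    if ys.size == 0:
        return {"area": 0, "height": 0, "width": 0}
    return {"area": int(ys.size),
            "height": int(ys.max() - ys.min() + 1),
            "width": int(xs.max() - xs.min() + 1)}

def is_fall_like(features):
    """Toy rule: a wide, low blob on a wall-mounted sensor suggests lying down."""
    return features["area"] > 0 and features["width"] > features["height"]

# Simulated 8x8 frame: room at 22 C with a horizontal 2x5 warm blob at 30 C.
background = np.full((8, 8), 22.0)
frame = background.copy()
frame[5:7, 1:6] = 30.0
feats = extract_features(frame, background)
print(feats, is_fall_like(feats))  # {'area': 10, 'height': 2, 'width': 5} True
```

The reviewed studies replace the toy rule with trained classifiers (k-NN, SVM, CNN, LSTM, etc.) and often richer features, but the frame → background subtraction → feature → decision structure is the same.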
Figure 2. Publication dates of studies.
Summary of System Architectures Used in the Studies
| Study | Component | Model | Power Consumption | Type of Data Transfer to PC |
|---|---|---|---|---|
| Adolf et al., 2018 | IR sensor | AMG88xx (unspecified) | 4.5 mA* | USB and Bluetooth |
| | Microcontroller | Panasonic evaluation board | ND | |
| Chen & Ma, 2015 | IR sensor | MLX90620 | 9 mA* | UART |
| | Microcontroller | Arduino Uno | ND | |
| Chen & Wang, 2018 | IR sensor | AMG8853 | 4.5 mA* | Bluetooth |
| | Ultrasonic sensor | HC-SR04 | ND | |
| | Microcontroller | Arduino Mega | ND | |
| | Battery | 7.4 V Li-ion | ND | |
| Fan et al., 2017 and 2018 | IR sensor | AMG8832 | 4.5 mA* | ZigBee |
| | Microcontroller | ZigBee CC2530 | ND | |
| Gochoo et al., 2018 | IR sensor | AMG8833 | 4.5 mA* | Wi-Fi |
| | Microcontroller | ESP32S | ND | |
| | Transmitter | WiPy 2 Module | ND | |
| Hayashida et al., 2017 | IR sensor | AMG8831 | 4.5 mA* | XBee |
| | Microcontroller | Arduino Uno Rev.3 | 1.4 mA @ 3.3 V** | |
| | Transmitter | XBee transmitter | 50 mA @ 3.3 V** | |
| Liu et al., 2020 | IR sensor | AMG8853 | 4.5 mA* | USB |
| Mashiyama et al., 2014 | IR sensor | AMG8831 | 4.5 mA* | USB |
| | Microcontroller | Arduino Uno Rev.3 | ND | |
| Mashiyama et al., 2015 | IR sensor | AMG8831 | 4.5 mA* | USB |
| | Microcontroller | Arduino Uno Rev.3 | ND | |
| Ogawa & Naito, 2020 | IR sensor | MLX90621 | 9 mA* | ND |
| | Microcontroller | Raspberry Pi 3 Model B | ND | |
| Shelke & Aksanli, 2019 | IR sensor | MLX90621 | 9 mA* | Wi-Fi/MQTT |
| | Microcontroller 1 | Arduino (unspecified) | ND | |
| | Microcontroller 2 | Raspberry Pi 3 Model B | ND | |
| Sixsmith & Johnson, 2004 | IR sensor | Irisys 16 × 16 | ND | ND |
| Taniguchi et al., 2014 | IR sensor | D6T-1616-L-06 | ND | Wired connection |
| Tao et al., 2018 and 2019 | IR sensor | AMG88xx (unspecified) | 4.5 mA* | ND |
| Taramasco et al., 2018 and 2020 | IR sensor | D6T-8L-06 | ND | UART (PC)/3G (smartphone) |
| | Microcontroller 1 | ATMEGA328P | ND | |
| | Microcontroller 2 | ODROID-C1+ single-board computer | ND | |
| | Transmitter | 3G modem | ND | |
Notes: *Manufacturer specifications. **Specified by the authors.
Abbreviations: IR, infrared; mA, milliampere; ND, not disclosed; V, volt.
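For reference, the Panasonic Grid-EYE (AMG88xx) sensors that dominate the table above output each pixel as a 12-bit two's-complement value at 0.25 °C per LSB, read as a low/high byte pair. A minimal decoder sketch; the helper names are ours, and the exact register map should be taken from the Panasonic datasheet:

```python
def grideye_pixel_to_celsius(lo, hi):
    """Decode one Grid-EYE pixel: 12-bit two's complement, 0.25 degC per LSB.
    `lo` is the low byte and `hi` the high byte of the pixel register pair."""
    raw = ((hi & 0x0F) << 8) | lo          # assemble the 12-bit value
    if raw & 0x800:                        # sign bit set -> negative temperature
        raw -= 0x1000
    return raw * 0.25

def decode_frame(buf):
    """Decode a 128-byte pixel buffer (64 pixels x 2 bytes, low byte first)."""
    return [grideye_pixel_to_celsius(buf[i], buf[i + 1])
            for i in range(0, len(buf), 2)]

print(grideye_pixel_to_celsius(0x64, 0x00))  # 0x064 = 100 -> 25.0 degC
print(grideye_pixel_to_celsius(0xFF, 0x0F))  # 0xFFF = -1  -> -0.25 degC
```

At roughly 4.5 mA for the sensor itself, the appeal noted in the power-consumption column is that such frames can be produced continuously by battery- or microcontroller-powered setups.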