
Deep Learning for RFID-Based Activity Recognition.

Xinyu Li, Yanyi Zhang, Ivan Marsic, Aleksandra Sarcevic, Randall S Burd.

Abstract

We present a system for activity recognition from passive RFID data using a deep convolutional neural network. Instead of selecting features or using a cascade structure that first detects object use from RFID data and then predicts the activity, we feed the RFID data directly into a deep convolutional neural network. Because our system treats activity recognition as a multi-class classification problem, it scales to applications with a large number of activity classes. We tested our system using RFID data collected in a trauma room, including 14 hours of data from 16 actual trauma resuscitations. Our system outperformed existing systems developed for activity recognition and achieved performance on process-phase detection similar to that of systems requiring wearable sensors or manually generated input. We also analyzed the strengths and limitations of our current deep learning architecture for activity recognition from RFID data.
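The core idea above — feeding raw passive-RFID signal data (e.g., per-tag RSSI over time) directly into a convolutional network with a multi-class softmax output, rather than hand-selecting features or cascading object-use detection — can be sketched as a minimal forward pass. All shapes, layer sizes, and the single-conv-layer structure here are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical input: RSSI readings from 10 passive tags over 64 time steps,
# fed to the network directly, with no hand-selected features.
num_tags, time_steps, num_classes = 10, 64, 7
x = rng.normal(size=(num_tags, time_steps))

# One 1D convolutional layer: 16 filters of width 5 slide along the time
# axis, each filter spanning all tag channels (weights random here).
num_filters, width = 16, 5
w = rng.normal(scale=0.1, size=(num_filters, num_tags, width))
b = np.zeros(num_filters)

out_len = time_steps - width + 1
conv = np.zeros((num_filters, out_len))
for f in range(num_filters):
    for t in range(out_len):
        conv[f, t] = np.sum(w[f] * x[:, t:t + width]) + b[f]
conv = np.maximum(conv, 0.0)   # ReLU nonlinearity

pooled = conv.mean(axis=1)     # global average pooling over time

# Dense layer + softmax: one probability per activity class, making the
# task a single multi-class classification rather than a cascade.
w_fc = rng.normal(scale=0.1, size=(num_classes, num_filters))
logits = w_fc @ pooled
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(probs.shape)  # (7,)
```

Because the output is a single softmax over activity classes, adding a new activity only widens the final layer, which is what makes the multi-class formulation scale to many classes.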


Keywords:  Activity recognition; convolutional neural network; deep learning; passive RFID; process phase detection

Year:  2016        PMID: 30381808      PMCID: PMC6205502          DOI: 10.1145/2994551.2994569

Source DB:  PubMed          Journal:  Proc Int Conf Embed Netw Sens Syst


References:  5 in total

1.  Comparison of the predicted and observed secondary structure of T4 phage lysozyme.

Authors:  B W Matthews
Journal:  Biochim Biophys Acta       Date:  1975-10-20

2.  Object detection with discriminatively trained part-based models.

Authors:  Pedro F Felzenszwalb; Ross B Girshick; David McAllester; Deva Ramanan
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2010-09       Impact factor: 6.226

3.  Modeling and online recognition of surgical phases using Hidden Markov Models.

Authors:  Tobias Blum; Nicolas Padoy; Hubertus Feussner; Nassir Navab
Journal:  Med Image Comput Comput Assist Interv       Date:  2008

4.  Automatic phase prediction from low-level surgical activities.

Authors:  Germain Forestier; Laurent Riffaud; Pierre Jannin
Journal:  Int J Comput Assist Radiol Surg       Date:  2015-04-23       Impact factor: 2.924

5.  [Review] Deep learning.

Authors:  Yann LeCun; Yoshua Bengio; Geoffrey Hinton
Journal:  Nature       Date:  2015-05-28       Impact factor: 49.962

Cited by:  6 in total

1.  Progress Estimation and Phase Detection for Sequential Processes.

Authors:  Xinyu Li; Yanyi Zhang; Jianyu Zhang; Moliang Zhou; Shuhong Chen; Yue Gu; Yueyang Chen; Ivan Marsic; Richard A Farneth; Randall S Burd
Journal:  Proc ACM Interact Mob Wearable Ubiquitous Technol       Date:  2017-09

2.  Video-based Concurrent Activity Recognition for Trauma Resuscitation.

Authors:  Yanyi Zhang; Yue Gu; Ivan Marsic; Yinan Zheng; Randall S Burd
Journal:  IEEE Int Conf Healthc Inform       Date:  2021-03-12

3.  Edge-Based Transfer Learning for Classroom Occupancy Detection in a Smart Campus Context.

Authors:  Lorenzo Monti; Rita Tse; Su-Kit Tang; Silvia Mirri; Giovanni Delnevo; Vittorio Maniezzo; Paola Salomoni
Journal:  Sensors (Basel)       Date:  2022-05-12       Impact factor: 3.847

4.  Multimodal Attention Network for Trauma Activity Recognition from Spoken Language and Environmental Sound.

Authors:  Yue Gu; Ruiyu Zhang; Xinwei Zhao; Shuhong Chen; Jalal Abdulbaqi; Ivan Marsic; Megan Cheng; Randall S Burd
Journal:  IEEE Int Conf Healthc Inform       Date:  2019-11-21

5.  [Review] State-of-the-art of situation recognition systems for intraoperative procedures.

Authors:  D Junger; S M Frommer; O Burgert
Journal:  Med Biol Eng Comput       Date:  2022-02-17       Impact factor: 2.602

6.  Human activity recognition in artificial intelligence framework: a narrative review.

Authors:  Neha Gupta; Suneet K Gupta; Rajesh K Pathak; Vanita Jain; Parisa Rashidi; Jasjit S Suri
Journal:  Artif Intell Rev       Date:  2022-01-18       Impact factor: 9.588

