OBJECTIVE: People are increasingly encouraged to self-manage their chronic conditions; however, many struggle to do so effectively. Most studies that investigate patient work (ie, the tasks involved in self-management and the contexts influencing them) rely on self-reports, which are subject to recall and other biases. Few studies have used wearable cameras and deep learning to capture and classify patient work activities automatically. MATERIALS AND METHODS: We propose a deep learning approach to classify activities of patient work captured by wearable cameras, enabling self-management routines to be studied more effectively. Twenty-six people with type 2 diabetes and comorbidities wore a wearable camera for a day, generating more than 400 h of video across 12 daily activities. To classify these video images, we developed a weighted ensemble network that combines Linear Discriminant Analysis, Deep Convolutional Neural Networks, and Object Detection algorithms. The performance of our model was assessed using Top-1 and Top-5 metrics and compared against manual classification conducted by 2 independent researchers. RESULTS: Across the 12 daily activities, our model achieved the best average Top-1 and Top-5 scores of 81.9 and 86.8, respectively. It also outperformed non-ensemble techniques on Top-1 and Top-5 scores for most activity classes, demonstrating the value of weighted ensemble techniques. CONCLUSIONS: Deep learning can automatically classify daily activities of patient work captured by wearable cameras with high accuracy. Wearable cameras combined with a deep learning approach offer an alternative way to investigate patient work, one not subject to the biases commonly associated with self-report methods.
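As an illustrative sketch only (not the authors' implementation), the two ideas central to the evaluation above — combining per-model class probabilities by a weighted average, and scoring with Top-1/Top-k accuracy — can be expressed in a few lines of NumPy. The function names and example weights here are hypothetical:

```python
import numpy as np

def weighted_ensemble(prob_list, weights):
    """Combine per-model class-probability matrices (n_samples x n_classes)
    by a weighted average. Weights are normalised to sum to 1."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    stacked = np.stack(prob_list)            # (n_models, n_samples, n_classes)
    return np.tensordot(w, stacked, axes=1)  # (n_samples, n_classes)

def top_k_accuracy(probs, labels, k):
    """Fraction of samples whose true label is among the k highest-scoring classes
    (k=1 gives Top-1, k=5 gives Top-5)."""
    topk = np.argsort(probs, axis=1)[:, -k:]
    return float(np.mean([labels[i] in topk[i] for i in range(len(labels))]))
```

A model that is more reliable on held-out data would receive a larger weight, so its probability estimates dominate the ensemble's prediction for contested frames.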