Mustafa Atee, Kreshnik Hoti, Jeffery D Hughes.
Abstract
Background: Pain is prevalent in dementia, particularly in the advanced stages or in those who are unable to verbalize it. Uncontrolled pain alters behavior in patients with dementia, leaving them perturbed, unsettled, and devitalized. Current measures for assessing pain in this population are inadequate and underutilized in clinical practice because they lack systematic evaluation and innovative design. Objective: To describe a novel method and system of pain assessment for people with advanced dementia using a combination of technologies: automated facial recognition and analysis (AFRA), smart computing, affective computing, and cloud computing (Internet of Things).
Keywords: PainChek™; artificial intelligence; automated facial recognition; dementia; pain assessment system; smart device application; technology
Year: 2018 PMID: 29946251 PMCID: PMC6006917 DOI: 10.3389/fnagi.2018.00117
Source DB: PubMed Journal: Front Aging Neurosci ISSN: 1663-4365 Impact factor: 5.750
Glossary of technical terms used.
| Cognification | The process of making objects smarter by combining, connecting, and/or integrating two or more technologies, one of which is AI (Kelly, |
| Artificial Intelligence (AI) | “The scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines” (AAAI, |
| Smart computing | A generation of integrated hardware, software, and network technologies that provide IT systems with real-time awareness of the real world and advanced analytics to help people make more intelligent decisions about alternatives and actions that will optimize processes (Bartels et al., |
| Cloud computing | A model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction (Mell and Grance, |
| Affective computing | Computing that relates to, arises from, or influences emotions (Picard, |
| Internet of Things (IoT) | The networked interconnection of everyday objects, which are often equipped with ubiquitous intelligence. Also known as “Internet of Objects” (Xia et al., |
| Deep learning | A pattern recognition technique that allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction (LeCun et al., |
| Smart device | An electronic or digital mobile device that has advanced computational processing power, possesses multiple capabilities (e.g., voice and video communication, data storage), and can operate independently or interactively by being linked to other devices or networks via various wireless connections, e.g., Wi-Fi, Bluetooth (Bamidis et al., |
| Android | A mobile operating system designed primarily for touchscreen devices such as smartphones and tablet computers, and for other electronics such as smart televisions (Android TV) and smart watches (Bamidis et al., |
| iOS | A mobile operating system developed by Apple, which works in a similar way to Android. |
Glossary of terms used to describe psychometric and clinimetric properties of pain assessment tools.
| Validity | The degree to which an instrument measures what it is intended to measure (Polit and Hungler, |
| Concurrent validity | The degree to which scores on an instrument are correlated with some external criterion, measured at the same time (Polit and Hungler, |
| Discriminant validity | An approach to construct validation that involves assessing the degree to which a single method of measuring two constructs yields different results (i.e., discriminates the two; Polit and Hungler, |
| Predictive validity | The degree to which an instrument can predict some criterion observed at a future time (Polit and Hungler, |
| Reliability | The degree of consistency or dependability (i.e., repeatability) with which an instrument measures the attribute it is designed to measure. |
| Interrater reliability | The degree to which two raters or observers, operating independently, assign the same ratings or values for an attribute being measured (Polit and Hungler, |
| Test-retest reliability | A procedure used to determine the stability of measurements over time (Waltz et al., |
| Internal consistency | The degree to which two or more measures are essentially measuring the same construct (Portney and Mary, |
| Sensitivity (SE) | Probability that a test result will be positive when the disease is present (true positive rate; Altman et al., |
| Specificity (SP) | Probability that a test result will be negative when the disease is not present (true negative rate; Altman et al., |
| Accuracy | Overall probability that a patient will be correctly classified (Altman et al., |
| Clinical utility | The usefulness of the measure for decision making (van Herk et al., |
| Clinical Utility Index (CUI) | The overall value of a test for combined screening and case finding (Mitchell, |
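The diagnostic-accuracy terms in the glossary above (sensitivity, specificity, accuracy, and the Clinical Utility Index) can be illustrated with a short, self-contained sketch. This is not code from the PainChek™ system; the function name and example counts are hypothetical, and the CUI follows Mitchell's formulation (case-finding CUI+ = sensitivity × positive predictive value; screening CUI− = specificity × negative predictive value):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Compute diagnostic-accuracy metrics from a 2x2 confusion matrix.

    tp/fp/fn/tn are counts of true/false positives/negatives versus a
    reference standard (e.g., an established observational pain scale).
    """
    se = tp / (tp + fn)                     # sensitivity: true positive rate
    sp = tn / (tn + fp)                     # specificity: true negative rate
    acc = (tp + tn) / (tp + fp + fn + tn)   # overall probability of correct classification
    ppv = tp / (tp + fp)                    # positive predictive value
    npv = tn / (tn + fn)                    # negative predictive value
    return {
        "SE": se,
        "SP": sp,
        "accuracy": acc,
        "CUI+": se * ppv,  # value of the test for case finding
        "CUI-": sp * npv,  # value of the test for screening (rule-out)
    }
```

For example, `diagnostic_metrics(45, 5, 10, 40)` yields a sensitivity of 45/55 ≈ 0.82 and an overall accuracy of 0.85.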
Clinical and technical characteristics of the PainChek™ system.
| Smart device enabled application | A point-of-care mobile application, which consists of: |
| Web administration portal (WAP) | A secure website that allows the management of patient and user data |
| Smart device | A smart phone or tablet to deliver point of care pain assessments, and to capture temporal patterns of pain scores |
| PC or smart device | A computing device for WAP access |
| Operating system of the App | Android or iOS |
| Operation procedure |
(1) Download and install the App; (2) log in and set up a user profile; (3) enter details for a new patient or select an existing patient. |
| Time to set up the App | The iOS PainChek App is a 55 MB download. Assuming a download speed of 40 Mbps, the average speed of a mobile connection in Australia as of mid-2017 (as reported by |
| Administration skills |
(1) Familiarity with the use of a smart device; (2) familiarity with the patient undergoing assessment; (3) basic knowledge of pain behaviors in dementia. |
| Target users | Clinicians and carers |
| Training needs |
(1) User competence in the use of smart device technology and operation of the App; (2) clinical competence in the tool's contents, domains, and descriptors. |
| Training resources |
(1) Video tutorials accessible through the PainChek website; (2) FAQs (text and illustrative pictures) accessible through the website; (3) face-to-face workshops for enterprise users. |
| All materials are currently available in English, but other languages are planned. |
| Pain scale | 42 items distributed across 6 domains: Face, Voice, Movement, Behavior, Activity, and Body |
| Pain chart | A graphical representation of pain scores over a period of time |
| Pain assessment log | A list of completed pain assessments with their corresponding times and dates |
| Patient database | A local repository of patients' data including demographics |
| Medications and therapies log | A local repository of medications and therapies of each patient |
| Front camera mode | Automated facial analysis using the front camera of a smart device |
| Back camera mode | Automated facial analysis using the back camera of a smart device |
| Manual mode | Manually completed facial assessment (optional) |
| Scoring format | Binary (yes/no) checklist |
| Scoring instructions |
(1) Observe the patient; (2) use the AFRA in the Face domain to detect facial action unit descriptors; (3) complete the corresponding checklists for the remaining non-facial domains; (4) the App automatically calculates a pain intensity score, which falls into one of the pain category bands below. |
| Scoring interpretation (total pain scores) | 0-6 (No Pain), 7-11 (Mild Pain), 12-15 (Moderate Pain), ≥ 16 (Severe Pain) |
| Ideal conditions of pain assessments |
(1) Assess pain at rest (e.g., sitting) and immediately after movement (e.g., repositioning); (2) assess and re-assess (e.g., 1 h post-intervention). |
| Time to complete scoring of total scale | ≤ 1 min |
| Time to complete scoring of the Face domain (automated) | 3 s |
| Study 1 (Atee et al., | Design: prospective observational study; Setting: RACFs; Sampling: purposive convenience; Timeline: 13 weeks; N: 40; Age: 60–98 years |
| Study 2 (Atee et al., | Design: prospective observational study; Setting: RACFs; Sampling: purposive convenience; Timeline: 10 weeks; N: 34; Age: 68–93 years |
| Study 3 (Hoti et al., | Design: |
| Concurrent validity | Excellent |
| Discriminant validity | Good (regression model not significantly influenced by the timing of the assessment, i.e., at rest vs. with movement) |
| Internal consistency reliability | Excellent homogeneity |
| Inter-rater reliability | Good-to-excellent |
| Test-retest reliability | Excellent |
| Predictive validity | Good based on the following data: |
| Clinical utility | Excellent based on the following data: |
| Accuracy | Excellent based on the following data: |
| Browser compatibility | Chrome (version 59.0 or later), Mozilla Firefox (version 54.0 or later), Internet Explorer (version 11 or later) |
| Operating system | Windows (7 or later), or Macintosh (OS X Mavericks 10.9 or later) |
| Data hosting product | Amazon Elastic Compute Cloud (Amazon EC2) (AWS, |
N, number of subjects with moderate to severe dementia; RACFs, residential aged care facilities; r, Pearson's correlation coefficient; α, Cronbach's alpha; κ, kappa coefficient.
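The scoring scheme in the table above (a binary 42-item checklist summed into a total score, then mapped to category bands) can be sketched as a minimal function. The function name is hypothetical; the score range (0–42, one point per checked item) and the cut-offs follow the published bands:

```python
def pain_band(total_score: int) -> str:
    """Map a PainChek total score (sum of a 42-item yes/no checklist,
    so 0-42) to the published pain category bands:
    0-6 No Pain, 7-11 Mild Pain, 12-15 Moderate Pain, >= 16 Severe Pain."""
    if not 0 <= total_score <= 42:
        raise ValueError("total score must be between 0 and 42")
    if total_score <= 6:
        return "No Pain"
    if total_score <= 11:
        return "Mild Pain"
    if total_score <= 15:
        return "Moderate Pain"
    return "Severe Pain"
```

For example, `pain_band(13)` returns "Moderate Pain".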
Figure 2(A) PainChek™ pain assessment tool-The Face (Domain 1). (B) PainChek™ pain assessment tool-The Voice (Domain 2). (C) PainChek™ pain assessment tool-The Movement (Domain 3). (D) PainChek™ pain assessment tool-The Behavior (Domain 4). (E) PainChek™ pain assessment tool-The Activity (Domain 5). (F) PainChek™ pain assessment tool-The Body (Domain 6). (G) PainChek™ pain assessment tool-Summary screen. (H) PainChek™ pain assessment tool–Saving assessment. (I) PainChek™ App—“Dashboard” screen. (J) PainChek™ App—“Assessments” log. (K) PainChek™ App—“Pain Chart.” (L) PainChek™ App—“Pain Relief” list. (M) PainChek™ App—“Comments” section.
Figure 3PainChek™ Web Admin Portal (WAP).
Figure 4“Clinical Guide” handout.
Figure 1Conceptual model of the PainChek™ system.