Lessons from a multicenter clinical trial with an approved wearable electrocardiogram: issues and practical considerations.

Ki Young Huh1, Sae Im Jeong1, Hyounggyoon Yoo2, Meihua Piao3, Hyeongju Ryu4, Heejin Kim5, Young-Ran Yoon6, Sook Jin Seong6, SeungHwan Lee1, Kyung Hwan Kim7.   

Abstract

Although wearable electrocardiograms (ECGs) are being increasingly applied in clinical settings, validation methods have not been standardized. As an exploratory evaluation, we performed a multicenter clinical trial implementing an approved wearable patch ECG. Healthy male adults were enrolled in 2 study centers. The approved ECGs were deployed for 6 hours, and pulse rates were measured independently with conventional pulse oximetry at selected time points for correlation analyses. The transmission status of the data was evaluated from the heart rates and classified into valid, invalid, and missing. A total of 55 subjects (40 in center 1 and 15 in center 2) completed the study. Overall, 77.40% of heart rates were within the valid range. Invalid and missing data accounted for 1.42% and 21.23%, respectively. There were significant differences in valid and missing data between centers. The proportion of missing data in center 1 (24.77%) was more than twice that in center 2 (11.77%). Heart rates measured by the wearable ECG and conventional pulse oximetry showed a poor correlation (intraclass correlation coefficient = 0.0454). In conclusion, we evaluated the multicenter feasibility of implementing wearable ECGs. The results suggest that systems to mitigate multicenter discrepancies and remove artifacts should be implemented prior to performing a clinical trial. Trial Registration: ClinicalTrials.gov Identifier: NCT05182684.
Copyright © 2022 Translational and Clinical Pharmacology.

Keywords:  Clinical Trial; Multicenter Study; Wearable Electronic Devices

Year:  2022        PMID: 35800668      PMCID: PMC9253449          DOI: 10.12793/tcp.2022.30.e7

Source DB:  PubMed          Journal:  Transl Clin Pharmacol        ISSN: 2289-0882


INTRODUCTION

Wearable electrocardiograms (ECGs) are increasingly applied for the detection of cardiac arrhythmias [1]. Smartphones [2] or smartwatches integrated with photoplethysmographic (PPG) sensors [3] are easily accessible wearable ECGs. Despite their convenience in everyday settings, wearable ECGs based on PPG sensors do not accurately distinguish several clinically important arrhythmias, such as atrial fibrillation and premature beats [1]. In contrast, wearable ECGs based on direct electrocardiographic recordings monitor a more comprehensive range of cardiac events [1], and most cardiologists prefer them over PPG sensors, especially for the diagnosis of atrial fibrillation [4]. Contrary to the widespread clinical use of wearable ECGs, methods for their clinical validation have not been standardized [5,6]. A recent review of wearable devices in elderly individuals showed that data collection and analysis methods were considerably heterogeneous between studies [7]. Another study noted that several validation studies were prone to bias because of small patient numbers [6]. We also noted that the number of subjects varied greatly between studies, ranging from fewer than 10 [8] to thousands [9]; one study used only a single subject [10].
Our attention focused on possible issues during the implementation of wearable ECGs in multicenter clinical trials. Multicenter clinical trials face various issues before and during the trials [11]. These issues also pertain to clinical trials using wearable ECGs, where data acquisition depends on the particular settings of each study center, such as its wireless networks. Staff training and retention issues may also arise, especially when a study center is not fully dedicated to clinical trials [12]. These issues may be an unexpected cause of data loss or low data quality, which hinders data interpretation.
These issues are intensified in decentralized clinical trials, where widespread implementation of wearable devices in diverse circumstances is expected [13]. In the present study, we performed a multicenter clinical trial using an approved wearable patch ECG. We previously developed a multipurpose clinical trial platform (i.e., a smart clinical trial platform) tailored to decentralized clinical trials; the present study specifically focused on an exploratory evaluation of wearable ECGs in multicenter settings.

METHODS

Study subjects and design

Healthy male adults without any contraindications for the attachment of patch electrodes were eligible for the study. Written informed consent was obtained prior to any study-related procedures. The study was performed independently in 2 study centers. The wearable ECG patches were deployed for 6 hours, and subjects tracked their daily activities (e.g., having lunch) in an activity log during the measurement period. Pulse rates were simultaneously measured using conventional pulse oximetry (CARESCAPE Monitor B650; GE Healthcare Finland Oy, Helsinki, Finland) at 2 selected time points (i.e., 15 minutes after deployment of the device and 15 minutes before completion of the measurement) to evaluate the concordance of the results. After completion of the measurements, subjects completed a user experience questionnaire adapted from validated user experience questionnaires [14,15]. The study was performed in accordance with the Declaration of Helsinki and Good Clinical Practice. The institutional review boards of Seoul National University Hospital and Kyungpook National University Hospital approved the study (ClinicalTrials.gov registration No. NCT05182684).

Data collection

The wearable ECG (VP-100; Tribell Lab, Gyeongsan, Korea) was approved by the Ministry of Food and Drug Safety. The VP-100 weighed 50 g and was attached to the anterior chest wall with a single-lead interface (Supplementary Fig. 1). The sampling rate was 250 samples/sec, and communication with the dedicated software was performed within 10 m using Bluetooth version 2.1. The VP-100 transmitted ECG signals to the dedicated software every 4 msec, corresponding to a sweep speed of 25 mm/sec in the ECG strip on the screen. Heart rates were calculated per second. An Android mobile phone (Galaxy S8; Samsung Electronics, Suwon, Korea) streamed the processed ECG signals and heart rates to the streaming server on an epoch basis corresponding to 90 seconds. The streaming server displayed the ECG signals and heart rates of each device in the central management platform, and physicians in the clinical trial evaluated the ECG signals in real time. An automatic arrhythmia detection algorithm was not used because the study primarily evaluated the real-world transmission validity of the data (Supplementary Fig. 2).
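
As a rough illustration of the epoch-based streaming described above, the sketch below splits a 250-samples/sec stream into 90-second epochs (22,500 samples each). The function name and the list-based layout are assumptions for illustration only, not the device's actual firmware or software.

```python
SAMPLE_RATE_HZ = 250   # stated sampling rate of the VP-100
EPOCH_S = 90           # stated streaming epoch length in seconds

def to_epochs(samples):
    """Split a stream of ECG samples into fixed 90-second epochs.

    A partial final epoch is kept so no samples are dropped.
    """
    n = SAMPLE_RATE_HZ * EPOCH_S  # 22,500 samples per epoch
    return [samples[i:i + n] for i in range(0, len(samples), n)]
```

For example, a 3-minute recording (45,000 samples) yields exactly 2 epochs.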

Evaluation of transmission status

Each epoch of ECG data was sorted in timestamp order and appended to calculate data completeness. Total measurement time was defined as the interval between the start and end of the measurements, recorded separately by the investigators. Data were defined as ‘valid’ when the heart rate was positive and less than 200 beats per minute. When the calculated heart rate was outside this range, the data were labeled ‘invalid.’ When there was no record of heart rate at a scheduled time point, the data were labeled ‘missing.’ Data completeness was defined as the proportion (%) and duration (hours) of valid records during the total measurement time. The daily activities reported by subjects were overlaid on the measurement period. The transmission rates were compared between centers using Student’s t-test.
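
The classification rule above can be sketched as follows. This is a minimal illustration assuming one record per scheduled second, with `None` marking seconds where no record arrived; the function names are hypothetical, not taken from the study software.

```python
def classify_record(heart_rate):
    """Classify one per-second heart-rate record.

    'valid'   : 0 < heart rate < 200 beats/min
    'invalid' : a record exists but falls outside that range
    'missing' : no record arrived for the scheduled second (None here)
    """
    if heart_rate is None:
        return "missing"
    if 0 < heart_rate < 200:
        return "valid"
    return "invalid"

def completeness(records):
    """Percentage of each status over the total measurement time,
    given one entry per scheduled second."""
    counts = {"valid": 0, "invalid": 0, "missing": 0}
    for hr in records:
        counts[classify_record(hr)] += 1
    return {k: 100.0 * v / len(records) for k, v in counts.items()}
```

Under this scheme a record of exactly 0 or 200 beats/min counts as invalid, matching the "positive and less than 200" definition.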

Comparison between the wearable device and the conventional device

Heart rates measured by the wearable ECG (VP-100) and conventional pulse oximetry were time-matched for comparison. The investigators manually collected heart rates from conventional pulse oximetry at the 2 scheduled time points for each subject. To minimize the time discrepancy between devices, heart rates measured by the wearable ECG within 30 seconds before and after the target time point were obtained, and the median of these values was selected for comparison. Time-matched heart rates were analyzed for concordance using a scatter plot and the intraclass correlation coefficient (ICC) based on a 2-way random effects model. Bland–Altman analysis was performed using the mean and difference of the 2 measurements. The 95% limits of agreement were calculated as the mean of the Bland–Altman difference ± 1.96 × the standard deviation (SD) of the Bland–Altman difference. Statistical analyses were performed using R version 4.1.0 (R Foundation for Statistical Computing, Vienna, Austria).
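
The time-matching and limits-of-agreement computations described above can be sketched as below (the study's analyses were performed in R; this Python version is purely illustrative, and the `{timestamp: heart_rate}` layout and function names are assumptions).

```python
import statistics

def matched_hr(wearable, t_target, window_s=30):
    """Median wearable heart rate within ±window_s seconds of the target time.

    `wearable` is a hypothetical {timestamp_seconds: heart_rate} mapping.
    """
    vals = [hr for t, hr in wearable.items() if abs(t - t_target) <= window_s]
    return statistics.median(vals)

def bland_altman_limits(a, b):
    """Mean difference and 95% limits of agreement (mean ± 1.96 SD of the
    pairwise differences) between two paired measurement series."""
    diffs = [x - y for x, y in zip(a, b)]
    m = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return m, m - 1.96 * sd, m + 1.96 * sd
```

A point falls outside the limits of agreement when its difference exceeds the returned lower/upper bounds, which is how the discordant subjects in the Results are identified.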

RESULTS

Subject demographics

A total of 55 subjects (40 in center 1 and 15 in center 2) were enrolled and completed the study. The mean ± SD age was 31.2 ± 7.2 years, with no significant difference between the 2 centers.

Transmission status

Overall, 77.40% of heart rates were within the valid range; invalid and missing data accounted for 1.42% and 21.23%, respectively. Missing data were of 2 types: regular missing data lasting < 15 seconds and irregular missing data of longer duration. Daily activities were associated with fluctuations in heart rates, but fluctuations without daily activities were also observed. There were significant differences in valid and missing data between centers. The proportion of missing data in center 1 (24.77%) was more than twice that in center 2 (11.77%). The proportion of invalid data was also higher in center 1 (1.76%) than in center 2 (0.56%), but the difference was not statistically significant. In contrast, more data were valid in center 2 (87.66%) than in center 1 (73.55%) (Table 1, Fig. 1).
Table 1

Summary of data acquisition rate

Variables | Center 1 (n = 40) | Center 2 (n = 15) | Overall (n = 55) | p-value*
Valid (0 < heart rate < 200 beats/min)
  Duration (hr) | 4.43 ± 1.09 | 5.46 ± 0.45 | 4.71 ± 1.06 | 0.0008
  Proportion (%) | 73.55 ± 18.05 | 87.66 ± 9.38 | 77.40 ± 17.27 | 0.0058
Invalid
  Duration (hr) | 0.11 ± 0.15 | 0.04 ± 0.05 | 0.09 ± 0.14 | 0.0930
  Proportion (%) | 1.76 ± 2.56 | 0.56 ± 0.82 | 1.42 ± 2.27 | 0.0831
Missing
  Duration (hr) | 1.50 ± 1.15 | 0.78 ± 0.74 | 1.30 ± 1.09 | 0.0294
  Proportion (%) | 24.77 ± 18.87 | 11.77 ± 9.14 | 21.23 ± 17.69 | 0.0138

Values are presented as means ± standard deviation.

*Student’s t-test.

Figure 1

Transmission status of the heart rates for each subject. Subjects from center 1 have subject numbers starting with ‘S’, and subjects from center 2 have numbers starting with ‘K’.

Comparison of heart rates

Most of the valid heart rates were within the normal range (60–100 beats/min) in both centers, and the distribution of heart rates was right-skewed in both centers (Fig. 2). Heart rates measured by the wearable ECG (VP-100) and conventional pulse oximetry showed poor concordance (ICC = 0.0454, Fig. 3). Similar results were seen in the Bland–Altman plots: in several subjects, the difference between the 2 measurements was not within the limits of agreement, set as the mean difference ± 1.96 SD of the differences. Bland–Altman analysis showed that heart rates measured by the wearable ECG were on average 13.6 beats/min higher than their pulse oximetry counterparts and covered a wider range.
Figure 2

Distribution of the heart rates (A) and proportion of missing data (B) according to each center. Invalid heart rates are aggregated and marked as black bars in (A).

Figure 3

Comparison of the heart rates between the wearable electrocardiogram (VP-100) and conventional pulse oximetry: scatter plot (A) and Bland–Altman analysis results (B). The solid line in (A) represents the line of unity, and the dashed line represents the regression line. The solid line in (B) indicates the mean of the difference, and the dashed lines indicate the upper and lower limits of agreement. The difference between the 2 measurements in several subjects was not within the limits of agreement, indicating poor agreement between the 2 devices. The ICC is based on a 2-way random effects model.

ICC, intraclass correlation coefficient.


Patient experience

The subjects’ experience of the ease of using the device was generally positive. The mean ± SD scores for the items “The device was easy to use,” “It was easy for me to learn to use the device,” and “It was simple to link the device and its mobile app” were 5.8 ± 1.3, 6.0 ± 1.3, and 5.6 ± 1.5, respectively. Opinions on the overall design and usability in social settings were neutral: the mean ± SD scores for the items “The overall design of the device was favorable” and “I feel comfortable using this device in social settings” were 4.9 ± 1.6 and 4.8 ± 1.8, respectively. The subjects’ experience with the application was comparable to that with the device, except for the item “I could use the app even when the internet connection was poor or not available,” whose mean ± SD score was 3.7 ± 1.8 and for which negative opinions outnumbered positive ones (Table 2).
Table 2

Summary of the user experience

Items | 1. Strongly disagree | 2. Somewhat disagree | 3. Disagree | 4. Neutral | 5. Agree | 6. Somewhat agree | 7. Strongly agree | Missing | Total
Device
The device was easy to use. | 0 (0.0) | 1 (1.8) | 3 (5.5) | 4 (7.3) | 12 (21.8) | 12 (21.8) | 23 (41.8) | 0 (0.0) | 5.8 ± 1.3
It was easy for me to learn to use the device. | 0 (0.0) | 1 (1.8) | 2 (3.6) | 6 (10.9) | 5 (9.1) | 15 (27.3) | 26 (47.3) | 0 (0.0) | 6.0 ± 1.3
Whenever I made a mistake using the device, I could recover easily and quickly. | 2 (3.6) | 2 (3.6) | 4 (7.3) | 11 (20.0) | 11 (20.0) | 10 (18.2) | 13 (23.6) | 2 (3.6) | 5.1 ± 1.6
It was simple to link the device and its mobile app. | 1 (1.8) | 1 (1.8) | 2 (3.6) | 9 (16.4) | 10 (18.2) | 11 (20.0) | 20 (36.4) | 1 (1.8) | 5.6 ± 1.5
I felt comfortable while wearing the device. | 0 (0.0) | 0 (0.0) | 6 (10.9) | 9 (16.4) | 8 (14.5) | 13 (23.6) | 19 (34.5) | 0 (0.0) | 5.5 ± 1.4
The overall design of the device was favorable. | 1 (1.8) | 2 (3.6) | 6 (10.9) | 18 (32.7) | 10 (18.2) | 4 (7.3) | 14 (25.5) | 0 (0.0) | 4.9 ± 1.6
Over time, the electrode of the device was well-attached to the skin. | 2 (3.6) | 3 (5.5) | 2 (3.6) | 12 (21.8) | 6 (10.9) | 12 (21.8) | 17 (30.9) | 1 (1.8) | 5.2 ± 1.7
The battery was enough for use. | 1 (1.8) | 2 (3.6) | 5 (9.1) | 8 (14.5) | 13 (23.6) | 7 (12.7) | 18 (32.7) | 1 (1.8) | 5.3 ± 1.6
I feel comfortable using this device in social settings. | 2 (3.6) | 5 (9.1) | 5 (9.1) | 10 (18.2) | 12 (21.8) | 8 (14.5) | 13 (23.6) | 0 (0.0) | 4.8 ± 1.8
Application
The app was easy to use. | 0 (0.0) | 1 (1.8) | 4 (7.3) | 8 (14.5) | 7 (12.7) | 14 (25.5) | 21 (38.2) | 0 (0.0) | 5.7 ± 1.4
It was easy for me to learn to use the app. | 0 (0.0) | 1 (1.8) | 2 (3.6) | 7 (12.7) | 8 (14.5) | 15 (27.3) | 22 (40.0) | 0 (0.0) | 5.8 ± 1.3
The navigation was consistent when moving between screens. | 0 (0.0) | 3 (5.5) | 3 (5.5) | 7 (12.7) | 12 (21.8) | 13 (23.6) | 17 (30.9) | 0 (0.0) | 5.5 ± 1.5
The interface of the app allowed me to use all the functions (such as entering information, responding to reminders, viewing information) offered by the app. | 0 (0.0) | 1 (1.8) | 3 (5.5) | 15 (27.3) | 15 (27.3) | 11 (20.0) | 9 (16.4) | 1 (1.8) | 5.1 ± 1.2
Whenever I made a mistake using the app, I could recover easily and quickly. | 2 (3.6) | 2 (3.6) | 6 (10.9) | 8 (14.5) | 13 (23.6) | 10 (18.2) | 13 (23.6) | 1 (1.8) | 5.0 ± 1.6
I like the interface of the app. | 0 (0.0) | 0 (0.0) | 5 (9.1) | 9 (16.4) | 11 (20.0) | 15 (27.3) | 14 (25.5) | 1 (1.8) | 5.4 ± 1.3
The information in the app was well organized, so I could easily find the information I needed. | 0 (0.0) | 0 (0.0) | 4 (7.3) | 11 (20.0) | 11 (20.0) | 13 (23.6) | 14 (25.5) | 2 (3.6) | 5.4 ± 1.3
The app adequately acknowledged and provided information to let me know the progress of my action. | 1 (1.8) | 0 (0.0) | 3 (5.5) | 6 (10.9) | 11 (20.0) | 17 (30.9) | 16 (29.1) | 1 (1.8) | 5.6 ± 1.3
I feel comfortable using this app in social settings. | 3 (5.5) | 3 (5.5) | 2 (3.6) | 10 (18.2) | 13 (23.6) | 12 (21.8) | 12 (21.8) | 0 (0.0) | 5.0 ± 1.7
The amount of time involved in using this app has been fitting for me. | 0 (0.0) | 0 (0.0) | 4 (7.3) | 9 (16.4) | 7 (12.7) | 20 (36.4) | 15 (27.3) | 0 (0.0) | 5.6 ± 1.3
I would use this app again. | 0 (0.0) | 0 (0.0) | 6 (10.9) | 15 (27.3) | 12 (21.8) | 10 (18.2) | 11 (20.0) | 1 (1.8) | 5.1 ± 1.3
Overall, I am satisfied with this app. | 0 (0.0) | 0 (0.0) | 2 (3.6) | 14 (25.5) | 13 (23.6) | 10 (18.2) | 15 (27.3) | 1 (1.8) | 5.4 ± 1.3
The app would be useful for my health and well-being. | 0 (0.0) | 1 (1.8) | 1 (1.8) | 8 (14.5) | 12 (21.8) | 11 (20.0) | 22 (40.0) | 0 (0.0) | 5.8 ± 1.3
The app improved my access to health care services. | 0 (0.0) | 1 (1.8) | 1 (1.8) | 5 (9.1) | 11 (20.0) | 13 (23.6) | 24 (43.6) | 0 (0.0) | 5.9 ± 1.2
The app helped me manage my health effectively. | 0 (0.0) | 1 (1.8) | 1 (1.8) | 8 (14.5) | 10 (18.2) | 14 (25.5) | 21 (38.2) | 0 (0.0) | 5.8 ± 1.3
This app has all the functions and capabilities I expect it to have. | 1 (1.8) | 1 (1.8) | 4 (7.3) | 13 (23.6) | 13 (23.6) | 12 (21.8) | 11 (20.0) | 0 (0.0) | 5.1 ± 1.4
I could use the app even when the internet connection was poor or not available. | 9 (16.4) | 7 (12.7) | 5 (9.1) | 16 (29.1) | 9 (16.4) | 5 (9.1) | 3 (5.5) | 1 (1.8) | 3.7 ± 1.8
This app provided an acceptable way to receive health care services, such as accessing educational materials, tracking my own activities, and performing self-assessment. | 0 (0.0) | 1 (1.8) | 4 (7.3) | 13 (23.6) | 8 (14.5) | 13 (23.6) | 16 (29.1) | 0 (0.0) | 5.4 ± 1.4

Values are presented as the number of responders (percentage of responders) or mean ± standard deviation.

DISCUSSION

The present study obtained approximately 77% valid data during the 6-hour evaluation of 55 subjects. There was a significant difference in the data acquisition rate between the 2 centers, with the proportion of missing data showing the largest difference. The correlation of heart rates between the wearable ECG and conventional pulse oximetry was low because of several inappropriately estimated heart rates from the wearable ECG. Subjects’ experiences with the device were generally positive, except for use of the application during intermittent loss of the internet connection. The distribution of heart rates in our study was comparable to a previous study that evaluated 2 wearable ECGs in phase 1 clinical trial settings [8]. We found that most of the heart rates over 150 beats/min were associated with artifacts (Fig. 4), as also reported previously [8]. Such artifacts are a challenging issue in wearable ECGs and may be confused with clinically significant arrhythmias, such as atrial fibrillation [16].
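
As a toy illustration of how such high-rate artifacts could be screened, the sketch below flags per-second heart rates that exceed 150 beats/min or jump abruptly between consecutive seconds. The thresholds and function name are illustrative assumptions, not part of the study protocol or the device's algorithms.

```python
def flag_artifacts(hr_series, hr_max=150, max_jump=30):
    """Heuristic per-second artifact flags.

    A record is flagged when the rate exceeds hr_max beats/min or when it
    differs from the previous second by more than max_jump beats/min.
    Thresholds are illustrative only.
    """
    flags = []
    prev = None
    for hr in hr_series:
        jump = prev is not None and abs(hr - prev) > max_jump
        flags.append(hr > hr_max or jump)
        prev = hr
    return flags
```

In practice such flags would only triage records for review; they cannot distinguish a motion artifact from a genuine tachyarrhythmia on their own.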
Figure 4

Sample wearable electrocardiogram strips with sufficient quality (A) and poor quality (B).

ECG, electrocardiogram.

Artifacts in ECGs may be classified as physiological or nonphysiological by their source [17]. Physiological sources include electromyographic and epidermal signals; nonphysiological sources include power line interference and motion artifacts [17]. Many of the artifacts in our study were associated with moderate daily activities, as shown in Fig. 5. This finding is consistent with a study showing that the accuracy of real-time heart rate monitoring declined with the intensity of exercise [18].
Figure 5

Visualization of the records of a subject in center 2. Dots represent heart rates estimated per second. Invalid heart rates are marked as red dots. Gray and blue shades indicate the time intervals when data were missing and the subject performed physical activity, respectively.

These findings strongly suggest that artifact rejection methods should be integrated into wearable ECGs. Although the VP-100 satisfied the IEC 60601-2-25 standard, its artifact rejection was insufficient. The bandpass filter implemented in the wearable ECG [19] only partially processed baseline wander and 60-Hz power line interference, and electromyographic signals and motion artifacts were not properly processed. These issues resulted in improperly estimated or lost heart rates.

Desirable features of wearable ECGs include a reasonable form factor; wearable ECGs should show stable adhesion with minimal discomfort. An FDA-cleared patch ECG, the ZIO Patch (iRhythm Technologies, Inc., San Francisco, CA, USA), weighs 34 g with 14 days of data storage capacity [20]. The ZIO Patch is fully attached to the chest wall and provides stable adhesion [20]; these form factors may be associated with high patient satisfaction (93.7%) [21]. In contrast, with a weight of 50 g, the VP-100 depended on adhesion from disposable electrodes. These factors may lead to intermittent detachment, which was often reported by subjects in the study. Enhancement of form factors and localization systems [22] may be another necessary element to improve data quality.

The discrepancy in data acquisition between the 2 centers was another significant finding. We supposed that the absence of an automatic reconnection function was one possible cause. Several episodes of missing data were attributed to detachment of the device or loss of connection to the mobile phone. Because a lost connection was recovered manually by the investigators in each center, differences in the degree of engagement at each center may have resulted in differences in the duration of missing data. Other possible causes include different network settings in each center and the heterogeneity of the subject populations.
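
For illustration only, the baseline-wander component that such filtering targets can be approximated by subtracting a moving-average estimate of the baseline (a crude high-pass). This sketch is not the VP-100's actual filter, which is not disclosed here; real devices would use proper IIR/FIR bandpass designs plus a notch at the mains frequency (60 Hz in this study).

```python
def remove_baseline(ecg, fs=250, window_s=0.6):
    """Crude baseline-wander removal for illustration.

    Subtracts a centered moving average (window ~0.6 s at fs samples/sec)
    from each sample; window length is an illustrative assumption.
    """
    w = max(1, int(fs * window_s))
    out = []
    for i in range(len(ecg)):
        lo, hi = max(0, i - w // 2), min(len(ecg), i + w // 2 + 1)
        baseline = sum(ecg[lo:hi]) / (hi - lo)  # local mean = baseline estimate
        out.append(ecg[i] - baseline)
    return out
```

A constant (pure-baseline) input maps to zero everywhere, while fast QRS deflections are largely preserved; electromyographic noise and motion artifacts, however, overlap the ECG band and survive such simple filtering, which is the gap noted above.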
This finding was aligned with the negative opinions on the questionnaire item “I could use the app even when the internet connection was poor or not available.”

Discomfort is also an important concern in implementing wearable devices [23]. User acceptance of wearable devices varies according to the anatomical location and weight of the devices [24]. We found that subjects’ experience with the VP-100 was neutral (mean score of 4.8), which may be attributed to the uncomfortable anatomical position of attachment (the anterior chest wall).

Our study had some major limitations. The subject population consisted of healthy male adults, which limits the generalizability of the results. The comparison of heart rates against the conventional device may be biased by the small number of measurements, and a longer duration of evaluation would be required. The incorporation of artifact processing and arrhythmia detection algorithms is also needed in further investigations.

We evaluated in detail the possible issues in implementing wearable ECGs in multicenter settings. Because the evaluation methods for wearable ECGs in clinical trials have been considerably heterogeneous [7], the results of our study contribute to the selection of appropriate validation criteria. In conclusion, we evaluated the multicenter feasibility of implementing wearable ECGs. The results suggest that systems to mitigate multicenter discrepancies and remove artifacts should be implemented prior to performing a clinical trial.
Cited references (22 in total; 10 shown)

1.  Comparison of 24-hour Holter monitoring with 14-day novel adhesive patch electrocardiographic monitoring.

Authors:  Paddy M Barrett; Ravi Komatireddy; Sharon Haaser; Sarah Topol; Judith Sheard; Jackie Encinas; Angela J Fought; Eric J Topol
Journal:  Am J Med       Date:  2013-10-15       Impact factor: 4.965

2.  Lessons learned conducting a multi-center trial with a military population: The Tinnitus Retraining Therapy Trial.

Authors:  Roberta W Scherer; Leonora D Sensinger; Benigno Sierra-Irizarry; Craig Formby
Journal:  Clin Trials       Date:  2018-05-23       Impact factor: 2.486

3.  Heart rate measures from the Apple Watch, Fitbit Charge HR 2, and electrocardiogram across different exercise intensities.

Authors:  Elizabeth A Thomson; Kayla Nuss; Ashley Comstock; Steven Reinwald; Sophie Blake; Richard E Pimentel; Brian L Tracy; Kaigang Li
Journal:  J Sports Sci       Date:  2019-01-18       Impact factor: 3.337

4.  Wearable Devices for Cardiac Rhythm Diagnosis and Management.

Authors:  James E Ip
Journal:  JAMA       Date:  2019-01-29       Impact factor: 56.272

5.  Mobile Photoplethysmographic Technology to Detect Atrial Fibrillation.

Authors:  Yutao Guo; Hao Wang; Hui Zhang; Tong Liu; Zhaoguang Liang; Yunlong Xia; Li Yan; Yunli Xing; Haili Shi; Shuyan Li; Yanxia Liu; Fan Liu; Mei Feng; Yundai Chen; Gregory Y H Lip
Journal:  J Am Coll Cardiol       Date:  2019-09-02       Impact factor: 24.094

6.  Current perspectives on wearable rhythm recordings for clinical decision-making: the wEHRAbles 2 survey.

Authors:  Martin Manninger; David Zweiker; Emma Svennberg; Sofia Chatzikyriakou; Nikola Pavlovic; Junaid A B Zaman; Bratislav Kircanski; Radoslaw Lenarczyk; Philippe Vanduynhoven; Jedrzej Kosiuk; Tatjana Potpara; David Duncker
Journal:  Europace       Date:  2021-07-18       Impact factor: 5.214

7.  Designing wearable computing devices for improved comfort and user acceptance.

Authors:  Huiju Park; Jie Pei; Mengyun Shi; Qinwen Xu; Jintu Fan
Journal:  Ergonomics       Date:  2019-09-03       Impact factor: 2.778

8.  Electrocardiogram signal quality measures for unsupervised telehealth environments.

Authors:  S J Redmond; Y Xie; D Chang; J Basilakis; N H Lovell
Journal:  Physiol Meas       Date:  2012-08-17       Impact factor: 2.833

9.  Effect of a Home-Based Wearable Continuous ECG Monitoring Patch on Detection of Undiagnosed Atrial Fibrillation: The mSToPS Randomized Clinical Trial.

Authors:  Steven R Steinhubl; Jill Waalen; Alison M Edwards; Lauren M Ariniello; Rajesh R Mehta; Gail S Ebner; Chureen Carter; Katie Baca-Motes; Elise Felicione; Troy Sarich; Eric J Topol
Journal:  JAMA       Date:  2018-07-10       Impact factor: 56.272

10.  Accuracy of Consumer Wearable Heart Rate Measurement During an Ecologically Valid 24-Hour Period: Intraindividual Validation Study.

Authors:  Benjamin W Nelson; Nicholas B Allen
Journal:  JMIR Mhealth Uhealth       Date:  2019-03-11       Impact factor: 4.773

