Danielle Mollie Stambler, Erin Feddema, Olivia Riggins, Kari Campeau, Lee-Ann Kastman Breuch, Molly M Kessler, Stephanie Misono.
Abstract
BACKGROUND: Web-based health interventions are increasingly common and are promising for patients with voice disorders because web-based participation does not require voice use. To address needs such as Health Insurance Portability and Accountability Act compliance, unique user access, the ability to send automated reminders, and a limited development budget, we used the Research Electronic Data Capture (REDCap) data management platform to deliver a patient-facing psychological intervention designed for patients with voice disorders. This was a novel use of REDCap.
Keywords: REDCap; eHealth; health; heuristics; online; patients; usability study; voice disorders; web-based intervention; web-based participation; web-based platform
Year: 2022 PMID: 35333191 PMCID: PMC8994149 DOI: 10.2196/26461
Source DB: PubMed Journal: JMIR Hum Factors ISSN: 2292-9495
Figure 1. Welcome page for the Voice Education Program intervention.
Figure 3. Voice tips page including embedded YouTube video.
Design parameters used in developing the REDCap intervention.

| Category and required design parameter | REDCap features^a |
| --- | --- |
| Intervention accessible through links sent by email. Reminder emails sent with the deadlines for finishing the module. | REDCap generated unique URLs for each participant when sent by |
| Participants have unique logins to access their information. Participants can save, leave, and return to the website. | Enabled Save & Return in the |
| Data must be kept in a secure, HIPAA^b-compliant database. | REDCap provided a secure web interface and included multiple features to support HIPAA compliance. Both the website and database were housed on secure servers maintained by the researchers' institution. |
| Educational material able to be delivered by text and videos. | Used |
| Self-led exercises present participants' prior responses for reflection and goal setting. | Previous responses were |
| The length of the intervention can be shortened if the participant's therapy start date is less than 3 weeks away. Module start and completion dates are otherwise used to enable future modules and trigger emails to send. | Therapy date was entered in the participant setup instrument. Start and completion dates were captured as validated date text variables using the |
| Intervention modules are disabled after a period of time to ensure that they are completed in order. | |
| Hyperlinks to voice and psychological health tips are embedded in the intervention check-ins. Menu and navigation have hyperlinks to resources including the study FAQ, program references, and supplemental mental health resources. | Hardcoded hyperlink embedded in a survey field connected to another REDCap project using the project's public link. Because a left-hand menu bar was not possible in REDCap and layout was limited to one center panel of text, the menu links were listed in a descriptive text variable at the bottom of each survey page. |
| The intervention can adapt to changing participant inputs throughout participation. | A project instrument was used for participant setup to enter dates used for |
| The intervention moves through modules in sequence. | Default use of the |
| Participants cannot go backwards and change answers, which is important for data integrity. | Instrument |
| Website collects date and time stamps for all responses. | REDCap captured and could report timestamps for survey start, completion, and all responses. |

^a REDCap-specific terms are in italics.
^b HIPAA: Health Insurance Portability and Accountability Act.
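The scheduling parameters above (shortening the program when therapy starts in under 3 weeks, and using module start and completion dates to gate later modules and trigger reminder emails) amount to plain date arithmetic. The sketch below illustrates that logic only; the function names, the one-week deadline, and the "opens the day after completion" rule are illustrative assumptions, not the study's actual REDCap configuration:

```python
from datetime import date, timedelta

def use_shortened_track(enrollment: date, therapy_start: date) -> bool:
    """Shorten the intervention when therapy starts less than 3 weeks after enrollment."""
    return therapy_start - enrollment < timedelta(weeks=3)

def next_module_opens(prev_module_completed: date) -> date:
    """A later module is enabled only after the previous one is completed;
    assumed here to open the day after completion."""
    return prev_module_completed + timedelta(days=1)

def reminder_due(module_start: date, today: date, deadline_weeks: int = 1) -> bool:
    """Send a reminder email once the module deadline has passed without completion
    (one-week deadline is an assumption for illustration)."""
    return today > module_start + timedelta(weeks=deadline_weeks)

enroll = date(2021, 3, 1)
print(use_shortened_track(enroll, date(2021, 3, 15)))  # True: therapy in 2 weeks
print(use_shortened_track(enroll, date(2021, 4, 15)))  # False: more than 3 weeks away
```

In REDCap itself, comparable behavior is achieved through survey settings and date-based conditional logic rather than code; the sketch only makes the decision rules explicit.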
Heuristic evaluation results.

| Heuristic | Rating | Strengths | Areas for improvement |
| --- | --- | --- | --- |
| 1. Visibility of system status | 3 | System showed current status effectively in main survey sections via page counts, color confirmation, and written confirmation | System status and future options were not as apparent in additional help sections |
| 2. Real-world match | 2 | Survey section numbering, sequence, and naming were logical and consistent with real-world conventions; survey questions follow conventions for type and format; embedded YouTube videos take advantage of a familiar feature and platform | Procedure for leaving and returning was not conventional or natural; "survey" terms in standardized research questionnaires did not match real-world conventions; additional section links are hard to find and function in unconventional, nonnatural ways |
| 3. User control | 2 | Reset function was an effective undo feature | "Emergency exits" were unclear in additional help sections |
| 4. Consistency | 2 | Main survey sections used consistent layout, functioning, color, and terms | Pop-up boxes, formatting, headers, and tone were inconsistent within and across pages |
| 5. Error prevention | 3 | Several effective error prevention features (eg, prohibiting leaving questions unanswered) | No prevention against accidentally closing the whole survey window without saving |
| 6. Recognition | 3 | Instructions for system use are readily available throughout; questionnaires and check-ins provide built-in references to past information; educational videos, FAQ^a, and additional resources are available at the bottom of each page | Contents and options are not centrally listed in the additional vocal health tips sections |
| 7. Flexibility | 3 | Font resize option is available; survey queue automatically accordions up as surveys are completed but still provides an option to view all | Menus with links to additional resources and vocal health tips sections cannot be hidden |
| 8. Aesthetic | 3 | Aesthetic is simple, neutral, and uncluttered | Some images in the additional vocal health tips sections are less relevant and therefore less impactful than they could be |
| 9. Error messaging | 3 | Error messaging is clear and provides both an explanation and a solution | None identified |
| 10. Help and documentation | 2 | FAQ and help email are readily available on the survey queue/home page; FAQ is available on all main survey pages; instructions for use are available throughout the module | Help is not searchable; no documentation for technical issues; no centralized overview of instructions, features, problems, and complex tasks |

^a FAQ: frequently asked question.
Demographic characteristics of study participants (N=7).

| Characteristic | Value |
| --- | --- |
| Age (years), mean (range) | 51 (30-71) |
| **Gender, n (%)** | |
| Male | 1 (14) |
| Female | 5 (71) |
| Gender nonconforming | 1 (14) |
| **Race, n (%)** | |
| White | 5 (71) |
| African American | 1 (14) |
| Asian American | 1 (14) |
| **Education, n (%)** | |
| Some college credits, no degree | 1 (14) |
| Bachelor's degree | 1 (14) |
| Graduate degree | 5 (71) |
| At least once a week | 2 (28) |
| At least once a month | 1 (14) |
| Less than once a month | 3 (43) |
| Decline to answer | 1 (14) |
Figure 4. Time-on-task, mean post-task ratings, and task completion rates.
Summary of usability issues grouped by topic.

| Category and issue | Participants encountering issue, n (%) | Overall frequency | Individual severity ratings | Overall assessment of severity |
| --- | --- | --- | --- | --- |
| Did not realize that the | 4 (57.1) | High | 1, 3, 3, 4 | Critical, high impact |
| Unclear how to enter first survey | 1 (14.3) | Low | 1 | Critical, high impact |
| Clicking the back button resulted in an error message | 2 (28.6) | Moderate | 1, 2 | Critical, high impact |
| Unclear how to return to the intervention after logging out | 1 (14.3) | Low | 2 | Noncritical, moderate impact |
| Small browser window caused text to wrap, which was difficult to read | 1 (14.3) | Low | 3 | Noncritical, low impact |
| Unable to locate health tips | 1 (14.3) | Low | 1 | Critical, high impact |
| Unsure if FAQ^a was the right place to look for help | 1 (14.3) | Low | 4 | Inconvenient, lowest impact |
| Identified a different page as the FAQ | 2 (28.6) | Moderate | 3, 4 | Noncritical, low impact |
| Would try to contact MyChart (a clinical system) for help | 1 (14.3) | Low | 1 | Critical, high impact |
| Text-heavy or wordy pages | 4 (57.1) | High | 4, 4, 4, 4 | Inconvenient, lowest impact |
| Discomfort with psychological questions | 5 (71.4) | High | 3, 3, 4, 4, 4 | Noncritical, low impact |

^a FAQ: frequently asked question.
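Across every row of the usability table, the overall assessment matches the most severe (lowest-numbered) individual rating, and the overall frequency label tracks the share of the 7 participants affected. The sketch below reproduces that apparent pattern; the frequency cutoffs and the rating-to-label mapping are inferred from the table, not rules stated by the authors:

```python
# Assumption: severity is rated 1 (most severe) to 4 (least severe),
# and the overall label is driven by the worst individual rating.

def overall_frequency(n_affected: int, n_total: int = 7) -> str:
    """Label issue frequency by the share of participants affected (inferred cutoffs)."""
    share = n_affected / n_total
    if share > 0.5:
        return "High"
    if share > 0.2:
        return "Moderate"
    return "Low"

def overall_assessment(ratings: list[int]) -> str:
    """Map the most severe (minimum) individual rating to the overall label."""
    labels = {
        1: "Critical, high impact",
        2: "Noncritical, moderate impact",
        3: "Noncritical, low impact",
        4: "Inconvenient, lowest impact",
    }
    return labels[min(ratings)]

# Example rows from the table
print(overall_frequency(4))                 # High (4/7 = 57.1%)
print(overall_assessment([1, 3, 3, 4]))     # Critical, high impact
print(overall_assessment([3, 3, 4, 4, 4]))  # Noncritical, low impact
```

Reading the table this way, a single severe rating (eg, one participant unable to locate the health tips) is enough to flag an issue as critical even when its frequency is low.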
Adaptations in REDCap to address heuristic and usability findings.

| Challenges with REDCap^a intervention delivery | Sample adaptation to enhance usability |
| --- | --- |
| Heuristic analysis recommended better distribution of white space by moving information to the footer, header, or side menus where possible. | Participant resources links and survey page instructions were moved to the Survey Footer to separate them from module-related text. |
| Participants recommended more branding visibility, as they appreciated the project's affiliation with their clinic. | Aesthetics were constrained by limited options for where logos could be added; combined logos were created so that multiple entities could be represented. |
| Participants found page titles confusing and recommended clearer instructions and wording about the intervention and module titles. | Headers and text were revised and simplified to clarify instructions about the study and intervention and to make the intervention consistently identifiable on each page. |
| No independent home page functionality besides using the | A site map was not possible within REDCap, so study status graphics were added to the first and last page of each module to show the participant's progression through the intervention. |
| Participants struggled to tell how far along they were in the program, as the survey queue did not show what was forthcoming when using | Page numbers were added to show progression through each module. |
| Participants found that saving and returning using the randomly generated code for re-entry was nonintuitive and easy to miss when leaving a survey, making returning to the intervention difficult. | The |
| Participants experienced difficulty returning to REDCap intervention pages after clicking a hyperlink, owing to the lack of ability to link back to other instruments within a survey. | The number of embedded hyperlinks was minimized. Where hyperlinks were unavoidable, instructions were added (eg, how to navigate back to the next part of the intervention from the patient resources webpage). |
| REDCap's participant-facing interface was the survey format, and participants struggled with hardcoded survey labels and buttons such as | Instructions were revised to say "survey" instead of "assessment" or "questionnaire." |
| Using the | Visible use of |
| To advance, participants needed to click the | The number of instruments per module was reduced to limit the number of |

^a REDCap-specific terms are in italics.
^b FAQ: frequently asked question.