Meshari F Alwashmi, John Hawboldt, Erin Davis, Michael D Fetters.
Abstract
Although patients express interest in using mobile health (mHealth) interventions to manage their health and chronic conditions, many current mHealth interventions are difficult to use. Usability testing is critical to the success of novel mHealth interventions. Researchers recognize the utility of combining qualitative and quantitative approaches for usability testing, but many mHealth researchers lack awareness of integration approaches from advances in mixed methods research that can add value to mHealth technology. Because efficient usability testing proceeds iteratively, we introduce a novel mixed methods design developed specifically for mHealth researchers. The iterative convergent mixed methods design involves simultaneous qualitative and quantitative data collection and analysis that continues cyclically through multiple rounds of mixed methods data collection and analysis until the mHealth technology under evaluation works to the satisfaction of the researcher. Across the cyclical iterations, early development is more qualitatively driven and progressively becomes more quantitatively driven. Using this design, mHealth researchers can leverage mixed methods integration procedures in the research question, data collection, data analysis, interpretation, and dissemination dimensions. This study demonstrates how the iterative convergent mixed methods design provides a novel framework for generating unique insights into the multifaceted phenomena affecting mHealth usability. Understanding these practices can help developers and researchers leverage the strengths of an integrated mixed methods design.
©Meshari F Alwashmi, John Hawboldt, Erin Davis, Michael D Fetters. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 26.04.2019.
Keywords: eHealth; mHealth; methods; mixed-methods; usability
Year: 2019 PMID: 31025951 PMCID: PMC6658163 DOI: 10.2196/11656
Source DB: PubMed Journal: JMIR Mhealth Uhealth ISSN: 2291-5222 Impact factor: 4.773
Figure 1. Human-centered design activity phases (ISO, 2010).
Usability constructs and descriptions.
| Constructs^a | Metrics | Description |
| --- | --- | --- |
| Effectiveness | Time to learn and use | Time to read the scenarios and to begin performing tasks |
| | Data entry time | Time to enter the data necessary for the execution of a task |
| | Tasks time | Time to accomplish given tasks |
| | Response time | Time to receive the response to the requested information |
| | Time to install | Time to install the application or its updates |
| Efficiency | Number of errors | Number of errors made while reading scenarios and during task execution |
| | Completion rate | Percentage of participants who correctly complete and achieve the goal of each task |
| Satisfaction | Usability score | The System Usability Questionnaire |
^a Adapted from Moumane et al [40].
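The satisfaction and efficiency metrics in the table above are simple to operationalize in analysis code. The sketch below is illustrative only: it assumes the "usability score" refers to the standard 10-item System Usability Scale (SUS) with 1-5 Likert responses, and the sample responses are invented.

```python
# Illustrative sketch of two usability metrics from the table above.
# Assumption: the usability score follows standard SUS scoring
# (10 items answered 1-5; odd items contribute response - 1,
# even items contribute 5 - response; the sum is scaled by 2.5).

def sus_score(responses):
    """Score one participant's 10 SUS responses on the 0-100 scale."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i = 0 is item 1 (odd-numbered)
        for i, r in enumerate(responses)
    )
    return total * 2.5

def completion_rate(task_outcomes):
    """Percentage of participants who correctly completed the task."""
    return 100.0 * sum(task_outcomes) / len(task_outcomes)

print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))   # 85.0
print(completion_rate([True, True, False, True]))  # 75.0
```

Computing these per iteration lets the quantitative strand track whether usability improves across testing rounds.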
Relevant dimensions of the mixed methods research integration.
| Integration dimensions^a | Mixed methods researchers integrate by |
| --- | --- |
| Rationale dimension | Citing a rationale for conducting an integrated mixed methods research study (eg, offsetting strengths and weaknesses, comparing, complementing or expanding, developing or building, and promoting social justice) |
| Study purpose, aims, and research questions dimension | Composing an overarching mixed methods research purpose and stating qualitative, quantitative, and mixed methods aims, or multiple mixed methods aims with quantitative aims and qualitative questions |
| Research design dimension | Scaffolding the work in core (eg, convergent, exploratory sequential, and explanatory sequential), advanced (eg, intervention, case study, evaluation, and participatory), or emergent designs |
| Sampling dimension | Sampling through the type, through the relationship of the sources of the qualitative and quantitative data (eg, identical sample, nested sample, separate samples, and multilevel samples), and through the timing (eg, same or different periods for collection of the qualitative and quantitative data) |
| Data collection dimension | Collecting both types of data with an intent relative to the mixed methods research procedures (eg, comparing, matching, diffracting, expanding, constructing a case, connecting, building, generating and validating a model, or embedding) |
| Data analysis dimension | Analyzing both types of data using intramethod analytics (eg, analyzing each type of data within the respective qualitative and quantitative methods and core integration analytics), using 1 or more core mixed methods analysis approaches (eg, following a thread, spiraling, and back-and-forth exchanges), or employing advanced mixed methods analyses (eg, qualitative-to-quantitative data transformation, quantitative-to-qualitative data transformation, creating joint displays, social network analysis, qualitative comparative analysis, repertory grid and other scale development techniques, geographic information systems mapping techniques, and iterative and longitudinal queries of the data) |
| Interpretation dimension | Interpreting the meaning of mixed findings (eg, relating the data and drawing metainferences or conclusions based on interpreting the qualitative and quantitative findings) and examining the fit of the 2 types of data (eg, confirmation, complementarity, expansion, or discordance); when the results conflict with each other, using procedures for handling discordance, including reconciliation, initiation, bracketing, and exclusion |
^a Adapted from Fetters and Molina-Azorin [7].
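One of the advanced analytics named in the data analysis dimension, qualitative-to-quantitative data transformation (quantitizing), can be sketched concretely. The codes and coded segments below are hypothetical; in a real study they would come from a codebook applied to interview transcripts.

```python
# Illustrative sketch of qualitative-to-quantitative data transformation
# (quantitizing): converting coded interview segments into per-participant
# code counts that can be compared against quantitative usability metrics.
# Participant IDs, codes, and segments are hypothetical.
from collections import Counter

coded_segments = {
    "P01": ["navigation_difficulty", "liked_reminders", "navigation_difficulty"],
    "P02": ["liked_reminders", "data_entry_slow"],
    "P03": ["navigation_difficulty", "data_entry_slow", "data_entry_slow"],
}

def transform_to_counts(coded):
    """Quantitize: map each participant to a Counter of code frequencies."""
    return {pid: Counter(codes) for pid, codes in coded.items()}

counts = transform_to_counts(coded_segments)
# The counts can now be merged with the quantitative strand, eg, testing
# whether "navigation_difficulty" frequency tracks longer task times.
print(counts["P01"]["navigation_difficulty"])  # 2
```

This kind of transformation is what lets the two strands be analyzed side by side within each iteration of the design.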
Figure 2. Evolution in an iterative convergent mixed methods design from qualitatively driven to quantitatively driven.
Figure 3. The iterative convergent mixed methods research design.
Matching of each construct’s quantitative variables and qualitative questions in a joint display depicting mixed methods data collection.
| Construct | Quantitative variables | Qualitative questions |
| --- | --- | --- |
| Effectiveness | Time to learn and use | How did you learn to use the app? How can we reduce the time it takes to learn the app? What was your experience using the app? How can we reduce the time it takes to use the app? |
| | Data entry time | How can we reduce the time it takes to enter the data? |
| | Tasks time | How can we reduce the time it takes to complete the task? |
| | Response time | How do you feel about the app response time? |
| | Time to install | What are your thoughts about the time it took to install the app? The time it took to pair the medical device, if applicable? |
| Efficiency | Number of errors | What can we do to help users avoid the same error? |
| | Completion rate | What can we do to enhance the completion rate? |
| Satisfaction | Usability score | How often would you use the app? Why or why not? How do you feel about the complexity of the app? How can we simplify it? Do you have any recommendations to make the wording and interface easier to use? Would you need the support of a technical person to be able to use this system? How would you contact them: phone, email, or messaging? How did you find the integration of the various functions in this app? How can we make it better? How did you feel about the consistency of the app? Did you have any trouble when using the app? Where? How can we fix it? Did you feel confident when using the app? How can we make you more confident? Did the app capture issues of importance to you? Are there other ways to gather similar information? |
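The pairing in the table above — each quantitative variable keyed to its qualitative questions by construct — maps directly onto a data structure for assembling a joint display during analysis. The sketch below is illustrative: the variable values and participant responses are invented.

```python
# Illustrative sketch: assembling joint-display rows that pair each
# construct's quantitative variable with its qualitative responses,
# mirroring the matching in the table above. All values are invented.

quantitative = {
    ("Effectiveness", "Tasks time"): {"mean_seconds": 48.2},
    ("Efficiency", "Completion rate"): {"percent": 82.0},
}
qualitative = {
    ("Effectiveness", "Tasks time"): [
        "The menu labels slowed me down on the second task.",
    ],
    ("Efficiency", "Completion rate"): [
        "I gave up because the save button was hidden.",
    ],
}

def build_joint_display(quant, qual):
    """Merge the two strands row by row, keyed on (construct, variable)."""
    rows = []
    for construct, variable in quant:
        rows.append({
            "construct": construct,
            "variable": variable,
            "quantitative": quant[(construct, variable)],
            "qualitative": qual.get((construct, variable), []),
        })
    return rows

display = build_joint_display(quantitative, qualitative)
for row in display:
    print(row["construct"], row["variable"], row["quantitative"])
```

Rebuilding this structure at the end of each testing round makes the side-by-side comparison of findings repeatable across iterations.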
A joint display adapted from Kron et al’s MPathic-VR mixed methods trial comparing a virtual human simulation and a computer-based communications module that illustrates medical students’ attitudes and experiences in both trial arms.
| Domains | MPathic-VR: attitudinal item, mean (SD) | MPathic-VR: qualitative reflection, illustrative quotes | Computer-based learning: attitudinal item, mean (SD) | Computer-based learning: qualitative reflection, illustrative quotes | Interpretation of mixed methods findings |
| --- | --- | --- | --- | --- | --- |
| Verbal communication | 5.02 (1.62) | “How to introduce myself without making assumptions about the cultural background of the patient and the family” | 3.89 (1.67) | “This educational module was useful for clarifying the use of SBAR and addressing ways that all members of a health care team can improve patient care through better communication skills” | Intervention arm comments suggest deeper understanding of the content than teaching using memorization and mnemonics as in the control, a difference confirmed by higher attitudinal scores |
| Nonverbal communication | 4.11 (1.85) | “Effective communication involves non-verbal facial expression like smiling and head nodding” | 2.77 (1.45) | None | Intervention arm comments address the value of learning nonverbal communication, the difference confirmed by attitudinal scores |
| Training was engaging | 5.43 (1.55) | “Reviewing the video review was a great way to see my facial expressions and it allowed me to improve on these skills the second time around” | 3.69 (1.62) | “This experience can be improved by incorporating more active participation. For example, there could have been a scenario in which we would have to select the appropriate hand-off information per SBAR guideline” | Intervention arm comments reflect engagement through the after-action review, whereas the control comments suggested the need for interaction, the difference confirmed by higher attitudinal scores |
| Effectiveness in learning to handle emotionally charged situations | 5.13 (1.48) | “I tend to try to smile more often than not in emotionally charged situations and that may result in conveying the wrong message” | 2.34 (1.35) | “I anticipate that high-stress situations where time is exceedingly crucial requires modification to the methods presented.” | Intervention arm comments indicate awareness of communication in emotionally charged situations, yet control comments indicate the need for additional training, a difference confirmed in attitudinal scores |