| Literature DB >> 35979254 |
Suzanne Nobrega1, Mazen El Ghaziri1, Lauren Giacobbe1, Serena Rice1, Laura Punnett1, Kasper Edwards2.
Abstract
Focus groups are often used for qualitative investigations. We adapted a published focus group method for evaluating the impact of an organizational intervention to virtual delivery using video conferencing. The method entailed convening small groups of three to five participants for a 2-hour facilitated workshop. We delivered the virtual workshops, adding qualitative evaluation with researchers and participants, to assess the effectiveness of the protocol. We address the questions of how to structure the data collection procedures; whether virtual delivery permits cross-participant interaction about a studied intervention; and how easy and comfortable the experience was for participants. Participants were university faculty members who were the focus of an institutional diversity program. The results indicated that the virtually delivered focus group workshop could be successfully implemented with strong fidelity to the original protocol and achieve the workshop goals. The workshops generated rich data about the impacts of the institutional program, as well as other events and conditions in the working environment that were relevant to consider alongside the observed program outcomes. A well-planned virtual focus group protocol is a valuable tool for engaging intervention stakeholders in research and evaluation from a distance. Video conferencing is especially useful during the current COVID-19 pandemic, but also whenever geography separates researchers and evaluators from program stakeholders. Careful planning of privacy measures for a secure online environment and of procedures for structured facilitation of group dialogue is critical for success, as in any focus group. This article addresses a gap in the literature on the feasibility and methodology of using video conference technology for qualitative data collection with groups.
Keywords: case study; effect modifiers; focus groups; methods in qualitative inquiry; narrative analysis; online interviews; organizational context; organizational setting; program evaluation; qualitative evaluation; virtual environments
Year: 2021 PMID: 35979254 PMCID: PMC9380589 DOI: 10.1177/16094069211019896
Source DB: PubMed Journal: Int J Qual Methods ISSN: 1609-4069
Figure 1.Chronological timeline of participant-reported events on a whiteboard.
Figure 2.Video conference adaptation procedure for EMA focus groups.
Figure 3.Chronological timeline of participant-reported events on cloud-based spreadsheet, with two examples of “event-notes.”
Virtual Workshop Process Evaluation Questions.
| Evaluation Questions Asked of Participants Following Each Workshop |
|---|
| • How comfortable were you participating in a focus group online with respect to privacy and security? |
| • How easy or difficult did you find the discussion with respect to interacting with the facilitator and other participants? |
| • Which specific aspects of the facilitation, if any, made it easy or difficult to respond to the questions and share your ideas? |
| • If you had an opportunity to choose now between online and in-person focus group, which would you choose and why? |
| • Do you have suggestions about how to improve the participant experience? |
Participant Demographics in Three EMA Workshops.
| EMA Workshop | Years Employed at University, Mean (Range) | Tenured N (%) | Female N (%) | Total N (%) |
|---|---|---|---|---|
| 1 | 2 (1–5) | 0 (0%) | 2 (50%) | 4 (29%) |
| 2 | 12 (2–25) | 3 (60%) | 1 (20%) | 5 (36%) |
| 3 | 9 (2–19) | 1 (20%) | 4 (80%) | 5 (36%) |
| Total (Mean) | 8 (1–25) | 4 (29%) | 7 (50%) | 14 |
Virtual Adaptations of the EMA Focus Group Protocol.
| EMA In-Person | EMA Virtual | Reason for Change |
|---|---|---|
| Physical meeting space | Virtual meeting space using Zoom software | The global pandemic forced social distancing, requiring a move to a virtual platform. |
| Timeline drawn on a whiteboard; question prompts on a flip chart. Supplies: markers, large Post-it note cards | Timeline created in a cloud-based spreadsheet; question prompts on slides. Equipment: computer, speaker, microphone | An on-screen spreadsheet allowed researchers to plot responses along a timeline in an organized fashion in real time while sharing the screen. |
| Discussed informed consent in the session. | Sent advance email about informed consent and supplies. | The pilot test indicated the need to send advance information about logistics, supplies needed (paper and pencil), and privacy protection procedures. |
| One facilitator processed pre-written notecard responses; one observer recorded notes on group interactions. | Two co-facilitators processed responses (see below); one observer monitored chat and time and took notes on group interactions. | Allowed one co-facilitator to focus on participant interactions while the other focused on accurately typing responses into the cloud-based spreadsheet. |
| Question prompts and scoring instructions given verbally or on a flip chart. | Question prompts and scoring instructions given verbally and presented in a slide show. | Slides made it easier for participants to recall and comprehend the instructions; participants in the pilot phase had difficulty retaining instructions delivered verbally only. |
| Facilitator invited responses in no particular order. | Facilitator invited responses in the order participants appeared on screen. | Streamlined the processing of responses to progress efficiently and minimized participants talking over one another. |
| Collected short written responses on large Post-it notecards, numbering each one, then placing it on the whiteboard timeline. | Collected two- to five-word responses through the chat feature, then typed them, numbered, into the cloud-based spreadsheet. | Mimicked the EMA protocol of processing events one at a time, but responses had to be very short to fit the spreadsheet format. |
| Notecards with repeated responses were numbered and placed atop the initial response card. | Repeated responses were assigned numbers, which were typed into the spreadsheet cell with the initial response. | Allowed visualization and quantification of all events on the timeline in an orderly manner. |
| Scores were added to the large Post-it notes. | Scores were typed next to responses in cells of the cloud-based spreadsheet. | Allowed visualization of the scores for participant discussion. |