
Benefit of focus group discussion beyond online survey in course evaluations by medical students in the United States: A qualitative study.

Katharina Brandl, Soniya V Rabadia, Alexander Chang, Jess Mandel.

Abstract

In addition to online questionnaires, many medical schools use supplemental evaluation tools such as focus groups to evaluate their courses. Although some benefits of using focus groups in program evaluation have been described, it is unknown whether these in-person data collection methods provide sufficient additional information beyond online evaluations to justify them. In this study, we analyzed recommendations gathered from student evaluation team (SET) focus group meetings and determined whether these items were also captured in open-ended comments within the online evaluations. Our results indicate that online evaluations captured only 49% of the recommendations identified via SETs. Surveys of course directors showed that 74% of the recommendations identified exclusively via SETs were implemented within their courses. Our results indicate that SET meetings can provide information not easily captured in online evaluations and that these recommendations result in actual course changes.


Year:  2018        PMID: 30321914      PMCID: PMC6249141          DOI: 10.3352/jeehp.2018.15.25

Source DB:  PubMed          Journal:  J Educ Eval Health Prof        ISSN: 1975-5937


The evaluation of medical school courses requires a range of methods to gain a sufficiently comprehensive view of the program [1,2]. Most medical schools use quantitative methods in the form of closed-ended rating scales with 1 or 2 opportunities for open-ended comments [3]. These quantitative methods are simple in design, easy to administer, and useful for obtaining information from a large number of students. However, the scope of online evaluations is limited, and several studies have reported that students fill out these evaluations mindlessly [4]. As a consequence, some medical schools have implemented qualitative data collection methods such as focus groups to supplement online course evaluations and to ‘tell the story’ behind closed-ended rating scales [5,6]. Focus groups provide space for clarifying questions and allow face-to-face dialogue between students and faculty. In addition, focus groups can encourage student interactions that reveal issues not addressed in online evaluations and promote discussion of practical solutions. However, organizing, conducting, and analyzing data from focus groups requires significant resources, and it is unclear whether these in-person qualitative methods provide sufficient additional information beyond online evaluations to warrant the investment. Furthermore, any evaluation system must be judged on whether the results collected actually lead to curricular changes.

The University of California, San Diego (UCSD) School of Medicine has implemented student evaluation team (SET) focus group meetings, in addition to online questionnaires, for the evaluation of preclerkship courses [5]. In this study, we analyzed the recommendations gathered in SET meetings and compared them to the information captured in the open-ended comments of online evaluations. We then determined whether recommendations from SET meetings resulted in actual course changes (Fig. 1).
Fig. 1.

Flowchart of study design. SET, student evaluation team.

SET meetings were scheduled after each of the preclerkship core courses. The course director, academic deans, and approximately 16 randomly selected students who had recently completed the course participated. The Assistant Dean for Educational Development and Evaluation (Doctor of Philosophy in Psychology and Master in Health Profession Education), who is not involved in the coursework, facilitated these meetings. In the meetings, students considered the course as a whole and commented on “what worked well in this course and what didn’t” [5]. Notes taken by 2 second-year medical students (S.V.R. and A.C.) from 9 SET meetings for second-year medical student courses (academic year 2015–2016) were analyzed. SET meetings were scheduled on 9/25/15 for course 1, 10/12/15 for course 2, 10/23/15 for course 3, 11/2/15 for course 4, 11/30/15 for course 5, 1/4/16 for course 6, 2/19/16 for course 7, 3/7/16 for course 8, and 3/18/16 for course 9, and each lasted 1 hour. Feedback that included potential solutions was identified using a grounded theory-based approach and coded into the following 7 categories: issues related to specific teaching modalities used in courses, the overall course content, specific lectures (content and organization), sequencing of course events, administrative course components, exams, and study materials.

Open-ended comments from online questionnaires were analyzed for the same 9 preclerkship courses. In these online questionnaires, a 20-item Likert-style survey was followed by a request for comments related to the course. The survey was administered after the end of each course, and 714 deidentified responses from second-year medical students were collected; the overall response rate was 66%. A total of 293 comments from the online questionnaires of the 9 preclerkship courses were analyzed, and online comments corresponding to SET meeting comments were identified.
During the following year (2016–2017), surveys were sent to each course director (n= 9) as their course began. These surveys asked the course director whether they had implemented the suggested changes in their course. Course directors responded to each of the recommendations with “yes,” “somewhat,” or “no.” For the quantitative analysis of course directors’ responses, a response of “yes” for a specific recommendation was considered as 100% implemented. A response of “somewhat” was considered as 50% implemented, and a response of “no” as 0% implemented. Surveys were completed on 9/12/16 for course 1, 10/9/16 for course 2, 3/13/17 for course 3, 11/7/16 for course 4, 11/18/16 for course 5, 12/6/16 for course 6, 1/27/17 for course 7, 2/27/17 for course 8, and 2/27/17 for course 9. Raw data are available from Supplement 1.
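The yes/somewhat/no weighting described above amounts to a simple mean of per-recommendation scores. As a minimal sketch (the function name and the use of Python are illustrative, not part of the study; the response counts of 41 "yes," 21 "somewhat," and 7 "no" are taken from Table 1's footnotes):

```python
# Scoring scheme from the text: "yes" = 100% implemented,
# "somewhat" = 50% implemented, "no" = 0% implemented.
WEIGHTS = {"yes": 1.0, "somewhat": 0.5, "no": 0.0}

def implementation_rate(responses):
    """Mean implementation score across course-director responses."""
    return sum(WEIGHTS[r] for r in responses) / len(responses)

# Overall response counts reported in Table 1's footnote:
# 41 "yes", 21 "somewhat", 7 "no" (69 recommendations in total).
responses = ["yes"] * 41 + ["somewhat"] * 21 + ["no"] * 7
print(f"{implementation_rate(responses):.1%}")  # prints "74.6%"
```

Under this weighting, the overall implementation score across all 69 recommendations comes out near the 74% figure the study reports for the SET-exclusive subset (26 of 35).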

Ethics statement

The UCSD Institutional Review Board designated this study as an EBP/QI/QA (evidence-based practice, quality improvement, and quality assurance) project and therefore did not require full review (IRB approval no., 151319QI).

Analysis of the SET meeting notes yielded 69 suggested course improvements that included potential solutions, which were coded into the 7 categories listed earlier (Table 1). Of the 69 issues identified via SETs, online evaluations captured 34 (49%). SETs were particularly superior in capturing feedback regarding specific teaching modalities used in courses (only 18% of these items appeared in online evaluations), problems related to the overall course content (25%), and lecture content and organization (25%). In contrast, online evaluations captured most of the deficiencies in study materials (80%), administrative course components (67%), exam-related problems (63%), and sequencing of course events (58%).
Table 1.

Actionable recommendations (n=69) identified by SET input

Course | Actionable recommendations identified in SETs (by category) | Implementation[a)] | Captured in online evaluations[b)]

Study materials
1 | Implementing an additional practice quiz in a specific block. | Yes | No
1 | Framing practice questions within clinical scenarios. | Yes | Yes
1 | Improving the similarity of practice quiz questions and final exam questions. | Yes | Yes
3 | Organizing practice quizzes by week regardless of the subject matter. | Yes | Yes
3 | Including detailed explanations for the […] question on the practice quiz. | Yes | Yes
4 | Increasing the number of practice quiz questions to ≥ 25. | Some | No
4 | Providing practice quizzes as PDFs as well as on the online exam software. | No | Yes
4 | Specifying the relevant lecture within each practice quiz question explanation. | Some | Yes
6 | Increasing the resolution of specific images on the practice quizzes. | Yes | Yes
7 | Adding more practice questions to a specific course syllabus. | Yes | Yes

Sequencing of events/coordinating events
1 | Adjusting the scheduling of a patient presentation that includes difficult psychosocial interactions and sensitive topics to allow adequate time for reflection. | No | No
1 | Adjusting the schedule to provide any integrative reviews of a complex topic immediately after that topic’s presentation in lecture, rather than at the end of the course. | Some | Yes
2 | Adjusting the schedule to provide a full day for processing between a complex topic presented in lecture and small group exercises on the same topic. | Yes | Yes
3 | Improving coordination between the physical exam in the practice of medicine course and the corresponding material in the relevant block. | Yes | No
3 | Ensuring that diseases presented in clinical skills sessions are described in lectures prior to those sessions. | Some | No
3 | Adjusting the schedule to prevent the placement of a flipped lecture onto a lecture-heavy day, especially if the following day includes TBL on that material. | Some | Yes
4 | Improving the scheduling of the laboratory sessions. | Yes | No
5 | Adjusting the schedule to avoid 4-hour lecture blocks the day before the final exam. | Yes | Yes
6 | Ensuring that conditions presented in case study sessions are described in lectures prior to those sessions. | Yes | Yes
6 | Adjusting the lecture schedule to avoid presenting particularly complex and difficult concepts immediately before the final exam. | Yes | Yes
8 | Scheduling all pathology lectures after the relevant pathophysiology lectures. | Some | No
8 | Adjusting the lecture schedule to reflect the topic sequence within the recommended textbook of the course. | Some | Yes

Course (administrative component)
1 | Posting lecture notes in advance so that students can download them in time. | Yes | No
1 | Clarifying the learning objectives on […]. | Yes | Yes
1 | Improving the clarity of acronyms used in lectures and proofreading lecture notes for typos. | Some | Yes
1 | Clarifying specific details on the drug list posted on the course website. | Some | Yes
2 | Encouraging lecturers to keep lectures to 50 minutes. | Yes | Yes
4 | Improving the correlation of the drug list with the lecture material. | Yes | Yes
6 | Clarifying the relative importance of textbook material early in the course. | Yes | Yes
6 | Requiring lecturers to finish within their allotted time. | Yes | Yes
7 | Adding space on the small group handout to allow annotations. | Yes | No
7 | Posting the drug list for […] on the course website. | Yes | No
8 | Directing reports of negative performance to students rather than including deans and other administrators. | Yes | No
8 | Providing clear expectations on textbook reading (supplemental versus required). | Some | Yes

Course content
1 | Placing more emphasis on the importance of high-yield facts during the introductory lecture. | Yes | No
1 | Adding an introductory lecture in a specific block. | Some | No
2 | Adding definitions of […] to the lecture slides. | Yes | No
2 | Including an integrative overview figure (favorite figure) to guide students in differentiating between different tests. | Yes | No
2 | Adding organization to lectures and starting with an important overview before adding details. | Some | Yes
2 | Adding specific sessions within this course on how to write research papers. | Some | Yes
3 | Adding the presentation of drugs to the […] lecture. | Some | No
4 | Adding the discussion of specific diseases […] that are important for USMLE step 1. | Some | No

Exams
2 | Providing calculators on computerized exams other than the calculator embedded in the exam software. | Yes | No
3 | Adding more images to a specific portion of the exam. | Yes | Yes
3 | Adding explanations for the […] portion in the exam review session. | Yes | Yes
4 | Providing images with higher resolution on the computerized exam. | Some | No
4 | Matching the difficulty of practice questions with actual exam questions. | Yes | Yes
8 | Modifying the questions to reflect USMLE guidelines for multiple-choice questions and increasing the use of clinical vignettes in question stems. | Yes | Yes
8 | Ensuring that images used on the computerized exam are high-resolution. | Some | Yes
9 | Ensuring that each exam question requiring multiple lab values or a common clinical vignette provides the needed information. | Yes | No

Lecture organization/content (specific lectures)
1 | Increasing the emphasis on general concepts rather than small details in the […] lecture. | Yes | No
1 | Matching the […] lecture content with the learning objectives. | Some | No
3 | Adding more opportunities for interactive engagement and expanding on the pathophysiology in the […] lecture. | Yes | No
3 | Improving the organization of the […] lecture. | Some | No
3 | Eliminating duplicative material from the […] lecture. | No | Yes
4 | Reducing the number of slides in the […] lecture. | Yes | No
5 | Reducing the research background in the […] lecture and increasing its clinical relevance. | Yes | No
8 | Improving the organization of the […] lecture. | Yes | Yes

Specific teaching modalities
1 | Providing PowerPoint summary slides for the small groups to minimize the impact of facilitator variability between the groups. | No | No
2 | Assigning a specific time at the beginning of the small group exercise for students to review the paper. | No | Yes
3 | For in-class problem-solving sessions, posting detailed answers immediately after class. | Yes | No
3 | Increasing the interactive component of the […] sessions. | Yes | No
4 | Changing teaching modalities (replacing small group activities with TBL sessions). | Some | Yes
5 | Eliminating slides that only show pathologic tissues, and instead providing slides with additional information. | No | No
6 | For case study problems, ensuring that each problem is formatted so it can later be used as a practice question, with the answer on the following slide. | Yes | No
7 | Improving facilitator training to ensure that proper etiquette is enforced in all small group sessions. | Yes | No
7 | Improving facilitator training to ensure that each facilitator provides an adequate overview of the disorder and associated pharmacology. | Yes | No
8 | Improving the case vignettes during the laboratory sessions. | Some | No
9 | Adding detailed explanations to the small group. | No | No

SET, student evaluation team; TBL, team-based learning; USMLE, United States Medical Licensing Examination.

a) Course directors reported whether each suggested change was fully implemented (yes, n=41), somewhat implemented (some, n=21), or not implemented (no, n=7).

b) Actionable recommendations were either captured (yes, n=34) or not captured (no, n=35) by open-ended comments from online evaluations.
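The per-category capture percentages quoted in the text can be reproduced by tallying the "Captured in online evaluations" column of Table 1. A minimal sketch (the use of Python and the abbreviated category labels are illustrative; the counts are transcribed from the table):

```python
# (captured by online evaluations, total recommendations) per category,
# tallied from Table 1's "Captured in online evaluations" column.
capture_counts = {
    "Study materials": (8, 10),
    "Sequencing/coordinating events": (7, 12),
    "Course (administrative component)": (8, 12),
    "Course content": (2, 8),
    "Exams": (5, 8),
    "Specific lectures": (2, 8),
    "Specific teaching modalities": (2, 11),
}

for category, (captured, total) in capture_counts.items():
    print(f"{category}: {captured}/{total} = {captured / total:.1%}")

captured_all = sum(c for c, _ in capture_counts.values())
total_all = sum(t for _, t in capture_counts.values())
print(f"Overall: {captured_all}/{total_all} = {captured_all / total_all:.1%}")
# Overall: 34/69, approximately 49%
```

The tally recovers the overall capture rate of 34 of 69 recommendations (49%) and shows the per-category spread, from 18% for specific teaching modalities up to 80% for study materials.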

Survey data from the course directors indicated that 74% of the recommendations captured exclusively in SETs (that is, not captured in online evaluations) translated into course changes (26 of 35). Table 1 lists all suggested improvements and indicates whether each item was implemented by the course director and whether it was captured in the open-ended comments from the online evaluations.

Evaluation is an integral part of medical education, and many tools are available to comprehensively characterize a program. One major purpose of collecting evaluations is to guide instructional improvement. Our analysis revealed that 74% of the SET-identified actionable items translated into course changes, implying that the focus groups served as a catalyst for discrete course adjustments. Studies have suggested that written comments may provide useful information that goes beyond that of the numerical ratings generated by closed-ended Likert-style questionnaires [7]. However, 2 major problems are associated with open-ended comments. First, interpreting students’ comments is not an easy task, and there are no opportunities for asking clarifying questions. Second, open-ended comments often lack specificity and contextual factors [7]. Implementing a focus group as part of the evaluation process addresses both shortcomings. SET meetings facilitate negotiation, listening, and responding. Recommendations suggested by students in these meetings are discussed with faculty and deans in a collaborative dialogue, so students can explain proposed solutions and avoid confusion or misjudgments on the part of faculty. In contrast to online open-ended comments, SET evaluations were rich in specific suggestions for improvement and often included contextual factors. Most importantly, our results indicate that suggestions identified in the SET meetings met the gold standard for evaluation comments: they actually led to course changes. The ‘give-and-take’ among multiple stakeholders in a course can best facilitate this process.
No single evaluation tool will capture all potentially useful feedback. The choice of an evaluation model should no longer be a treasure hunt for the one perfect tool; it should be viewed as an ‘all of the above’ approach rather than a ‘best single answer’ choice. Our data indicate that open-ended focus groups can provide rich, solution-based feedback, making them a worthwhile addition to the evaluation toolbox.
References

1.  A curious case of the phantom professor: mindless teaching evaluations by medical students.

Authors:  Sebastian Uijtdehaage; Christopher O'Neal
Journal:  Med Educ       Date:  2015-09       Impact factor: 6.251

2.  The Student Curriculum Review Team: How we catalyze curricular changes through a student-centered approach.

Authors:  Katie W Hsih; Mark S Iscoe; Joshua R Lupton; Tyler E Mains; Suresh K Nayar; Megan S Orlando; Aaron S Parzuchowski; Mark F Sabbagh; John C Schulz; Kevin Shenderov; Daren J Simkin; Sharif Vakili; Judith B Vick; Tim Xu; Ophelia Yin; Harry R Goldberg
Journal:  Med Teach       Date:  2014-12-23       Impact factor: 3.650

3.  Student evaluation team focus groups increase students' satisfaction with the overall course evaluation process.

Authors:  Katharina Brandl; Jess Mandel; Babbi Winegarden
Journal:  Med Educ       Date:  2016-12-05       Impact factor: 6.251

4.  What else is happening? A more holistic view of programme evaluation.

Authors:  Katharina Brandl; Jess Mandel
Journal:  Med Educ       Date:  2018-04       Impact factor: 6.251

5.  Changing medical students' perception of the evaluation culture: Is it possible?

Authors:  Jorie M Colbert-Getz; Steven Baumann
Journal:  J Educ Eval Health Prof       Date:  2016-02-15
