| Literature DB >> 32226677 |
Kimberly Pedram1, Michelle N Brooks1, Carolyn Marcelo1, Nargiza Kurbanova1, Laura Paletta-Hobbs1, Adam M Garber1, Alice Wong1, Rehan Qayyum2.
Abstract
Background: Medical training relies on direct observation and formative feedback. After residency graduation, opportunities to receive feedback on clinical teaching diminish. Although feedback through learner evaluations is common, these evaluations can be untimely, non-specific, and potentially biased. Peer feedback in a small-group or lecture setting, on the other hand, has been shown to benefit teaching behaviors; however, little is known about whether peer observation using a standardized tool, followed by feedback, improves teaching behaviors. Therefore, the objective of this study was to examine whether feedback after peer observation results in improved inpatient teaching behaviors.
Methods: This study was conducted at a tertiary care hospital. Academic hospitalists in the Division of Hospital Medicine developed a standardized 28-item peer observation tool, based on the Stanford Faculty Development Program, to observe their peers during bedside teaching rounds and provide timely feedback after observation. The tool focused on five teaching domains relevant to the inpatient teaching environment: learning climate, control of session, promotion of understanding and retention, evaluation, and feedback. Teaching hospitalists were observed at the beginning of a two-week teaching rotation, given feedback, and then observed again at the end of the rotation. We also used a post-observation survey to assess the teaching and observing hospitalists' comfort with observation and the usefulness of the feedback. We used mixed linear models with a crossed design to account for correlations between the observations. Models were adjusted for gender, age, and years of experience. We tested the internal consistency of the instrument with Cronbach's alpha.
Results: Seventy observations (range: one to four per faculty member) were performed, involving 27 teaching attendings.
A high proportion of teachers were comfortable with the observation (79%), found the feedback helpful (92%), and found it useful for their own teaching (88%). Mean scores in the teaching behavior domains ranged from 2.1 to 2.7. In both unadjusted and adjusted analyses, each teaching observation was followed by higher scores in learning climate (adjusted improvement = 0.09; 95% CI = 0.02-0.15; p = 0.007) and promotion of understanding and retention (adjusted improvement = 0.09; 95% CI = 0.02-0.17; p = 0.01). The standardized observation tool had a Cronbach's alpha of 0.81, indicating high internal consistency.
Conclusions: Peer observation of bedside teaching followed by feedback using a standardized tool is feasible and results in measurable improvements in desirable teaching behaviors. The success of this approach led to the expansion of peer observation to other Divisions within the Department of Internal Medicine at our institution.
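The internal-consistency check reported above (Cronbach's alpha = 0.81) uses the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch of that computation in Python, run on simulated item scores (hypothetical data, not the study's 28-item ratings):

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (observations x items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()       # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)        # variance of the summed score
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# Simulated 28-item ratings for 70 observations (NOT the study data):
# a shared "true" teaching-quality signal plus per-item noise.
rng = np.random.default_rng(42)
base = rng.normal(2.5, 0.4, size=(70, 1))
items = base + rng.normal(0.0, 0.3, size=(70, 28))
alpha = cronbach_alpha(items)
```

With strongly correlated items, as here, alpha approaches 1; uncorrelated items drive it toward 0.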
Keywords: bedside teaching; medical education; peer observation; resident education; teaching feedback
Year: 2020 PMID: 32226677 PMCID: PMC7093940 DOI: 10.7759/cureus.7076
Source DB: PubMed Journal: Cureus ISSN: 2168-8184
Correlation between the domains of the peer observation tool
Pearson’s correlation with p-values
| Domain | Learning | Control | Promotion | Evaluation | Feedback | Mean (SD) |
| Learning climate | 1.00 | | | | | 2.6 (0.32) |
| Control of session | 0.39 (<0.001) | 1.00 | | | | 2.7 (0.33) |
| Promotion of understanding and retention | 0.60 (<0.001) | 0.55 (<0.001) | 1.00 | | | 2.1 (0.55) |
| Evaluation | 0.31 (<0.001) | 0.42 (<0.001) | 0.70 (<0.001) | 1.00 | | 2.3 (0.54) |
| Feedback | 0.32 (0.006) | 0.27 (<0.001) | 0.42 (<0.001) | 0.30 (<0.001) | 1.00 | 2.7 (0.37) |
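Each cell in the table pairs a Pearson coefficient with its p-value; `scipy.stats.pearsonr` returns exactly that pair. A sketch on hypothetical per-observation domain scores (illustrative values, not the study data):

```python
import numpy as np
from scipy import stats

# Hypothetical per-observation scores for two domains (NOT the study data)
learning = np.array([2.4, 2.6, 2.9, 2.2, 2.8, 2.5, 2.7, 2.3])
control = np.array([2.6, 2.7, 3.0, 2.4, 2.8, 2.5, 2.9, 2.4])

# Pearson's r and its two-sided p-value, as reported in each table cell
r, p = stats.pearsonr(learning, control)
```

Repeating this for every pair of domain score vectors fills in the lower triangle of the correlation matrix.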
Figure 1: Forest plot of mixed linear growth curve models without and with adjustments
*Adjusted analyses included age, gender, and years as attending as covariates in the regression models.
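The adjusted estimates in the forest plot come from mixed linear models. A minimal sketch with `statsmodels`, on simulated data (27 teachers, 70 observations, none of it the study's); note this simplifies the paper's crossed design to a single random intercept per teacher, and all variable names here are assumptions for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data shaped like the study (NOT the actual data):
# each row is one observed teaching session.
rng = np.random.default_rng(0)
n = 70
df = pd.DataFrame({
    "teacher": rng.integers(0, 27, n),
    "obs_number": rng.integers(1, 5, n),        # 1st..4th observation of a teacher
    "age": rng.integers(32, 60, n),
    "gender": rng.integers(0, 2, n),
    "years_attending": rng.integers(1, 20, n),
})
teacher_effect = rng.normal(0.0, 0.2, 27)[df["teacher"]]
df["climate"] = (2.5 + 0.09 * df["obs_number"]   # built-in per-observation gain
                 + teacher_effect + rng.normal(0.0, 0.2, n))

# Random intercept per teacher; fixed effects mirror the paper's covariates.
model = smf.mixedlm("climate ~ obs_number + age + gender + years_attending",
                    df, groups=df["teacher"])
fit = model.fit()
slope = fit.params["obs_number"]   # adjusted per-observation improvement
```

The coefficient on `obs_number` plays the role of the adjusted improvement per observation reported in the Results.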
Examples of suggestions from teaching and observing hospitalists on how to improve the peer observation process
| From Teachers | From Observers |
| Do it more! | More frequent observations. |
| Observation should be for the duration of the whole rounds. | Increase clarification of observer’s role to the residents and students. |
| Learning teaching styles in a different way would be more comfortable. | Some of the "behaviors" on the observation tool are difficult to assess during a one-hour observation. |
| One-hour observations may not be enough to see the different styles of interaction. | Deciding with the team on the best day to observe, when most learners on the team are there. |
| The observer should remain unobtrusive during the observation process. | Setting up a scheduled time for the feedback session. |
| Providing a digital copy of the feedback so it is easier to keep track of. | A cheat sheet on main areas of discussion during feedback. |
| Pre-rounds discussion with the faculty member to talk about what he or she wants to work on in their clinical teaching. | |
Examples of comments by observing and teaching hospitalists on what they learned during a peer observation session
| From Teachers | From Observers |
| Letting residents and students lead the encounter with the patients. | Adding 30-60 second teaching pearls with as many patients as possible. Sprinkling clinical pearls throughout rounds. |
| Having medical students practice exam skills in a patient with interesting exam findings. | Minimizing interruptions; talking less may have a powerful impact. |
| Having medical students shorten their presentations while also presenting all the pertinent information. | Starting with students for interpretation of images during rounds. |
| Moving patient-related conversations to the bedside and in front of patients to allow for more direct observation of resident and students' skills. | Using students as a resource for clinical needs. |
| Identifying teaching points during rounds. | Being mindful of noting one's own limitations and instances where there is uncertainty and relaying the realities of clinical decision-making. |
| Quickly reviewing EKGs and imaging during rounds. | Adding a fun game to rounds, such as "star points" for great presentations, explanations of reasoning, or interactions with patients. |
| Being more inclusive during discussions at morning rounds among the different learners; engaging all learners. | A relaxed style on rounds that allows questions and stimulates discussion. |
| Identifying a particular patient or teaching point prior to rounds for high yield teaching. | Time management so all patients get adequate time based on the complexity of their disease. |
| Using game-like structure during rounds. | Allowing residents to discuss their plan with each other and come to a decision before "jumping in". |
| Assigning a literature search to a resident or student during rounds to ensure topics are researched and disseminated to the team; bringing the discussion of assigned literature to the bedside using a relevant clinical situation. | Letting the senior resident lead the team and being less directive as an attending. |
| Asking learners questions during rounds rather than jumping in to teach. | Showing respect and thanking learners for presentations/contributions. |
| Balancing supervision and autonomy was helpful. | Displaying more enthusiasm through mannerisms or voice inflections. |
| Incorporating literature references. | Combining positive and constructive feedback. |
| When visiting a patient with contact precautions and students are staying outside the room, give them a short clinical question to look up while you are in the room. | Using more prompts and reflective questions to push learners to think and make connections, rather than asking directive questions. |
| Staying on topic with less digression during rounds. | Reviewing teaching points from the previous day's rounds with quick questions. |