Jennifer R Kogan, Rose Hatala, Karen E Hauer, Eric Holmboe.
Abstract
INTRODUCTION: Direct observation of clinical skills is a key assessment strategy in competency-based medical education. The guidelines presented in this paper synthesize the literature on direct observation of clinical skills. The goal is to provide a practical list of Do's, Don'ts and Don't Knows about direct observation for supervisors who teach learners in the clinical setting and for educational leaders who are responsible for clinical training programs.
Keywords: Assessment; Clinical Skills; Competence; Direct Observation; Workplace Based Assessment
Year: 2017 PMID: 28956293 PMCID: PMC5630537 DOI: 10.1007/s40037-017-0376-7
Source DB: PubMed Journal: Perspect Med Educ ISSN: 2212-2761
Criteria for strength of recommendation

| Strength of recommendation | Criteria |
|---|---|
| Strong | A large and consistent body of evidence |
| Moderate | Solid empirical evidence from one or more papers plus consensus of the authors |
| Tentative | Limited empirical evidence plus the consensus of the authors |
Summary of guidelines for direct observation of clinical skills for individual clinical supervisors
| # | Guideline | Strength of recommendation |
|---|---|---|
| | **Do's** | |
| 1. | Do observe authentic clinical work in actual clinical encounters | Strong |
| 2. | Do prepare the learner prior to observation by discussing goals and setting expectations including the consequences and outcomes of the assessment | Strong |
| 3. | Do cultivate learners’ skills in self-regulated learning | Moderate |
| 4. | Do assess important clinical skills via direct observation rather than using proxy information | Strong |
| 5. | Do observe without interrupting the encounter | Tentative |
| 6. | Do recognize that cognitive bias, impression formation and implicit bias can influence inferences drawn during observation | Strong |
| 7. | Do provide feedback after observation focusing on observable behaviours | Strong |
| 8. | Do observe longitudinally to facilitate learners’ integration of feedback | Moderate |
| 9. | Do recognize that many learners resist direct observation and be prepared with strategies to try to overcome their hesitation | Strong |
| | **Don'ts** | |
| 10. | Don’t limit feedback to quantitative ratings | Moderate |
| 11. | Don’t give feedback in front of the patient without seeking permission from and preparing both the learner and the patient | Tentative |
| | **Don't Knows** | |
| 12. | What is the impact of cognitive load during direct observation and what are approaches to mitigate it? | |
| 13. | What is the optimal duration for direct observation of different clinical skills? | |
Summary of guidelines for direct observation of clinical skills for educators/educational leaders
| # | Guideline | Strength of recommendation |
|---|---|---|
| | **Do's** | |
| 14. | Do select observers based on their relevant clinical skills and expertise | Strong |
| 15. | Do use an assessment tool with existing validity evidence, when possible, rather than creating a new tool for direct observation | Strong |
| 16. | Do train observers how to conduct direct observation, adopt a shared mental model and common standards for assessment, and provide feedback | Moderate |
| 17. | Do ensure direct observation that aligns with program objectives and competencies (e.g. milestones) | Tentative |
| 18. | Do establish a culture that invites learners to practice authentically and welcome feedback | Moderate |
| 19. | Do pay attention to system factors that enable or inhibit direct observation | Moderate |
| | **Don'ts** | |
| 20. | Don’t assume that selecting the right tool for direct observation obviates the need for rater training | Moderate |
| 21. | Don’t put the responsibility solely on the learner to ask for direct observation | Moderate |
| 22. | Don’t underestimate faculty tension between being both a teacher and assessor | Tentative |
| 23. | Don’t make all direct observations high-stakes; this will interfere with the learning culture around direct observation | Moderate |
| 24. | When using direct observation for high-stakes summative decisions, don’t base decisions on too few direct observations by too few raters over too short a time and don’t rely on direct observation data alone | Strong |
| | **Don't Knows** | |
| 25. | How do programs motivate learners to ask to be observed without undermining learners’ values of independence and efficiency? | |
| 26. | How can specialties expand the focus of direct observation to important aspects of clinical practice valued by patients? | |
| 27. | How can programs change a high-stakes, infrequent direct observation assessment culture to a low-stakes, formative, learner-centred culture? | |
| 28. | What, if any, benefits are there to developing a small number of core faculty as ‘master educators’ who conduct direct observations? | |
| 29. | Are entrustment-based scales the best available approach to achieve construct aligned scales, particularly for non-procedurally based specialties? | |
| 30. | What are the best approaches to use technology to enable ‘on the fly’ recording of observational data? | |
| 31. | What are the best faculty development approaches and implementation strategies to improve observation quality and learner feedback? | |
| 32. | How should direct observation and feedback by patients or other members of the health care team be incorporated into direct observation approaches? | |
| 33. | Does direct observation influence learner and patient outcomes? | |
Fig. 1 An example of using self-regulated learning in the context of direct observation. Self-regulated learning describes an ongoing cycle of (1) planning for one's learning (A, B, E), (2) self-monitoring during an activity and making needed adjustments to optimize learning and performance (C, D), and (3) reflecting after an activity about whether a goal was achieved or where and why difficulties were encountered (D, E)