| Literature DB >> 32504446 |
John Q Young, Rebekah Sugarman, Jessica Schwartz, Matthew McClure, Patricia S O'Sullivan.
Abstract
INTRODUCTION: Mobile apps that utilize the framework of entrustable professional activities (EPAs) to capture and deliver feedback are being implemented. If EPA apps are to be successfully incorporated into programmatic assessment, a better understanding of how they are experienced by the end-users will be necessary. The authors conducted a qualitative study using the Consolidated Framework for Implementation Research (CFIR) to identify enablers and barriers to engagement with an EPA app.Entities:
Keywords: App; Competency-based assessment; Consolidated framework for implementation research; Entrustable professional activities; Implementation science; Mobile technology; Psychiatry; Qualitative methods; Workplace-based assessment
Year: 2020 PMID: 32504446 PMCID: PMC7459074 DOI: 10.1007/s40037-020-00587-z
Source DB: PubMed Journal: Perspect Med Educ ISSN: 2212-2761
Facilitators and barriers to engagement with the EPA app
| CFIR domain | Facilitators | Barriers |
|---|---|---|
| Intervention characteristics | – Sufficient training prior to use – Few, if any, technical challenges – EPA app intuitive and easy to use, especially compared with paper-based assessment tools – Feedback timely and frequent – Feedback quality high: behaviorally specific and salient – User interface forced succinct feedback with a single take-home message for the resident | – Residents and faculty see the value of assessment tools (such as the paper-based form also used in the clinic) that generate more detailed, nuanced, and comprehensive comments – The absence of a checklist, while making the app easier to use, led to less systematic observation and feedback – No reinforcing comments – Most faculty did not understand the entrustment scale and/or the EPA framework – Faculty prefer paper forms for discreetly jotting down feedback points while observing |
| Characteristics of individuals | – Excitement about the use of app-based technology – High confidence in use of the app – Faculty appreciated how the interface forced synthesis and distillation of their observations into a single, concise feedback point | – Faculty worry that use of the EPA app during patient encounters may convey lack of respect and attention – Residents reviewed emailed feedback briefly, then rarely referred to it again – Faculty prioritized verbal feedback over app completion when short on time |
| Process | – Faculty time protected for the sole purpose of directly observing the resident and giving feedback – Monitoring of app utilization by the program | – Clinical demands, especially from the residents' panels of patients, often resulted in the EPA app assessment not being completed |
| Inner setting | – The app aligned with the organization's emphasis on innovation, especially regarding the use of measurement and technology, in clinical and educational practice | |