
An Innovative, No-cost, Evidence-Based Smartphone Platform for Resident Evaluation.

John M Green.

Abstract

PURPOSE: Timely performance evaluation and feedback are critical to resident development. However, formulating and delivering this information disrupts physician workflow, leading to low participation. This study was designed to determine if a locally developed smartphone platform would integrate regular evaluation into daily processes and thus increase faculty participation in timely resident evaluation.
METHODS: Formal, documented resident operative and patient interaction evaluations were compiled over an 8-month study period. The study was divided into two 4-month phases. No changes to the existing evaluation methods were made during Phase 1. Phase 2 began after a washout period of 2 weeks and coincided with the launch of a smartphone-based platform. The platform uses a combination of Likert scale questions and the Dreyfus model of skill acquisition to describe competence levels in technical and nontechnical skills. The instrument imposes minimal burden on surgeon workflow, with the aim of integrating resident evaluation into daily processes. The number of different faculty members performing evaluations, resident level (postgraduate year, PGY), type of interaction or procedure, and competency data were compiled. All evaluations were tracked by the program director as they were automatically uploaded into a database. Faculty members were introduced to the new platform at the beginning of Phase 2, and previous methods of evaluation continued to be encouraged and were considered valid throughout both phases of the study. Data were analyzed using the Fisher exact test for specific PGY levels and the chi-square test for overall program analysis. Statistical significance was set at p < 0.05.
RESULTS: Total faculty engagement, that is, the number of faculty members completing evaluations, increased from 13% (5/38) in Phase 1 to 53% (20/38) in Phase 2. During Phase 1, all evaluations consisted of online forms through the department's established system or e-mails to the program director. Evaluations were completed in 0.9% (15/1599) of cases residents completed in Phase 1 versus 12% (217/1812) of those in Phase 2. During Phase 2, evaluations were conducted exclusively using the new platform, by participants' own choice. Total numbers of residents and core faculty members did not change between Phases 1 and 2.
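The significance of the reported counts can be re-checked with the tests named in METHODS. The sketch below is illustrative only (the abstract does not state what software the authors used): a stdlib Python implementation of the Pearson chi-square statistic and a two-sided Fisher exact test, applied to the evaluations-per-case and faculty-participation tables from RESULTS.

```python
from math import comb

def chi2_stat(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = [a + b, c + d]
    cols = [a + c, b + d]
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / n  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

def fisher_exact_two_sided(table):
    """Two-sided Fisher exact p-value for a 2x2 table.

    Sums hypergeometric probabilities of all tables (with the same
    margins) no more likely than the observed one.
    """
    (a, b), (c, d) = table
    r1, c1, n = a + b, a + c, a + b + c + d
    denom = comb(n, c1)

    def p(k):
        return comb(r1, k) * comb(n - r1, c1 - k) / denom

    p_obs = p(a)
    lo, hi = max(0, c1 - (n - r1)), min(c1, r1)
    # small tolerance so ties with p_obs are included despite float rounding
    return sum(p(k) for k in range(lo, hi + 1) if p(k) <= p_obs * (1 + 1e-9))

# Evaluations per completed case: Phase 1 (15/1599) vs Phase 2 (217/1812)
cases = [[15, 1599 - 15], [217, 1812 - 217]]
# Faculty participating: Phase 1 (5/38) vs Phase 2 (20/38)
faculty = [[5, 38 - 5], [20, 38 - 20]]

print(chi2_stat(cases))                 # far above 3.841, the df=1 cutoff for p < 0.05
print(fisher_exact_two_sided(faculty))  # p < 0.05
```

Both comparisons clear the p < 0.05 threshold stated in METHODS, consistent with the increases the abstract reports.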
CONCLUSIONS: A smartphone-based platform can be created with existing technology at no cost. It is adaptable, can be updated in real time, and can employ validated scales to build an evaluation portfolio assessing learners' technical and nontechnical skills. Furthermore, and perhaps most importantly, it can be designed to integrate into existing workflow patterns to increase faculty participation.
Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

Entities:  

Keywords:  Interpersonal and Communication Skills; Medical Knowledge; Practice-Based Learning and Improvement; Professionalism; competency; evaluation; feedback; non-technical skills; smartphone

MeSH:

Year:  2016        PMID: 27651056     DOI: 10.1016/j.jsurg.2016.07.016

Source DB:  PubMed          Journal:  J Surg Educ        ISSN: 1878-7452            Impact factor:   2.891


  1 in total

1.  Point-of-Encounter Assessment: Using Health Belief Model Constructs to Change Grading Behaviors.

Authors:  Susan F McLean; Maureen Francis; Naomi L Lacy; Andres Alvarado
Journal:  J Med Educ Curric Dev       Date:  2019-04-30
