
Virtually competent: remote delivery of clinical competency examinations.

Alexander L Lee, Brent Fung, Bradley S Henson, Hubert K Chan.

Abstract


Year:  2020        PMID: 32856725      PMCID: PMC7460926          DOI: 10.1002/jdd.12405

Source DB:  PubMed          Journal:  J Dent Educ        ISSN: 0022-0337            Impact factor:   2.313



PROBLEM

At Western University of Health Sciences College of Dental Medicine (WUCDM), D3 and D4 students—after having displayed aptitude in a dental discipline—demonstrate their knowledge and practical skills in mandatory, high‐stakes, capstone competency events called Independent Patient Clinical Exams (IPCEs). IPCEs are supervised by 2 clinical faculty members, assessed on live patients through 5‐point rubrics, and must be passed prior to graduation. With onsite educational activities halted due to COVID‐19, WUCDM faced the task of remotely delivering equivalent, but not identical, Simulated Independent Patient Clinical Exams (SIPCEs) to replace the IPCEs for D4s.

SOLUTION

Remotely delivering IPCEs required web conferencing software for faculty and D4s. WUCDM evaluated 2 conferencing platforms: Zoom and Microsoft Teams. Since protected health information (PHI) would be shared, the platform required Health Insurance Portability and Accountability Act (HIPAA) compliance. Because WUCDM had a Business Associate Agreement with Microsoft outlining that PHI could be stored, Teams was selected. The next task was equalizing the remote and live experiences. SIPCE questions required correlation to IPCE rubrics, cases used had to be varied to reflect clinical reality, clinical skills needed assessment, and this updated process required documentation. First, clinical faculty mapped in‐person workflows to the IPCE rubrics. For example, WUCDM's start check process ensures student review of patient medical and dental histories; the IPCE rubrics assess medical and dental history review, so a start check would be correlated to the medical and dental history review criteria. Next, SIPCEs required case variability. 3D models from CAD/CAM databases were exported to create detailed media (Figure 1). Biomedical faculty generated case‐based content focusing on medical knowledge and provided adjunctive literature. Group practice and specialist faculty wrote questions to target critical thinking in clinical procedures. Examiners drew from this pool of media, cases, and questions. A script was created so examiners would provide consistent experiences for exam takers (Figure 2). This pool of content, coupled with the examiners recording each session and asking students to sweep the room with their web cameras, mitigated academic dishonesty concerns.
FIGURE 1

Sample media developed by faculty for SIPCE utilizing 3D models from CAD/CAM databases

FIGURE 2

Screenshot of SIPCE briefing, which is completed by faculty and distributed to students 24 hours prior to examination

For IPCEs, supervising group faculty vet student clinical readiness by evaluating a portfolio of metrics including procedures completed, qualitative formative feedback over time, self‐assessments, previous academic accomplishments, and didactic course performance. Faculty disqualify students needing more experience, and this exact process was used to ascertain SIPCE readiness. During SIPCEs, faculty derived direct patient care competence from the students' historical metrics, performance on case‐based questions aligned to IPCE rubrics, and inquiries regarding treatment rendered in their patient families. The SIPCEs were incorporated into WUCDM's Commission on Dental Accreditation (CODA) application for program change to distance education and included as an addendum to the course syllabus (Figure 3).
FIGURE 3

Side‐by‐side comparison of IPCE and SIPCE experiences


RESULTS

Twenty‐eight D4 students participated in oral diagnostics, prosthodontics, periodontics, and restorative dentistry SIPCEs. Scores ranged from 77.43% to 95%. No comments regarding SIPCEs were found in course evaluations. Faculty observed that technical considerations—displaying content, setting up Teams, establishing remote connections—were chief concerns. For future SIPCEs, additional technical support and statistical validation of the method need to be considered.
SIMILAR ARTICLES

1.  Can the results of the OSCE predict the results of clinical assessment in dental education?

Authors:  R Näpänkangas; T Karaharju-Suvanto; E Pyörälä; V Harila; P Ollila; R Lähdesmäki; S Lahti
Journal:  Eur J Dent Educ       Date:  2014-12-03       Impact factor: 2.355

2.  Implementing an Objective Structured Clinical Examination (OSCE) in dental education: effects on students' learning strategies.

Authors:  M E Schoonheim-Klein; L L M H Habets; I H A Aartman; C P van der Vleuten; J Hoogstraten; U van der Velden
Journal:  Eur J Dent Educ       Date:  2006-11       Impact factor: 2.355

3.  Virtually competent: remote delivery of clinical competency examinations.

Authors:  Alexander L Lee; Brent Fung; Bradley S Henson; Hubert K Chan
Journal:  J Dent Educ       Date:  2020-08-28       Impact factor: 2.313

