Enda O'Connor, Evin Doyle.
Abstract
Introduction: Anesthesia and intensive care medicine are relatively new undergraduate medical placements. Both present unique learning opportunities and educational challenges to trainers and medical students. In the context of ongoing advances in medical education assessment and the importance of robust assessment methods, our scoping review sought to describe current research around medical student assessment after anesthesia and intensive care placements.
Keywords: anesthesia; assessment and education; intensive care; scoping review methodology; undergraduate education best practices
Year: 2022 PMID: 35449804 PMCID: PMC9016165 DOI: 10.3389/fmed.2022.871515
Source DB: PubMed Journal: Front Med (Lausanne) ISSN: 2296-858X
Figure 1. PRISMA flow diagram for the scoping review process (ICM, intensive care medicine).
Summary of studies included in the scoping review.
| Authors | Title | Journal | Study design | Participants and setting | Key findings | Miller's level assessed |
|---|---|---|---|---|---|---|
| Critchley et al. ( | An adaptation of the objective structured clinical examination to a final year medical student course in anesthesia and intensive care | Anesthesia | Descriptive account of 4 academic years – observational | 466 students in 4 years. Learning curriculum in place | No student failed exam based on OSCE. Different domains of learning assessed in one examination | Miller's “Shows how” |
| Lofaro et al. ( | An innovative course in surgical critical care for second-year medical students | Academic Medicine | Description of 12-week SICU rotation in 1990/91 | 13 "sophomore" students | | Miller's "Knows how" |
| Moll-Khosrawi et al. ( | Anaesthesiology students' Non-Technical skills: development and evaluation of a behavioral marker system for students (AS-NTS) | BMC Medical Education | 4 steps: literature review, focus groups/interviews, field observation, and implementation/validation | 98 simulation activities with groups of 3 students (?yr, ?total number) | Assessing 3 NTS: planning tasks, teamwork, and team orientation | Miller's "Shows how" for non-technical skills |
| Shams et al. ( | Assessment of current undergraduate anesthesia course in a Saudi University | Saudi J Anesthesia | Comparison of 3 different Ax tools (MCQs, portfolios and OSCEs) | 154 students on 5-week OR/Anesthesia rotation described | Strong correlation between 3 Ax tools, and between each Ax tool and the overall outcome of the examination (r coeff >0.85 for all 3 tools). The MCQ exam was the highest predictor of overall exam mark, followed by OSCE and then portfolio. | Miller's “Shows how” |
| Morgan and Hogg ( | Evaluation of medical students' performance using the anesthesia simulator | Medical Education | Pilot study, single arm interventional study, no control | 2 week anesthesia rotation with apprenticeship model in OR | Strong IRR (ICC 0.87) | Miller's “Shows how” in simulated setting correlated with knows and knows how |
| Hamid et al. ( | The lack of construct validity when assessing clinical clerks during their anesthesia rotations | Can J Anesthesia | Observational single arm study | 205 medical students undergoing 2 week anesthesia rotation | No scores of 1-4 awarded (5 = meets expectations) | Miller's "Does" observed in the workplace |
| Morgan et al. ( | High-fidelity patient simulation: validation of performance checklists | BJA | Quantitative | 135 students | 85% students agreed/strongly agreed the scenarios reflected rotation's learning objectives. | Miller's “Shows how” |
| Rogers et al. ( | Quantifying learning in medical students during a critical care medicine elective: a comparison of 3 evaluation instruments. | CCM | One student cohort with pre-/post- data collection. | 24 medical student volunteers | Written test scores did not correlate with students' ability to perform (Sim) or apply their knowledge (OSCE). | Miller's “Shows how” |
| Skinner et al. ( | The use of computerized learning in intensive care: an evaluation of a new teaching program | Medical Education | RCT of computer-assisted learning in addition to standard learning | 28 students (14 in each of 2 groups) | From similar baseline, CAL showed 2-3 times higher post-test scores | A qualified doctor is "a highly complex blend of knowledge, attitudes and skills" (p53) |
| Sharma and White ( | The use of multi-source feedback in assessing undergraduate students in a general surgery anaesthesiology clerkship | Medical Education | Feasibility study using MSF for student Ax | Uncertain size (“groups of 20-24 students”). | Each student had average of 25 assessments over 6 weeks. | Miller's “Does” |
| Critchley et al. ( | Web-based formative assessment case studies: role in a final year medicine 2-week anesthesia course | Anesthesia and Intensive Care | Quantitative | 2 week anesthesia rotation | Wide variation in FACS usage, time spent on each FACS. | Miller's “Knows how” |
| Morgan et al. ( | Validity and Reliability of undergraduate performance assessments in an anesthesia simulator | CJA | Single arm study of students doing simulation | 140 students | High inter-rater reliability in the faculty assessments | Knowledge vs “hands-on medical management problems” (p230) |
| Rogers et al. ( | Medical Students can learn the basic application, analytic, evaluative and psychomotor skills of critical care medicine | | Pre- and post-elective design using 2 clinical scenarios at 5 OSCE stations (randomized to be pre- or post-) | 40 students doing a 1-month CCM elective | Written exams reliable but not valid (if "we are training students to perform") | |
| Morgan et al. ( | Identification of gaps in the achievement of undergraduate anesthesia educational objectives using high fidelity patient simulation | Anaesth Analges | Single arm | 135 students – 165 scenarios | “Expected performance criteria” of students vs “Critical management items” (p1690). | |
| Leung et al. ( | Evidence of virtual patients as a facilitative learning tool on an anesthesia course | AHSE | Quantitative student scores | VPs used to enhance learning. This is a study of learning, NOT assessment. | VPs improved assessment scores | Miller's “Knows How” |
| Kapur et al. ( | Implementation of a formal medical intensive care unit curriculum for medical students | AJRCCM | Quantitative | 4 week MICU rotation | Knowledge improved (67% → 81%) | Miller's “Knows how” |
| Rogers et al. ( | Teaching medical students complex cognitive skills in the intensive care unit | CCM | Single group | 1 month SICU rotations | Knowledge and application improved | Miller's “Knows How” |
| Ho et al. ( | Developing the eMedical Student (eMS) – a pilot project integrating medical students into the tele-ICU during the COVID-19 Pandemic and beyond | Healthcare | Single group | 5 students | Improved knowledge on MCQ test | Miller's “Knows” |
| Gergen et al. ( | Integrated critical care curriculum for the third-year internal medicine clerkship | MedEdPortal | Single group | 41 3rd year students | Improved knowledge on SAQ test | |
ICM, intensive care medicine; OR, operating room; IRR, interrater reliability; MCCQE, Medical Council of Canada Qualifying Examination; Sim, simulation; OSCE, objective structured clinical examination; MCQ, multiple choice question; RCT, randomised controlled trial; MSF, multisource feedback; FACS, formative assessment case studies; VPs, virtual patients; MICU, medical intensive care unit; SICU, surgical intensive care unit; SAQ, short-answer question; EM, emergency medicine.
Assessment tools used following clinical placements in Anaesthesia and Intensive Care Medicine.
MCQs, multiple choice questions; OSCEs, objective structured clinical examinations; FACS, formative assessment case studies; MSF, multisource feedback; SAQs, short answer questions.