| Literature DB >> 25690440 |
Abstract
The Australasian College for Emergency Medicine (ACEM) will introduce high-stakes simulation-based summative assessment in the form of Objective Structured Clinical Examinations (OSCEs) into the Fellowship Examination from 2015. Miller's model emphasises that, no matter how realistic the simulation, it is still a simulation, and examinees do not necessarily behave as they would in real life. OSCEs are suitable for assessing the CanMEDS domains of Medical Expert, Communicator, Collaborator and Manager. However, the need to validate the OSCE is emphasised by conflicting evidence on correlation with long-term faculty assessments, on agreement between essential-actions checklists and global assessment scores, and on variable interrater reliability within individual OSCE stations and for crisis resource management skills. Although OSCEs can be a valid, reliable and acceptable assessment tool, the onus is on the examining body to ensure construct validity and high interrater reliability.
Keywords: educational measurement; patient simulation
Year: 2015 PMID: 25690440 PMCID: PMC4415593 DOI: 10.1111/1742-6723.12370
Source DB: PubMed Journal: Emerg Med Australas ISSN: 1742-6723 Impact factor: 2.151
Figure 1. Miller's prism of clinical competence (aka Miller's pyramid). Based on the work by Miller GE, The Assessment of Clinical Skills/Competence/Performance; Acad. Med. 1990; 65(9): 63–67. Adapted by Drs. R. Mehay and R. Burns, UK (Jan 2009).
Criteria for scoring simulations [3]
| Criteria | Example |
|---|---|
| Explicit Process | Case-specific checklist used in a standardised patient chest-pain station to record the history findings obtained and physical examination manoeuvres performed by an examinee |
| Implicit Process | Global judgment of a physician-rater observing an examinee's work with an integrated simulator in a trauma-type scenario |
| Explicit Outcome | Indicators of overall patient status (alive vs dead; complications; physiological indicators) at the conclusion of a computer-based clinical simulation |
| Implicit Outcome | Global judgment of a physician-rater inspecting the sutures made by an examinee on a skin pad |
| Combined Criteria | Task-specific checklist of explicit process and outcome criteria for observation and inspection of an end-to-end anastomosis of pig bowel |
Identified barriers to using simulation as an assessment tool [4]
| Costs and logistics |
| Standardisation across multiple simulation sites |
| Exposure of simulation modalities to trainees before high-stakes testing |
| Overreliance on psychometric criteria that can lead to measures (e.g. checklists) that may fail to capture the complexities involved in healthcare, such as caring for the patient with multiple comorbidities |
| Validity, especially in maintenance of licensure and certification where little evidence exists |
| Transferability to actual clinical practice |
| Training and recruitment of raters for high-stakes simulation-based assessment |
| Evidence base for some simulation-based activities not yet robust enough for high-stakes assessment |