
Assessment of patient management skills and clinical skills of practising doctors using computer-based case simulations and standardised patients.

Richard Hawkins, Margaret MacKrell Gaglione, Tony LaDuca, Cynthia Leung, Laurel Sample, Gayle Gliva-McConvey, William Liston, André De Champlain, Andrea Ciccone.

Abstract

CONTEXT: Standardised assessments of practising doctors are receiving growing support, but theoretical and logistical issues pose serious obstacles.
OBJECTIVES: To obtain reference performance levels from experienced doctors on computer-based case simulation (CCS) and standardised patient-based (SP) methods, and to evaluate the utility of these methods in diagnostic assessment.
SETTING AND PARTICIPANTS: The study was carried out at a military tertiary care facility and involved 54 residents and credentialed staff from the emergency medicine, general surgery and internal medicine departments.
MAIN OUTCOME MEASURES: Doctors completed 8 CCS and 8 SP cases targeted at doctors entering the profession. Standardised patient performances were compared to archived Year 4 medical student data.
RESULTS: While staff doctors and residents performed well on both CCS and SP cases, a wide range of scores was exhibited on all cases. There were no significant differences between the scores of participants from differing specialties or of varying experience. Among participants who completed both CCS and SP testing (n = 44), a moderate positive correlation between CCS and SP checklist scores was observed. There was a negative correlation between doctor experience and SP checklist scores. Whereas the time students spent with SPs varied little with clinical task, doctors appeared to spend more time on communication/counselling cases than on cases involving acute/chronic medical problems.
CONCLUSION: Computer-based case simulations and standardised patient-based assessments may be useful as part of a multimodal programme to evaluate practising doctors. Additional study is needed on SP standard setting and scoring methods. Establishing empirical likelihoods for a range of performances on assessments of this character should receive priority.


Year:  2004        PMID: 15327677     DOI: 10.1111/j.1365-2929.2004.01907.x

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles: 3 in total

1.  Online clinical reasoning assessment with the Script Concordance test: a feasibility study.

Authors:  Louis Sibert; Stefan J Darmoni; Badisse Dahamna; Jacques Weber; Bernard Charlin
Journal:  BMC Med Inform Decis Mak       Date:  2005-06-20       Impact factor: 2.796

2.  Hunter disease eClinic: interactive, computer-assisted, problem-based approach to independent learning about a rare genetic disease.

Authors:  Fatma Al-Jasmi; Laura Moldovan; Joe T R Clarke
Journal:  BMC Med Educ       Date:  2010-10-25       Impact factor: 2.463

3.  Simulation-based assessments in health professional education: a systematic review. (Review)

Authors:  Tayne Ryall; Belinda K Judd; Christopher J Gordon
Journal:  J Multidiscip Healthc       Date:  2016-02-22
