
Early assessment of medical students' clinical skills.

Gregory Makoul, Michael Altman.

Abstract

OBJECTIVE: To provide first-year medical students with a non-threatening, standardized clinical-skills assessment at the end of the first year (M1CSA), and to provide resources that help them build on their strengths and address any weaknesses before starting the second year.

DESCRIPTION: Implemented for the first time in April 2001, the M1CSA was designed to help first-year students identify strengths and weaknesses in their developing repertoire of clinical skills. It comprised five stations (abdominal exam, cardiovascular exam, lung exam, informed consent, social history) and a self-assessment. Each station involved working with a standardized patient (SP), and all SP encounters were videotaped so students could review their work during the self-assessment. The M1CSA was truly formative in nature; there was no passing or failing mark. To help students prepare for the abdominal, cardiovascular, and lung examinations, we distributed via Blackboard® an electronic copy of the same exam-skills checklists that the SPs would use in the exam stations. To facilitate preparation for the social history and informed consent stations, we used Blackboard to provide the scenarios in advance. While students were not allowed to bring these checklists or scenario descriptions into the stations, they knew exactly what was expected. Immediately after each student encounter, SPs completed skills checklists and entered evaluative comments via computer workstations. Upon completing all five stations, students received printouts of the checklists and comments, then reviewed their videotapes and wrote self-assessments indicating specific areas for improvement. Faculty members were available on-site to review portions of the videotapes, answer questions, and offer suggestions. Students also received a resource sheet listing faculty members and clinics where they could obtain help developing their clinical skills over the summer months.

The M1CSA was designed by a clinical-skills assessment committee and refined through discussion with students. The committee was composed of all four of the first- and second-year clinical skills unit directors, as well as the Patient, Physician & Society course coordinator; the associate dean for medical informatics and computer-assisted learning; the two staff members of Northwestern's Clinical Education and Evaluation Center; the office of medical education's director of evaluation; and the associate dean for education (ex officio). Students evaluated this first iteration of the M1CSA very positively and provided useful feedback for making specific improvements.

DISCUSSION: The primary motivation for implementing the M1CSA was that, over the past several years, many students reported starting the second year with considerable uncertainty about their clinical skills. The M1CSA was designed to offer a standardized inventory of students' proficiencies and to encourage continued development in anticipation of working with real patients in the second year. The content reflects a broad view of clinical skills, encompassing communication and ethics as well as the physical examination. The formative nature of the M1CSA complements the long-standing M2CSA, which is summative (i.e., students must demonstrate proficiency before beginning clerkships).


Year:  2002        PMID: 12431933     DOI: 10.1097/00001888-200211000-00020

Source DB:  PubMed          Journal:  Acad Med        ISSN: 1040-2446            Impact factor:   6.893


Related articles: 6 in total

1.  Is the use of videotape recording superior to verbal feedback alone in the teaching of clinical skills?

Authors:  Nilgun Ozcakar; Vildan Mevsim; Dilek Guldal; Tolga Gunvar; Ediz Yildirim; Zafer Sisli; Ilgi Semin
Journal:  BMC Public Health       Date:  2009-12-19       Impact factor: 3.295

2.  Clinical capabilities of graduates of an outcomes-based integrated medical program.

Authors:  Helen A Scicluna; Michael C Grimm; Anthony J O'Sullivan; Peter Harris; Louis S Pilotto; Philip D Jones; H Patrick McNeil
Journal:  BMC Med Educ       Date:  2012-06-11       Impact factor: 2.463

3.  How do United Kingdom (UK) medical schools identify and support undergraduate medical students who 'fail' communication assessments? A national survey.

Authors:  Connie Wiskin; Eva M Doherty; Martin von Fragstein; Anita Laidlaw; Helen Salisbury
Journal:  BMC Med Educ       Date:  2013-07-08       Impact factor: 2.463

4.  The use of standardized patients for mock oral board exams in neurology: a pilot study.

Authors:  Brett Kissela; Steven Harris; Dawn Kleindorfer; Christopher Lindsell; Robert Pascuzzi; Daniel Woo; Jerzy Szaflarski; Daniel Kanter; Alex Schneider; Michael Sostok; Joseph Broderick
Journal:  BMC Med Educ       Date:  2006-04-25       Impact factor: 2.463

5.  Developing, conducting and evaluating the internship preparatory program (Ipp).

Authors:  Abeer S Al Shahrani; Samah F Ibrahim; Norah M AlZamil; Eman S Soliman; Lamya A Almusharraf; Amel A Fayed; Noreen Mirza
Journal:  Ann Med Surg (Lond)       Date:  2022-01-01

6.  Is video review of patient encounters an effective tool for medical student learning? A review of the literature.

Authors:  Maya M Hammoud; Helen K Morgan; Mary E Edwards; Jennifer A Lyon; Casey White
Journal:  Adv Med Educ Pract       Date:  2012-03-22

Beijing Coyote Bioscience Co., Ltd. © 2022-2023.