F Jacob Seagull1, Janet E Bailey2, Andrew Trout3, Richard H Cohan2, Monica L Lypson4. 1. Department of Medical Education, University of Michigan Medical School, 221 Victor Vaughan Building, 1111 E. Catherine SPC-2054, Ann Arbor, MI 48109-2054. Electronic address: jseagull@umich.edu. 2. Department of Radiology, University of Michigan Medical School, Ann Arbor, MI. 3. Department of Radiology, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio. 4. Department of Medical Education, University of Michigan Medical School, 221 Victor Vaughan Building, 1111 E. Catherine SPC-2054, Ann Arbor, MI 48109-2054; Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, Michigan.
Abstract
RATIONALE AND OBJECTIVES: Despite increasing radiology coverage, nonradiology residents continue to interpret basic radiologic studies independently on a preliminary basis, yet their ability to do so accurately is not routinely assessed.

MATERIALS AND METHODS: An online test of basic radiologic image interpretation was developed through an iterative process: educational objectives were established, then questions and images were gathered to create the assessment. The test was administered online to first-year interns (postgraduate year [PGY] 1) from 14 different specialties, as well as to a sample of third- and fourth-year radiology residents (PGY3/R2 and PGY4/R3).

RESULTS: Over a 2-year period, 368 residents were assessed: PGY1 (n = 349), PGY3/R2 (n = 14), and PGY4/R3 (n = 5). Overall, the test discriminated effectively between interns (average score = 66%) and advanced residents (R2 = 86%, R3 = 89%; P < .05). Item analysis yielded discrimination indices for individual questions ranging from -0.72 to 48.3 (mean = 3.12, median = 0.58), including four questions with negative discrimination indices. After removal of those four questions, the overall predictive value of the instrument persisted, and the discrimination indices of all but one of the remaining questions increased (range = 0.027-70.8, mean = 5.76, median = 0.94).

CONCLUSIONS: Validation of an initial iteration of an assessment of basic image-interpretation skills led to revisions that improved the test. The results offer a specific test of radiologic reading skills with validation evidence for residents. More generally, the results demonstrate a principled approach to test development.
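The item analysis described in the results hinges on a per-question discrimination index, with negatively indexed questions flagged for removal. The abstract does not state which index the authors computed (their reported values fall outside the usual -1 to +1 range, so it was evidently not the textbook formulation), but the general technique can be illustrated with the classic upper-lower discrimination index. The function and cohort below are hypothetical, for illustration only:

```python
# Illustrative sketch of the classic upper-lower item discrimination index.
# NOTE: this is NOT the study's actual method, which the abstract does not
# specify; it only demonstrates the general idea behind flagging items
# whose index is negative (high scorers miss them more than low scorers).

def discrimination_index(responses, item, group_frac=0.27):
    """responses: list of (total_score, {item_id: 0 or 1}) per examinee.

    Returns p_upper - p_lower, where p_upper / p_lower are the proportions
    of the top and bottom score groups answering the item correctly.
    """
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    n = max(1, int(len(ranked) * group_frac))   # size of each score group
    upper, lower = ranked[:n], ranked[-n:]
    p_upper = sum(r[1][item] for r in upper) / n
    p_lower = sum(r[1][item] for r in lower) / n
    return p_upper - p_lower   # -1 (flawed item) .. +1 (ideal discriminator)

# Toy cohort of six examinees scored on a single item "q1"
cohort = [
    (95, {"q1": 1}), (90, {"q1": 1}), (80, {"q1": 1}),
    (60, {"q1": 0}), (50, {"q1": 1}), (40, {"q1": 0}),
]
print(discrimination_index(cohort, "q1"))  # top scorer correct, bottom wrong -> 1.0
```

Under this formulation, an item with a negative index is one that stronger examinees answer incorrectly more often than weaker ones, which is the rationale for removing such questions and re-running the analysis, as the authors describe.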