Alan Schwartz1, Dorene F Balmer2, Emily Borman-Shoap3, Alan Chin4, Duncan Henry5, Bruce E Herman6, Patricia Hobday7, James H Lee8, Sara Multerer9, Ross E Myers10, Keith Ponitz11, Adam Rosenberg12, Jennifer B Soep13, Daniel C West14, Robert Englander15.
1. A. Schwartz is the Michael Reese Endowed Professor of Medical Education and research professor, pediatrics, University of Illinois College of Medicine, and network director, Association of Pediatric Program Directors (APPD) Longitudinal Educational Assessment Research Network (LEARN), Chicago, Illinois.
2. D.F. Balmer is associate professor, pediatrics, Children's Hospital of Philadelphia and University of Pennsylvania, Philadelphia, Pennsylvania.
3. E. Borman-Shoap is vice chair of education, pediatric residency program director, and assistant professor, pediatrics, University of Minnesota, Minneapolis, Minnesota.
4. A. Chin is assistant clinical professor, pediatrics, University of California, Los Angeles, Mattel Children's Hospital, Los Angeles, California.
5. D. Henry is associate program director for assessment and clinical competency committee chair, pediatric residency, University of California, San Francisco, San Francisco, California.
6. B.E. Herman is professor and vice chair of education and residency programs, pediatrics, University of Utah School of Medicine, Salt Lake City, Utah.
7. P. Hobday is assistant professor, pediatrics, University of Minnesota Medical School, Minneapolis, Minnesota.
8. J.H. Lee is associate director, pediatrics residency program, and assistant professor, pediatrics, David Geffen School of Medicine at University of California, Los Angeles, Los Angeles, California.
9. S. Multerer is director, pediatric residency program, and associate professor, pediatric hospital medicine, University of Louisville Department of Pediatrics and Norton Children's Hospital, Louisville, Kentucky.
10. R.E. Myers is associate director, pediatric residency program, and associate professor, pediatrics, Rainbow Babies and Children's Hospital and Case Western Reserve University School of Medicine, Cleveland, Ohio.
11. K. Ponitz is director, pediatric residency training program, pediatrics, Rainbow Babies and Children's Hospital, Cleveland, Ohio.
12. A. Rosenberg is professor of pediatrics and program director, pediatric residency, University of Colorado School of Medicine, Children's Hospital Colorado, Aurora, Colorado.
13. J.B. Soep is associate professor, pediatrics, University of Colorado, Aurora, Colorado.
14. D.C. West is professor and associate chair for education, pediatrics, Children's Hospital of Philadelphia and Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania.
15. R. Englander is associate dean, undergraduate medical education, and professor, pediatrics, University of Minnesota Medical School, Minneapolis, Minnesota.
Abstract
PURPOSE: To evaluate response process validity evidence for clinical competency committee (CCC) assessments of first-year residents on a subset of General Pediatrics Entrustable Professional Activities (EPAs) and milestones, in the context of a national pilot of competency-based, time-variable (CBTV) advancement from undergraduate to graduate medical education. METHOD: Assessments of 2 EPAs and 8 milestones made by the trainees' actual CCCs and by 2 different blinded "virtual" CCCs were compared for 48 first-year pediatric residents at 4 residency programs between 2016 and 2018. Residents followed 1 of 3 training paths from medical school to residency: time-variable graduation at the same institution as their residency, time-fixed graduation at the same institution, or time-fixed graduation from a different institution. Assessments were compared using ordinal mixed-effects models. RESULTS: Actual CCCs assigned residents higher scores than virtual CCCs on milestones and on one EPA's supervision levels. Residents who graduated from a different institution than their residency received lower milestone ratings than either group from the same institution; compared with non-CBTV residents who completed medical school at the same institution, CBTV residents received higher ratings on one milestone (ICS4) and similar ratings on all others. CONCLUSIONS: First-year residents who graduated from CBTV medical school programs were assessed as having the same level of competence as residents who graduated from traditional programs, but the response process evidence suggests that CCC members may also draw on undocumented personal knowledge of the learner when reaching conclusions about resident competence.