Susanti R Ie1, Jessica L Ratcliffe1, Catalina Rubio2, Kermit S Zhang3, Katherine Shaver4, David W Musick5.
Abstract
BACKGROUND: Finding the ideal candidate for a residency/fellowship program has always been difficult, and finding the "perfect" match has always been the ultimate goal. However, many factors affect obtaining that "perfect" match. In the past, each attending physician would review around 20 to 50 Electronic Residency Application Service (ERAS) applications and rank them into three categories: high, middle, or low. Depending on the ranking, the applicant would be invited for an interview. After the interview, the applicants' files (ERAS and interview) would be reviewed and ranked by the faculty as a group. This process was time-consuming and fraught with subjectivity, with minimal objectivity. We therefore sought a way to assess and rank applicants more objectively and in less time. By creating a customizable scoring tool, we were able to screen applicants to our pulmonary/critical care fellowship program in an efficient and more objective manner.
Keywords: assessment in health professions education; education; education and training of medical students and doctors (specialist and phd); fellowship selection; medical education & training; pulmonary critical care; recruitment
Year: 2021 PMID: 34249546 PMCID: PMC8253231 DOI: 10.7759/cureus.15396
Source DB: PubMed Journal: Cureus ISSN: 2168-8184
Figure 1Pulmonary and Critical Care Fellowship Application Scoring Tool
This is the system we used to score the different elements in an applicant’s ERAS file. Applicants are given a score from 1 to 5 for each section.
Institutional Pulmonary and Critical Care Fellowship Scoring Rubric for Interview Consideration
This is the scoring rubric we used to match our fellows in 2018. Each component has a weighted score. Note that the weights assigned to the board exams increase with each subsequent examination, reflecting our program's belief that the later exams test clinical skills and thought processes more relevant to our discipline.
| Name of applicant | | | |
| Medical school and program graduation year | | | |
| Residency program | | | |
| US citizen or VISA | | | |
| Name of reviewer | | | |
| Criteria | Score (min 1–max 5) | Weight (100%) | Weighted score (score × weight) |
| USMLE or COMLEX scores | | 0.15 | |
| Step 1 | | 0.03 | |
| Step 2 | | 0.05 | |
| Step 3 | | 0.07 | |
| Letters of recommendation | | 0.10 | |
| Research projects | | 0.075 | |
| Publications | | 0.15 | |
| Personal statement | | 0.025 | |
| Chief resident/leadership | | 0.10 | |
| Quality of medical school | | 0.15 | |
| Quality of residency program | | 0.15 | |
| Other accomplishments/degrees, i.e., MBA, MPH | | 0.10 | |
| Cumulative raw score (max 60) | 0 | 1 | |
| Cumulative weighted score (max 5) | | | |
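The rubric's arithmetic can be sketched in a few lines: each criterion receives a raw score of 1 to 5, each raw score is multiplied by its weight, and the weighted scores are summed. Because the weights sum to 1.0, the cumulative weighted score has a maximum of 5. In this sketch, the USMLE/COMLEX parent weight (0.15) is represented by its three Step sub-weights; the weights come from the rubric above, while any example scores are hypothetical, not from the paper.

```python
# Sketch of the rubric's weighted-score arithmetic (Figure 1).
# Weights are from the rubric; the three Step sub-weights replace the
# aggregate USMLE/COMLEX weight (0.03 + 0.05 + 0.07 = 0.15).
WEIGHTS = {
    "Step 1": 0.03,
    "Step 2": 0.05,
    "Step 3": 0.07,
    "Letters of recommendation": 0.10,
    "Research projects": 0.075,
    "Publications": 0.15,
    "Personal statement": 0.025,
    "Chief resident/leadership": 0.10,
    "Quality of medical school": 0.15,
    "Quality of residency program": 0.15,
    "Other accomplishments/degrees": 0.10,
}

def cumulative_weighted_score(scores):
    """Each criterion is scored 1-5; weighted score = score x weight.
    The weights sum to 1.0, so the cumulative weighted score maxes at 5."""
    return sum(scores[criterion] * weight for criterion, weight in WEIGHTS.items())

# A hypothetical applicant scoring 5 on every criterion reaches the
# rubric's maximum cumulative weighted score of 5.0.
perfect = {criterion: 5 for criterion in WEIGHTS}
print(round(cumulative_weighted_score(perfect), 2))  # 5.0
```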
Sample New Rubric Rank List In Comparison to Submitted Rank List
An example of the New Rubric Rank List (ERAS score + interview score) compared to the final rank list that was submitted to the NRMP.
| ERAS Score | Interview Score | Total Score (40% ERAS Score + 60% Interview Score) | New Rubric Rank List | Final Rank List |
| 4.08 | 5 | 4.63 | 1 | 1 |
| 3.6 | 5 | 4.44 | 2 | 2 |
| 3.78 | 4.75 | 4.36 | 3 | 3 |
| 3.31 | 4.75 | 4.17 | 6 | 4 |
| 3.87 | 4.63 | 4.32 | 4 | 5 |
| 2.83 | 4.4 | 3.77 | 26 | 6 |
| 3.53 | 4.5 | 4.11 | 7 | 7 |
| 3.64 | 4.38 | 4.08 | 8 | 8 |
| 3.82 | 4.25 | 4.08 | 9 | 9 |
| 3.61 | 4.38 | 4.07 | 10 | 10 |
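The combined score behind the New Rubric Rank List is a simple weighted average: 40% of the ERAS score plus 60% of the interview score. A minimal sketch, using the first three rows of the table above as input:

```python
# Combined score for the New Rubric Rank List:
# total = 40% ERAS score + 60% interview score.
# The (ERAS, interview) pairs below are the first three rows of the table.
applicants = [
    (4.08, 5.00),
    (3.60, 5.00),
    (3.78, 4.75),
]

def total_score(eras, interview):
    """Weight the ERAS file 40% and the interview 60%."""
    return 0.4 * eras + 0.6 * interview

# Sorting by total score (highest first) reproduces the New Rubric
# Rank List order for these rows.
ranked = sorted(applicants, key=lambda pair: total_score(*pair), reverse=True)
for eras, interview in ranked:
    print(f"{eras:.2f}  {interview:.2f}  {total_score(eras, interview):.2f}")
```

For these rows, the computed totals (4.63, 4.44, 4.36) match the Total Score column in the table.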
Figure 2Comparison Between Applicant Scoring Rubrics in Correlation with Final NRMP Ranking
Correlation of the old and new scoring tools used by our program with the final rank list that was submitted to the NRMP between 2013 and 2018.