
Entrustable Professional Activities for Chinese Standardized Residency Training in Pediatric Intensive Care Medicine.

Zhang Yun1, Liu Jing2, Chen Junfei3, Zhang Wenjing4, Wu Jinxiang5, Yue Tong6, Zhang Aijun1.   

Abstract

Background: Entrustable professional activities (EPAs) were first introduced by Olle ten Cate in 2005. Since then, hundreds of applications in medical research have been reported worldwide. However, few studies discuss the use of EPAs for residency training in pediatric intensive care medicine. We conducted a pilot study of EPAs for pediatric intensive care medicine to evaluate their use in this subspecialty. Materials and
Methods: A cross-sectional study was implemented in pediatric intensive care medicine standardized residency training at the Qilu Hospital of Shandong University. An electronic survey assessing EPA performance, composed of 15 EPA categories rated on eight scales, was distributed among residents and directors.
Results: A total of 217 director-assessment and 44 residents' self-assessment questionnaires were collected, both demonstrating a rising trend in scores across postgraduate years. There were significant differences in PGY1-vs.-PGY2 and PGY1-vs.-PGY3 director-assessment scores, while there were no differences in PGY2-vs.-PGY3 scores. PGY had a significant effect on the score of each EPA, while position significantly affected the scores of all EPAs except for EPA1 (Admit a patient) and EPA2 (Select and interpret auxiliary examinations). Gender only significantly affected the scores of EPA6 (Report a case), EPA12 (Perform health education), and EPA13 (Inform bad news).
Conclusion: This study indicates that EPA assessments have a certain discriminating capability among different PGYs in Chinese standardized residency training in pediatric intensive care medicine. Postgraduate year, gender, and resident position affected EPA scores to a certain extent. Given the inconsistency between resident-assessed and director-assessed scores, an improved feedback program is needed in the future.
Copyright © 2022 Yun, Jing, Junfei, Wenjing, Jinxiang, Tong and Aijun.


Keywords:  Chinese; assessment and education; entrustable professional activities (EPA); pediatric intensive care medicine; standardized residency training (SRT)

Year:  2022        PMID: 35859946      PMCID: PMC9289143          DOI: 10.3389/fped.2022.919481

Source DB:  PubMed          Journal:  Front Pediatr        ISSN: 2296-2360            Impact factor:   3.569


Introduction

Entrustable professional activities (EPAs) were formally conceptualized in 2005 by Olle ten Cate, who defined EPAs as “units of professional practice, defined as tasks or responsibilities to be entrusted to the unsupervised execution by a trainee once he or she has attained sufficient specific competence” (1). The focus of competency-based medical education (CBME) in recent years has been on achieving EPAs, which at present are likely the most widespread approach to CBME worldwide (2–7). Supervising consultants need a valid and reliable tool for assessing a learner’s performance, one that helps both parties recognize the learner’s actual abilities and improve in time; EPAs appear to be an optimal choice. Despite the seemingly universal affinity for EPAs, there is limited empirical evidence for their use in trainee assessment and a paucity of feedback about their clinical implementation. It has become clear over the past decade that various EPA phenotypes exist worldwide (8–10). These phenotypes may vary based on how they define a stage of training or a profession. These differences may also reflect regulatory oversight in different regions, with some having a single regulatory body that facilitates alignment across the continuum and others having different regulatory bodies overseeing different phases of training and practice (7, 11–15). A previous study evaluated a formative assessment system based on EPAs in pediatric residency training at the Peking University First Hospital, proposing an EPA system to assess postgraduate medical education (PGME) that was made up of 15 EPA categories on eight scales (16, 17). This highlighted the complementary advantage of EPAs, which can be integrated with an ongoing CBME formative assessment program that includes mini-clinical-evaluation exercises (Mini-CEX), direct observation of procedural skills (DOPS), subjective-objective-assessment-plan (SOAP) notes, and 360-degree assessment.
Hence, in the last year, we began to implement an EPA assessment program at the Qilu Hospital of Shandong University based on the CBME system for standardized residency training at the Peking University First Hospital. Entrustable professional activities have been developed and published for a variety of pediatric subspecialties (18–23) as an emerging and practical tool for assessing trainees’ clinical competencies. Hennus et al. (19) reported a national modified Delphi study on developing a set of EPAs for Dutch pediatric intensive care medicine fellows, but few studies discuss EPAs for residency training in pediatric intensive care medicine. To preliminarily explore the effectiveness of EPAs and deficiencies in residency training, we therefore performed a pilot study of 44 residents within the Chinese standardized residency training program in the pediatric intensive care medicine department at the Qilu Hospital of Shandong University and solicited both resident self-assessments and director assessments of this training model.

Materials and Methods

Setting

Like many other Chinese standardized residency training programs, Qilu Hospital of Shandong University has a CBME evaluation course that spans the resident’s training after graduation from medical school and includes Mini-CEX, DOPS, SOAP, and 360-degree assessment. According to the national guidelines for standardized residency training, every pediatric resident is expected to rotate through the Pediatric Hematology, Urology, Neurology, Respiratory, Neonatology, Angiocardiopathy, Gastroenterology, Outpatient and Emergency, Infectious Diseases, and Child Healthcare subspecialties for at least 3 months within a 3-year training phase. A departmental rotation examination, composed of all of the aforementioned skill tests and formative assessments, is administered at the end of each subspecialty rotation. The directors in charge of pediatric intensive care medicine were pediatric intensive care unit physicians trained through the national or provincial director course for the Chinese standardized residency training program, holding qualification certifications from the Chinese Health Commission or the Shandong Provincial Health Commission.

Sample

This study enrolled 44 residents who were trained in pediatric intensive care medicine as part of a standardized residency training program from January 2021 to February 2022 at the Qilu Hospital of Shandong University. In total, seven directors in charge of pediatric intensive care medicine over the same time period were also recruited for this study. All the enrolled residents were categorized into postgraduate year 1 (PGY1) to PGY3 according to their seniority. The study was approved by the Qilu Hospital of Shandong University Institutional Review Board.

Procedure

Entrustable professional activity resident self-assessments and director assessments were used at the end of the pediatric intensive care department rotation to evaluate resident performance and competency from both points of view. An electronic questionnaire composed of 15 EPA categories rated on eight scales was administered to solicit both resident self-assessment and director assessment in addition to the ongoing evaluation program (Mini-CEX, DOPS, SOAP, and 360-degree assessment). Each resident was assessed by several directors, whereas each self-assessment was completed by the resident themselves. Each questionnaire included general information (director name, resident name, resident gender, seniority, and position, such as professional master, entrusted training resident from a junior hospital, resident on the permanent staff of the Qilu Hospital of Shandong University, or social training resident) and the EPA evaluation. The 15 categories of the EPA evaluation were established using the guidelines of the Peking University First Hospital (Table 1; 6). Based on the previous literature (16), each EPA was rated on eight scales (Table 2). No EPA assessment was performed until the participating residents or directors were well-informed about all of the details of the questionnaire. All the questionnaires were administered electronically via mobile software. Multiple reminders and phone follow-ups by data-collection staff ensured that all required responses were collected in time. Only questionnaires with every question completed were included.
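The questionnaire structure described above (general information plus one score per EPA category on the eight-point entrustment scale) can be sketched as a simple record type. This is an illustrative model only, not the authors' software; all names are hypothetical.

```python
from dataclasses import dataclass

EPA_CATEGORIES = 15          # Table 1: 15 EPA categories
SCALE_MIN, SCALE_MAX = 1, 8  # Table 2: eight entrustment levels

@dataclass
class EPAAssessment:
    """One completed questionnaire (self- or director-assessment)."""
    resident: str
    assessor: str   # equals `resident` for a self-assessment
    pgy: int        # postgraduate year, 1-3
    gender: str
    position: str   # e.g., "professional master", "permanent staff"
    scores: list    # one entrustment score per EPA category

    def __post_init__(self):
        # Mirror the paper's inclusion rule: every question completed.
        if len(self.scores) != EPA_CATEGORIES:
            raise ValueError(f"expected {EPA_CATEGORIES} EPA scores")
        # Each score must lie on the 1-8 entrustment scale of Table 2.
        if any(not (SCALE_MIN <= s <= SCALE_MAX) for s in self.scores):
            raise ValueError("each score must be on the 1-8 entrustment scale")
```

A record is a self-assessment exactly when `assessor == resident`, which matches the study design in which each resident is rated by several directors plus once by themselves.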
TABLE 1

Entrustable professional activities (EPAs) categories.

Number    Category
1         Admit a patient
2         Select and interpret auxiliary examinations
3         Diagnose and make the differential diagnosis
4         Make therapeutic decision
5         Compose medical documents
6         Report a case
7         Recognize and manage general clinical conditions
8         Recognize and manage emergent and critical conditions
9         Transfer and hand over a patient
10        Perform informed consent
11        Perform basic operation
12        Perform health education
13        Inform bad news
14        Perform clinical education
15        Manage public health events
TABLE 2

Eight entrustable levels of each entrustable professional activity (EPA).

Scale    Details
1        Cannot perform certain professional activities as a resident, even under the direct supervision of a superior physician
2        Perform certain professional activities together with a superior physician
3        Perform certain professional activities under the supervision and guidance of a superior physician
4        Perform certain professional activities without the presence of a superior physician; when help is needed, a superior physician must be present to recheck all performance
5        Perform certain professional activities without the presence of a superior physician; when help is needed, a superior physician must be present to recheck important performance
6        Perform certain professional activities without the presence of a superior physician; when help is needed, guidance and rechecking by a superior physician over the phone suffice
7        Perform certain professional activities without the need for supervision or guidance from a superior physician
8        Can provide supervision and guidance for others in certain professional activities

Statistical Analysis

All the questionnaires were administered using the Wenjuanwang APP 2.7.0 (Zhongyan Network Technology Co., Ltd., Shanghai, China). Data collection was performed using Excel (Microsoft, Redmond, WA, United States), and statistical analysis and figure creation were performed using SPSS 23.0.0 (IBM, Armonk, NY, United States). Comparisons of self-assessments and director assessments for every EPA across the different PGYs were analyzed using the Kruskal–Wallis test, with a two-sided p < 0.05 considered statistically significant. Comparisons between every two PGY levels were analyzed using the Mann–Whitney U test, with significance defined as p < 0.017 after Bonferroni correction (0.05/3) for the three Mann–Whitney U tests performed on the same EPA. The effects of PGY, gender, and position on the director-assessment EPA scores were analyzed using a generalized estimating equation (GEE) model, with p < 0.05 considered statistically significant.
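The comparison scheme described above (an omnibus Kruskal–Wallis test across PGYs, followed by pairwise Mann–Whitney U tests at a Bonferroni-corrected threshold of 0.05/3 ≈ 0.017) can be sketched with SciPy. This is a minimal sketch under assumed synthetic data; the function name and data layout are hypothetical, not taken from the paper.

```python
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

def compare_epa_scores(scores_by_pgy, alpha=0.05):
    """Omnibus Kruskal-Wallis test over all PGY groups, then pairwise
    Mann-Whitney U tests at a Bonferroni-corrected threshold."""
    groups = list(scores_by_pgy)
    # Omnibus test across PGY1/PGY2/PGY3 for one EPA.
    _, p_overall = kruskal(*scores_by_pgy.values())
    pairs = list(combinations(groups, 2))
    # Three pairwise tests per EPA -> 0.05 / 3 ~= 0.017, as in the paper.
    threshold = alpha / len(pairs)
    pairwise = {}
    for a, b in pairs:
        _, p = mannwhitneyu(scores_by_pgy[a], scores_by_pgy[b],
                            alternative="two-sided")
        pairwise[(a, b)] = (p, p < threshold)
    return p_overall, threshold, pairwise

# Synthetic example scores on the 1-8 entrustment scale.
example = {
    "PGY1": [5, 6, 5, 7, 6, 5, 4, 6],
    "PGY2": [6, 7, 7, 6, 7, 8, 6, 7],
    "PGY3": [7, 7, 8, 6, 7, 8, 8, 7],
}
p_overall, threshold, pairwise = compare_epa_scores(example)
```

Each pairwise entry carries both the raw p-value and whether it clears the corrected threshold, matching the paper's significance rule for PGY1-vs.-PGY2, PGY1-vs.-PGY3, and PGY2-vs.-PGY3 comparisons.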

Results

General Information

This study recruited 44 residents (Table 3) and seven directors. The collected results included 44 resident self-assessment questionnaires and 217 director-assessment questionnaires, with a 100% response rate. The numbers of director-assessment and self-assessment questionnaires are listed in Table 3. A line graph shows the trend in director-assessment and self-assessment EPA scores over progressive PGY levels (Figure 1). A slowly rising trend in director-assessment scores across all EPAs by PGY was noted, while self-assessment scores showed no distinctive trend across the different PGYs.
TABLE 3

Characteristics of residents.

Characteristics                                PGY1         PGY2         PGY3         p-value
Number of residents, n (%)                     8 (18.2%)    22 (50.0%)   14 (31.8%)
Male, n (%)                                    2 (25.0%)    4 (18.2%)    2 (14.3%)    0.82
Number of director-assessments, mean ± SD      4.6 ± 1.5    4.7 ± 1.2    5.1 ± 0.5    0.53
FIGURE 1

Line graph of director-assessment and self-assessment scores for each entrustable professional activity (EPA). Each point represents the mean score of a given subgroup, with bars representing the 95% CI of the mean.


Comparison of Director-Assessment Scores Across Different Postgraduate Years

Director-assessment EPA scores are listed in Table 4. There were significant differences in EPA scores across the different PGYs, with residents in higher PGYs receiving higher scores. In pairwise comparisons, there were significant differences between PGY1 and PGY2 and between PGY1 and PGY3 (p < 0.017), whereas there were no obvious differences between PGY2 and PGY3 in any EPA category.
TABLE 4

Scores of director-assessment in different postgraduate years (PGYs).

EPAs     PGY1           PGY2         PGY3           Chi-square*   p-value
EPA1     5.8 ± 1.3**    6.7 ± 0.8    6.8 ± 0.9***   21.293        0.000
EPA2     5.7 ± 1.3**    6.6 ± 0.8    6.8 ± 0.8***   22.256        0.000
EPA3     5.8 ± 1.2**    6.6 ± 0.8    6.7 ± 0.7***   18.764        0.000
EPA4     5.6 ± 1.2**    6.4 ± 0.8    6.5 ± 0.9***   18.428        0.000
EPA5     6.0 ± 1.1**    6.8 ± 0.7    7.1 ± 0.7***   26.320        0.000
EPA6     6.0 ± 1.1**    6.8 ± 0.7    6.9 ± 0.8***   22.902        0.000
EPA7     5.8 ± 1.3**    6.6 ± 0.8    6.7 ± 1.0***   15.798        0.000
EPA8     5.5 ± 1.2**    6.1 ± 0.9    6.3 ± 0.8***   13.614        0.001
EPA9     6.1 ± 1.2**    6.8 ± 0.8    6.9 ± 0.8***   12.933        0.002
EPA10    6.4 ± 1.2**    7.1 ± 0.8    7.2 ± 0.7***   13.510        0.001
EPA11    5.8 ± 1.4**    6.7 ± 0.8    6.8 ± 0.9***   17.453        0.000
EPA12    6.2 ± 1.0**    6.8 ± 0.7    7.0 ± 0.6***   23.717        0.000
EPA13    6.0 ± 1.0**    6.6 ± 0.8    6.6 ± 0.7***   13.989        0.001
EPA14    5.4 ± 1.5**    6.3 ± 0.9    6.2 ± 0.9***   8.881         0.012
EPA15    6.1 ± 1.2**    6.8 ± 0.9    6.8 ± 0.8***   11.238        0.004

*Kruskal–Wallis test. **Mann–Whitney U test revealed a significant difference between PGY1 and PGY2. ***Mann–Whitney U test revealed a significant difference between PGY1 and PGY3.


Effect Analysis of Postgraduate Year, Gender, and Position on Director-Assessment Scores

Given that resident PGY, gender, and position could all affect director-assessed EPA scores (Table 5), a GEE model analysis was performed to analyze the effect of these factors on EPA scores (Table 6). PGY had a significant effect on all EPA scores (p < 0.05), whereas resident position significantly affected every EPA score except for EPA1 (p = 0.714) and EPA2 (p = 0.076). Resident gender significantly affected only EPA6 (p = 0.002), EPA12 (p = 0.010), and EPA13 (p = 0.018) (Table 6).
TABLE 5

Categorical variable information of director-assessment questionnaires.

Factor      Subgroup                        N      Percent
PGY         PGY1                            37     17.1%
            PGY2                            106    65.9%
            PGY3                            74     34.1%
Gender      Female                          177    81.6%
            Male                            40     18.4%
Position    Professional master             118    54.4%
            Entrusted training residents    82     37.8%
            Permanent staff                 8      3.7%
            Social training residents       9      4.1%
TABLE 6

Generalized estimated equation analysis of director-assessment questionnaires.

Values are presented as B (95% Wald confidence interval), Wald chi-square, p-value. The test of model effects (Wald chi-square, p-value) follows each factor name; 0a marks the reference category.

EPA1 — PGY (11.827, 0.003): PGY1 0a; PGY2 0.794 (0.270 to 1.318), 8.816, 0.003; PGY3 0.988 (0.425 to 1.551), 11.820, 0.001
EPA1 — Gender (1.814, 0.178): Male 0a; Female –0.278 (–0.684 to 0.127), 1.814, 0.178
EPA1 — Position (1.363, 0.714): Professional master 0a; Entrusted training residents 0.168 (–0.131 to 0.460), 1.194, 0.275; Permanent staff 0.091 (–0.164 to 0.346), 0.486, 0.484; Social training residents 0.129 (–0.199 to 0.457), 0.594, 0.441
EPA2 — PGY (18.158, 0.000): PGY1 0a; PGY2 0.712 (0.214 to 1.211), 7.855, 0.005; PGY3 0.983 (0.485 to 1.481), 14.987, 0.000
EPA2 — Gender (2.395, 0.122): Male 0a; Female –0.274 (–0.621 to 0.073), 2.395, 0.122
EPA2 — Position (6.870, 0.076): Professional master 0a; Entrusted training residents 0.237 (0.021 to 0.609), 4.409, 0.036; Permanent staff 0.154 (–0.054 to 0.362), 2.115, 0.146; Social training residents 0.315 (0.019 to 0.455), 4.557, 0.033
EPA3 — PGY (20.128, 0.000): PGY1 0a; PGY2 0.541 (0.143 to 0.939), 7.092, 0.008; PGY3 0.812 (0.419 to 1.206), 16.392, 0.000
EPA3 — Gender (2.637, 0.104): Male 0a; Female –0.247 (–0.545 to 0.051), 2.637, 0.104
EPA3 — Position (13.083, 0.004): Professional master 0a; Entrusted training residents 0.350 (0.149 to 0.551), 11.628, 0.001; Permanent staff 0.334 (0.130 to 0.538), 10.325, 0.001; Social training residents 0.263 (–0.051 to 0.578), 2.689, 0.101
EPA4 — PGY (15.347, 0.000): PGY1 0a; PGY2 0.695 (0.238 to 1.152), 8.887, 0.003; PGY3 0.891 (0.442 to 1.339), 15.135, 0.000
EPA4 — Gender (1.219, 0.270): Male 0a; Female –0.211 (–0.586 to 0.164), 1.219, 0.270
EPA4 — Position (59.347, 0.000): Professional master 0a; Entrusted training residents 0.144 (–0.127 to 0.415), 1.088, 0.003; Permanent staff 0.042 (–0.234 to 0.318), 0.090, 0.796; Social training residents 0.446 (0.151 to 0.741), 8.788, 0.003
EPA5 — PGY (17.812, 0.000): PGY1 0a; PGY2 0.681 (0.263 to 1.100), 10.167, 0.001; PGY3 0.981 (0.514 to 1.448), 16.947, 0.000
EPA5 — Gender (2.899, 0.089): Male 0a; Female –0.261 (–0.561 to 0.039), 2.899, 0.089
EPA5 — Position (192.621, 0.000): Professional master 0a; Entrusted training residents 0.180 (–0.040 to 0.400), 2.579, 0.108; Permanent staff 0.360 (0.188 to 0.531), 16.929, 0.000; Social training residents 0.137 (–0.038 to 0.311), 2.356, 0.125
EPA6 — PGY (30.904, 0.000): PGY1 0a; PGY2 0.723 (0.423 to 1.023), 22.370, 0.000; PGY3 0.884 (0.567 to 1.200), 29.940, 0.000
EPA6 — Gender (9.606, 0.002): Male 0a; Female –0.394 (–0.644 to –0.145), 9.606, 0.002
EPA6 — Position (19.319, 0.000): Professional master 0a; Entrusted training residents 0.146 (0.149 to 0.551), 11.628, 0.199; Permanent staff 0.381 (0.185 to 0.577), 14.525, 0.000; Social training residents 0.279 (0.047 to 0.511), 5.568, 0.018
EPA7 — PGY (18.023, 0.000): PGY1 0a; PGY2 0.668 (0.306 to 1.029), 13.120, 0.000; PGY3 0.802 (0.431 to 1.173), 17.928, 0.000
EPA7 — Gender (1.858, 0.173): Male 0a; Female –0.224 (–0.546 to 0.098), 1.858, 0.173
EPA7 — Position (10.106, 0.018): Professional master 0a; Entrusted training residents 0.233 (0.038 to 0.428), 5.466, 0.019; Permanent staff 0.232 (0.079 to 0.385), 8.824, 0.003; Social training residents 0.132 (–0.138 to 0.403), 0.918, 0.338
EPA8 — PGY (10.960, 0.004): PGY1 0a; PGY2 0.449 (0.067 to 0.830), 5.310, 0.021; PGY3 0.640 (0.256 to 1.023), 10.683, 0.001
EPA8 — Gender (2.118, 0.146): Male 0a; Female –0.231 (–0.543 to 0.080), 2.118, 0.146
EPA8 — Position (410.269, 0.000): Professional master 0a; Entrusted training residents 0.288 (0.037 to 0.539), 5.056, 0.025; Permanent staff 0.234 (–0.002 to 0.470), 3.781, 0.052; Social training residents 0.039 (–0.198 to 0.276), 0.103, 0.748
EPA9 — PGY (13.064, 0.001): PGY1 0a; PGY2 0.536 (0.202 to 0.869), 9.893, 0.002; PGY3 0.625 (0.284 to 0.966), 12.909, 0.000
EPA9 — Gender (1.523, 0.217): Male 0a; Female –0.190 (–0.492 to 0.112), 1.523, 0.217
EPA9 — Position (13.392, 0.004): Professional master 0a; Entrusted training residents 0.350 (0.113 to 0.550), 8.833, 0.003; Permanent staff 0.027 (–0.149 to 0.202), 0.089, 0.765; Social training residents 0.178 (–0.206 to 0.563), 0.828, 0.363
EPA10 — PGY (12.457, 0.002): PGY1 0a; PGY2 0.526 (0.146 to 0.907), 7.363, 0.007; PGY3 0.672 (0.295 to 1.050), 12.181, 0.000
EPA10 — Gender (3.703, 0.054): Male 0a; Female –0.299 (–0.604 to 0.006), 3.703, 0.054
EPA10 — Position (12.114, 0.002): Professional master 0a; Entrusted training residents 0.370 (0.149 to 0.551), 11.628, 0.001; Permanent staff 0.424 (0.221 to 0.626), 16.861, 0.000; Social training residents 0.168 (–0.034 to 0.370), 2.645, 0.104
EPA11 — PGY (17.518, 0.000): PGY1 0a; PGY2 0.787 (0.384 to 1.191), 14.629, 0.000; PGY3 0.900 (0.478 to 1.322), 17.489, 0.000
EPA11 — Gender (0.922, 0.337): Male 0a; Female –0.172 (–0.523 to 0.179), 0.922, 0.337
EPA11 — Position (11.541, 0.009): Professional master 0a; Entrusted training residents 0.277 (0.078 to 0.476), 7.466, 0.006; Permanent staff 0.250 (0.099 to 0.400), 10.608, 0.001; Social training residents 0.242 (–0.130 to 0.615), 1.624, 0.203
EPA12 — PGY (18.068, 0.000): PGY1 0a; PGY2 0.474 (0.141 to 0.807), 7.797, 0.005; PGY3 0.700 (0.363 to 1.037), 16.602, 0.000
EPA12 — Gender (6.576, 0.010): Male 0a; Female –0.346 (–0.610 to –0.082), 6.576, 0.010
EPA12 — Position (487.267, 0.000): Professional master 0a; Entrusted training residents 0.318 (0.124 to 0.513), 10.283, 0.001; Permanent staff 0.698 (0.519 to 0.876), 58.793, 0.000; Social training residents –0.014 (–0.205 to 0.177), 0.020, 0.888
EPA13 — PGY (10.742, 0.005): PGY1 0a; PGY2 0.428 (0.089 to 0.768), 6.118, 0.013; PGY3 0.555 (0.223 to 0.888), 10.707, 0.001
EPA13 — Gender (5.631, 0.018): Male 0a; Female –0.463 (–0.845 to –0.081), 5.631, 0.018
EPA13 — Position (102.285, 0.000): Professional master 0a; Entrusted training residents 0.277 (0.149 to 0.551), 11.628, 0.020; Permanent staff 0.809 (0.130 to 0.538), 10.325, 0.000; Social training residents 0.128 (–0.179 to 0.436), 0.669, 0.414
EPA14 — PGY (11.813, 0.003): PGY1 0a; PGY2 0.753 (0.318 to 1.188), 11.516, 0.001; PGY3 0.693 (0.271 to 1.116), 10.337, 0.001
EPA14 — Gender (0.486, 0.486): Male 0a; Female –0.146 (–0.556 to 0.264), 0.486, 0.486
EPA14 — Position (26.481, 0.000): Professional master 0a; Entrusted training residents 0.136 (–0.618 to 0.429), 0.126, 0.278; Permanent staff 0.431 (0.227 to 0.634), 17.161, 0.000; Social training residents –0.095 (–0.051 to 0.578), 2.689, 0.723
EPA15 — PGY (13.682, 0.001): PGY1 0a; PGY2 0.695 (0.327 to 1.063), 13.679, 0.008; PGY3 0.634 (0.252 to 1.015), 10.612, 0.000
EPA15 — Gender (1.099, 0.295): Male 0a; Female –0.191 (–0.548 to 0.166), 1.099, 0.295
EPA15 — Position (30.171, 0.000): Professional master 0a; Entrusted training residents 0.152 (0.149 to 0.551), 11.628, 0.146; Permanent staff –0.235 (–0.377 to –0.093), 10.494, 0.001; Social training residents –0.311 (–0.462 to –0.161), 16.398, 0.000

The scores of all 15 EPA categories rose as PGY increased except for EPA14 (Perform clinical education; set PGY1 as zero; PGY2: B = 0.753, p = 0.001; PGY3: B = 0.693, p = 0.001) and EPA15 (Manage public health events; PGY2: B = 0.695, p = 0.000; PGY3: B = 0.634, p = 0.001), for which PGY2s had higher mean scores than PGY3s and PGY1s had the lowest. The mean scores of male residents in EPA6 (Report a case; set male as zero; female: B = −0.394, p = 0.002), EPA12 (Perform health education; female: B = −0.346, p = 0.010), and EPA13 (Inform bad news; female: B = −0.463, p = 0.018) were higher than those of female residents. Entrusted training residents had the highest scores in EPA3 (Diagnose and make the differential diagnosis; set professional master as zero; B = 0.350, p = 0.001), EPA7 (Recognize and manage general clinical conditions; B = 0.233, p = 0.019), EPA8 (Recognize and manage emergent and critical conditions; B = 0.288, p = 0.025), EPA9 (Transfer and hand over a patient; B = 0.332, p = 0.003), and EPA11 (Perform basic operation; B = 0.277, p = 0.006), while permanent staff ranked highest in EPA5 (Compose medical documents; set professional master as zero; B = 0.360, p = 0.000), EPA6 (Report a case; B = 0.381, p = 0.000), EPA10 (Perform informed consent; B = 0.424, p = 0.000), EPA12 (Perform health education; B = 0.698, p = 0.000), EPA13 (Inform bad news; B = 0.809, p = 0.000), and EPA14 (Perform clinical education; B = 0.431, p = 0.000). Social training residents were the top subgroup in EPA2 (Select and interpret auxiliary examinations; set professional master as zero; B = 0.315, p = 0.036) and EPA4 (Make therapeutic decision; B = 0.446, p = 0.003), while professional masters performed best in EPA15 (Manage public health events, p < 0.05).

Comparison of Self-Assessment Scales Across Different Postgraduate Years

Self-assessment EPA scores are listed in Table 7. There were significant differences only within EPA2 (Select and interpret auxiliary examinations), EPA3 (Diagnose and make the differential diagnosis), EPA4 (Make therapeutic decision), EPA8 (Recognize and manage emergent and critical conditions), EPA9 (Transfer and hand over a patient), EPA14 (Perform clinical education), and EPA15 (Manage public health events) across the different PGYs, with higher-level PGY residents scoring better; there were no obvious differences in the other EPAs across the different PGYs. In pairwise comparisons between PGY levels, there were significant differences in EPA15 (Manage public health events) scores between PGY1 and PGY2 (p < 0.017), and in EPA3 (Diagnose and make the differential diagnosis), EPA4 (Make therapeutic decision), EPA8 (Recognize and manage emergent and critical conditions), EPA9 (Transfer and hand over a patient), and EPA14 (Perform clinical education) between PGY2 and PGY3 (p < 0.017). Significant differences in EPA3 (Diagnose and make the differential diagnosis), EPA4 (Make therapeutic decision), EPA8 (Recognize and manage emergent and critical conditions), and EPA15 (Manage public health events) were seen between PGY1 and PGY3 (p < 0.017).
TABLE 7

Scores of self-assessment in different postgraduate years (PGYs).

EPAs     PGY1           PGY2           PGY3           Chi-square*   p-value
EPA1     4.9 ± 2.0      5.6 ± 1.5      6.3 ± 1.0      4.474         0.107
EPA2     4.3 ± 2.1      5.2 ± 1.2      5.9 ± 1.1      6.449         0.040
EPA3     4.0 ± 2.1      5.0 ± 1.4**    5.9 ± 0.7***   9.610         0.008
EPA4     3.3 ± 2.3      4.4 ± 1.0**    5.6 ± 0.9***   14.779        0.001
EPA5     5.6 ± 1.8      5.7 ± 1.4      6.4 ± 0.9      2.367         0.306
EPA6     5.3 ± 2.1      5.4 ± 1.4      6.2 ± 1.2      2.727         0.256
EPA7     4.5 ± 2.0      4.8 ± 1.0      5.6 ± 1.1      5.764         0.056
EPA8     3.5 ± 2.0      4.1 ± 1.2**    5.6 ± 0.9***   13.582        0.001
EPA9     5.3 ± 1.9      4.9 ± 1.4**    6.1 ± 0.8      7.347         0.025
EPA10    5.5 ± 2.1      5.7 ± 1.5      6.2 ± 1.1      0.584         0.747
EPA11    5.0 ± 1.9      5.3 ± 1.3      6.1 ± 1.0      4.208         0.122
EPA12    5.6 ± 1.8      5.7 ± 1.3      5.9 ± 1.1      0.050         0.976
EPA13    4.4 ± 2.6      5.0 ± 1.1      5.4 ± 0.8      1.623         0.444
EPA14    3.9 ± 2.6      4.1 ± 1.2**    5.3 ± 1.2      6.614         0.037
EPA15    2.6 ± 1.6**    4.3 ± 1.5      4.9 ± 1.2***   10.802        0.005

*Kruskal–Wallis test.

**Mann–Whitney U test revealed a significant difference between PGY2 and PGY3.

***Mann–Whitney U test revealed a significant difference between PGY1 and PGY3.


Comparison of Entrustable Professional Activities Scores of Self-Assessments Between Genders

There was a significant difference in EPA8 (Recognize and manage emergent and critical conditions; p = 0.019) between the self-assessment scores of male and female residents, with male residents scoring themselves higher than female residents did (Figure 2).
FIGURE 2

Error bar chart of self-assessment between genders. The edges of each bar stand for the 95% CI of scores in subgroups. **EPA8: p = 0.019, p < 0.05.


Comparisons Between Director and Self-Assessment Scores Across Entrustable Professional Activities Within the Same Postgraduate Year

The director-assessment and self-assessment scores of PGY1s were mostly consistent, except for EPA2 (Select and interpret auxiliary examinations, p = 0.031), EPA3 (Diagnose and make the differential diagnosis, p = 0.012), EPA4 (Make therapeutic decision, p = 0.03), EPA7 (Recognize and manage general clinical conditions, p = 0.039), EPA8 (Recognize and manage emergent and critical conditions, p = 0.002), and EPA15 (Manage public health events, p = 0.000), where directors awarded higher scores. There were significant differences between the self-assessment and director-assessment scores of every EPA for PGY2s and PGY3s (PGY2: EPA1 p = 0.001, other EPAs p = 0.000; PGY3: EPA1 p = 0.036, EPA2 p = 0.003, EPA3 p = 0.000, EPA4 p = 0.002, EPA5 p = 0.012, EPA6 p = 0.034, EPA7 p = 0.001, EPA8 p = 0.008, EPA9 p = 0.002, EPA10 p = 0.001, EPA11 p = 0.014, EPA12 p = 0.000, EPA13 p = 0.000, EPA14 p = 0.009, EPA15 p = 0.000), with directors awarding higher scores for each EPA (Figure 3).
FIGURE 3

Comparison of director-assessment vs. self-assessment in entrustable professional activities (EPAs) within the same postgraduate year (PGY). The edges of each bar represent the 95% CI of the scores of each subgroup, and the point in the middle of each bar represents the subgroup mean.


Discussion

Since their initial introduction by Olle ten Cate (1) in 2005, EPAs have become an important part of CBME in undergraduate and postgraduate medical education settings (17, 19, 24). EPAs are designed to be real-life activities and, as such, can be understood and applied more easily than prior concepts within CBME, such as milestones (25). An EPA combines the knowledge, skills, and attitudes necessary to perform a task, incorporating and synthesizing learning objectives into a meaningful unit. EPAs provide a framework for making judgments of trainee ability explicit, which is important at all stages of medical education (26). In their literature search, Kerth et al. (27) reported a notable shift from descriptions of EPA development processes toward aspects beyond development, such as implementation, feasibility, acceptance/perception, and assessment. Of note, there are few studies on EPAs in pediatric postgraduate education, most of which come from general pediatric residencies or other subspecialties, such as pediatric emergency medicine, pediatric cardiology, and neonatology. Furthermore, studies from Asia are scarce. This study focused on the implementation and feasibility of EPAs in Chinese standardized residency training in pediatric intensive care medicine. Our study suggested that the director-assessment scores of residents in pediatric intensive care rose significantly for every EPA over postgraduate training, with significant differences between PGY1 and PGY2 and between PGY1 and PGY3 but not between PGY2 and PGY3. These findings were largely consistent with previous studies of residency training programs (28) and of fellows assessed with American Board of Pediatrics subspecialty EPAs (29). With respect to self-assessment scores, however, only a subset of EPA scores differed significantly across PGYs and between individual PGY years.
In the effect analysis of PGY, gender, and position on EPA scores, scores rose with PGY except for EPA14 and EPA15; gender significantly affected only EPA6, EPA12, and EPA13, with male residents scoring higher; and residents in different positions scored better in different EPAs. The male self-assessment scores in EPA8 (Recognize and manage emergent and critical conditions) were significantly higher than the female scores, while the other EPAs were equivalent between genders. When the self-assessment and director-assessment scores of PGY1s were compared, most of the director-assessed scores were significantly higher than the resident self-assessments. Furthermore, all of the director-assessment scores in every EPA category were significantly higher than the self-assessment scores of both PGY2s and PGY3s. This was a cross-sectional study in pediatric intensive care that evaluated the implementation and feasibility of EPAs within the formative assessment of the ongoing CBME for standardized residency training. The upward trend in director-assessed scores for each EPA over the pediatric intensive care rotation was significant. PGY1 residents were less capable of certain professional activities in pediatric intensive care than PGY2s and PGY3s, whereas no director-assessed score differed significantly between PGY2 and PGY3. This pronounced change in ability between PGY1 and PGY2/PGY3 may reflect rapid growth during the first year of training immediately after graduation, followed by only incremental development during the second and third years, when residents rotate widely across all the pediatric subspecialties rather than training continuously within one subspecialty. On the other hand, the relatively low scores of PGY1 residents point to insufficient exposure to professional activities during undergraduate education, before standardized training begins.
As stepped elevation is emphasized in the CBME program, residents are expected to develop their professional skills as training time increases (30, 31). In our cohort, however, director-assessed scores plateaued between PGY2 and PGY3 across all EPAs, suggesting that the curriculum for training residents in these areas requires notable improvement; directors and regulatory agencies should be encouraged to reinforce the idea of upgrading professional skills between the PGY2 and PGY3 years (32, 33). For most of the 15 EPA categories, a certain percentage of residents were able to practice the EPA unsupervised by the end of 3 years of residency training. However, for the remaining, as-yet-unqualified residents to reach unsupervised practice by the end of their required training, educators and regulatory agencies would need to implement EPA-based assessments more broadly or efficiently in pediatric subspecialties, as suggested previously (34). If we expect residents to meet the standards for unsupervised practice in all 15 EPA categories after training, either training needs to be enhanced significantly in these areas or our expectations of what residents must achieve by the completion of their training need to be adjusted. Future studies should determine whether similar experiences have been reported in other specialties. Given that PGY, gender, and position could each affect EPA scores, we used a generalized estimating equation (GEE) model for the correlation analysis. EPA scores rose significantly across PGY years except for EPA14 (Perform clinical education) and EPA15 (Manage public health events), with the highest scores noted among PGY2s. This suggests a lack of stepwise training between the second and third years of this standardized training program. After the first postgraduate year, individual aptitude may increasingly differentiate residents' abilities. 
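The between-year comparisons described above can be sketched with a simple resampling check. This is not the study's GEE analysis (which would additionally model gender and position and account for repeated measures); it is a self-contained permutation test on hypothetical director-assessment scores, assuming an 8-point scale:

```python
import random
from statistics import mean

def permutation_test(a, b, n_iter=5000, seed=0):
    """Two-sided permutation test on the difference in mean EPA scores
    between two PGY groups."""
    rng = random.Random(seed)
    observed = mean(b) - mean(a)
    pooled = a + b
    extreme = 0
    for _ in range(n_iter):
        # Reshuffle the pooled scores and re-split into two groups of the
        # original sizes; count how often the shuffled difference is at
        # least as extreme as the observed one.
        rng.shuffle(pooled)
        diff = mean(pooled[len(a):]) - mean(pooled[:len(a)])
        if abs(diff) >= abs(observed):
            extreme += 1
    return observed, extreme / n_iter

# Hypothetical director-assessment scores (8-point entrustment scale).
pgy1 = [3, 4, 3, 5, 4, 3, 4, 4]
pgy2 = [6, 5, 7, 6, 5, 6, 7, 6]

diff, p = permutation_test(pgy1, pgy2)
print(f"mean difference = {diff:.2f}, p = {p:.4f}")
```

With these made-up scores the PGY2 group clearly outscores PGY1, so the test returns a large mean difference and a small p-value, mirroring the significant PGY1-vs-PGY2 gap reported above.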
With respect to the gender gap, the director-assessed scores for EPA6 (Report a case), EPA12 (Perform health education), and EPA13 (Inform bad news) were significantly higher among male residents. Interviews with the enrolled directors suggested that the perceived advantages of male physicians in logical thinking and in professional image and credibility in the daily workplace may explain this result. Residents held four kinds of positions, which contributed to differences in EPA scores. Professional master's students had just graduated with their bachelor's degrees from medical school, while permanent staff mostly held doctoral degrees, which required a prolonged research period or deeper professional knowledge in certain fields. In contrast, entrusted training residents and social training residents were more experienced in clinical work, having usually worked for several years before entering standardized residency training, but generally had less formal education. These different backgrounds conferred different advantages in professional activities, which argues for personalized training plans that play to the strengths of residents in each position. Resident self-assessment scores were inconsistent with the directors' perceptions. Residents believed that they had developed significantly only in EPA2 (Select and interpret auxiliary examinations), EPA3 (Diagnose and make the differential diagnosis), EPA4 (Make therapeutic decision), EPA8 (Recognize and manage emergent and critical conditions), EPA9 (Transfer and hand over a patient), EPA14 (Perform clinical education), and EPA15 (Manage public health events) over their 3 years of standardized training. Self-assessment scores differed significantly between genders only in EPA8 (Recognize and manage emergent and critical conditions), with males scoring higher. 
This might stem from the male advantage in physical strength and adaptability to the heavy daily workload and the burden of the pediatric intensive care medicine rotation. Of note, only a limited number of male residents were enrolled in this study, which might contribute to the inconsistencies between director- and self-assessment scores; a larger cohort of residents is required to produce more reliable results. The director-assessment scores were higher than the PGY1 self-assessment scores in EPA2 (Select and interpret auxiliary examinations), EPA3 (Diagnose and make the differential diagnosis), EPA4 (Make therapeutic decision), EPA7 (Recognize and manage general clinical conditions), EPA8 (Recognize and manage emergent and critical conditions), and EPA15 (Manage public health events). Similar patterns were found in the PGY2 and PGY3 years across all EPA categories. This likely reflects a lack of self-confidence and self-recognition among the residents. It may also indicate a lack of efficient feedback from directors to residents, preventing trainees from understanding how they performed and what they needed to improve; more efficient feedback on EPAs is required. Our study has several strengths. It reported the implementation and feasibility of EPAs in the Chinese standardized training of pediatric intensive care residents. It established clear differences in EPA performance between lower-PGY and higher-PGY residents and provided a well-structured framework to guide residents in developing the knowledge, skills, and attitudes necessary to perform a task while incorporating and synthesizing learning objectives. We analyzed the effects of PGY, gender, and resident position on EPA scores, confirming that PGY and position correlated with EPA scores while gender had a more limited impact. The incongruity between director-assessed and self-assessed scores indicates the need for an efficient feedback program. 
There are also limitations to our study. First, the sample size was limited, precluding analysis of the reliability and validity of EPA implementation in pediatric intensive care medicine training; the sample was constrained by the resident training capacity of our hospital and the number of directors at our institution. How these skills translate into clinical practice and affect patient outcomes remains to be determined. Second, this was a cross-sectional study that enrolled residents trained in pediatric intensive care medicine within the last year, and no detailed clinical practice or patient outcomes were measured. Since EPAs have only recently been integrated into the ongoing CBME program in China, our experience with them is limited. A longitudinal study could validate these findings, and a multicenter longitudinal study would be of great value. In summary, this study indicates that EPA assessments have a certain discriminating capability between class years of Chinese standardized residency training in pediatric intensive care medicine, with scores rising by PGY year. Postgraduate year, gender, and resident position affected EPA scores to a certain extent. Given the incongruities between resident-assessed and director-assessed scores, an improved feedback program is needed.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

The studies involving human participants were reviewed and approved by Qilu Hospital of Shandong University Institutional Review Board. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

ZY initiated the study, participated in the design and coordination, did the basic statistical analysis, and drafted the manuscript. LJ did the majority of the statistical analysis. ZA helped to initiate the study and edit the manuscript. CJ, ZW, WJ, and YT helped to collect the original data and did the statistical analysis. All authors read and approved the final manuscript.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References (34 in total; first 10 shown)

1. Schultz K, Griffiths J. Implementing Competency-Based Medical Education in a Postgraduate Family Medicine Residency Training Program: A Stepwise Approach, Facilitating Factors, and Processes or Steps That Would Have Been Helpful. Acad Med. 2016.

2. Ten Cate O, Hart D, Ankel F, Busari J, Englander R, Glasgow N, Holmboe E, Iobst W, Lovell E, Snell LS, Touchie C, Van Melle E, Wycliffe-Jones K. Entrustment Decision Making in Clinical Training. Acad Med. 2016.

3. Peters H, Holzhausen Y, Boscardin C, Ten Cate O, Chen HC. Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Med Teach. 2017.

4. O'Dowd E, Lydon S, O'Connor P, Madden C, Byrne D. A systematic review of 7 years of research on entrustable professional activities in graduate medical education, 2011-2018. Med Educ. 2019.

5. McMillan JA, Land M, Leslie LK. Pediatric Residency Education and the Behavioral and Mental Health Crisis: A Call to Action. Pediatrics. 2017.

6. Peter NG, Forke CM, Ginsburg KR, Schwarz DF. Transition from pediatric to adult care: internists' perspectives. Pediatrics. 2009.

7. Taylor DR, Park YS, Smith CA, Karpinski J, Coke W, Tekian A. Creating Entrustable Professional Activities to Assess Internal Medicine Residents in Training: A Mixed-Methods Approach. Ann Intern Med. 2018.

8. Kerth JL, van Treel L, Bosse HM. The Use of Entrustable Professional Activities in Pediatric Postgraduate Medical Education: A Systematic Review. Acad Pediatr. 2021.

9. Schumacher DJ, West DC, Schwartz A, Li ST, Millstein L, Griego EC, Turner T, Herman BE, Englander R, Hemond J, Hudson V, Newhall L, McNeal Trice K, Baughn J, Giudice E, Famiglietti H, Tolentino J, Gifford K, Carraccio C. Longitudinal Assessment of Resident Performance Using Entrustable Professional Activities. JAMA Netw Open. 2020.

10. McCloskey CB, Domen RE, Conran RM, Hoffman RD, Post MD, Brissette MD, Gratzinger DA, Raciti PM, Cohen DA, Roberts CA, Rojiani AM, Kong CS, Peterson JEG, Johnson K, Plath S, Zein-Eldin Powell S. Entrustable Professional Activities for Pathology: Recommendations From the College of American Pathologists Graduate Medical Education Committee. Acad Pathol. 2017.
