Cynthia K Harris, Yigu Chen, Kristin C Jensen, Jason L Hornick, Claire Kilfoyle, Laura W Lamps, Yael K Heher.
Abstract
The United States and Canadian Academy of Pathology (USCAP) leadership undertook a high-level, global review of educational product outcomes data using high reliability organization (HRO) principles: preoccupation with failure; reluctance to simplify; sensitivity to operations; commitment to resilience; and deference to expertise. HRO principles have long been applied to fields such as aviation and nuclear power, and more recently to healthcare, yet they are rarely applied to the field that underpins these and many other complex systems: education. While errors in education are less calamitous than in air travel or healthcare delivery, USCAP's educational products reach over 15,000 learners a year and thus have important implications for the future practice of pathology. Here we report USCAP's experience using HRO principles to evaluate our keystone educational product, the "USCAP Short Course." Following this novel method of data review, USCAP leadership was better able to understand diverse learner needs based on practice venue, training level, and course topic. Unexpected lessons included the identification of specifically challenging educational topics, such as molecular pathology, and a need to focus more resources on emerging fields such as quality and patient safety. The results allow USCAP to assess educational product performance using HRO tools and provide strong data-driven decision support for future national pathology education strategy.
Keywords: Education; HRO; High reliability organization; Quality and safety; Quality improvement; USCAP
Year: 2022 PMID: 36061265 PMCID: PMC9429554 DOI: 10.1016/j.acpath.2022.100048
Source DB: PubMed Journal: Acad Pathol ISSN: 2374-2895
Survey demographics.
| | N | % |
|---|---|---|
| Total short courses | 58 | |
| Participant status | 1072 | |
| Practicing pathologist | 981 | 91.5% |
| Pathologist-in-training | 80 | 7.5% |
| Other | 11 | 1.0% |
| Primary practice venue | 1071 | |
| University/Medical school | 492 | 45.9% |
| Community practice | 402 | 37.5% |
| Independent laboratory | 96 | 9.0% |
| Commercial laboratory | 31 | 2.9% |
| Other | 50 | 4.7% |
Fig. 1 Results for USCAP annual meeting 2017 short course design, delivery, and outcomes, shown as boxplot and violin charts. The distributions of responses for Q3 to Q22 are displayed using a combination of boxplots and violin charts. Boxplots summarize the data distribution by marking the mean, median, and interquartile range. Violin charts offer more detail on probability density through their width and length (shape area) and are useful for revealing multimodality in the dataset. The shaded blue and orange areas represent the density estimate of the responses: the more answers that fall in a specific range, the larger the violin shape for that range. Solid black dots represent response means and dashed black lines represent medians.
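The boxplot-plus-violin visualization described in the Fig. 1 caption can be sketched as follows. This is a minimal illustration with synthetic 1–5 Likert responses whose proportions roughly echo the table below; the question labels and simulated data are assumptions, not the study's actual responses.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Simulate skewed 1-5 Likert responses for three hypothetical questions.
questions = ["Q03", "Q04", "Q05"]
data = [rng.choice([1, 2, 3, 4, 5], size=1000,
                   p=[0.002, 0.005, 0.02, 0.43, 0.543])
        for _ in questions]

fig, ax = plt.subplots(figsize=(6, 4))
# Violin bodies show the probability density of responses at each value...
ax.violinplot(data, showextrema=False)
# ...while boxplots overlay the median, IQR, and mean for the same data.
ax.boxplot(data, positions=list(range(1, len(data) + 1)),
           showmeans=True, widths=0.15)
ax.set_xticks(list(range(1, len(data) + 1)))
ax.set_xticklabels(questions)
ax.set_ylabel("Likert response (1 = Strongly Disagree, 5 = Strongly Agree)")
plt.close(fig)
```

Overlaying both chart types this way mirrors the paper's rationale: the boxplot gives the summary statistics while the violin reveals whether responses cluster at one value or split into multiple modes.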
Results for overall course design, delivery, and outcomes.
| | Strongly Disagree = 1 | % | Disagree = 2 | % | Neutral = 3 | % | Agree = 4 | % | Strongly Agree = 5 | % | N | Mean | Median | Std |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Overall (Q03–Q11) |  |  |  |  |  |  |  |  |  |  |  | 4.50 | 5 | 0.60 |
| 03. Content was adequately described | 2 | 0.2% | 5 | 0.5% | 19 | 1.8% | 457 | 42.8% | 584 | 54.7% | 1067 | 4.51 | 5 | 0.58 |
| 04. Target audience was clearly defined | 1 | 0.1% | 2 | 0.2% | 29 | 2.7% | 465 | 43.6% | 570 | 53.4% | 1067 | 4.50 | 5 | 0.57 |
| 05. Learning objectives were clearly stated | 1 | 0.1% | 2 | 0.2% | 29 | 2.7% | 452 | 42.4% | 583 | 54.6% | 1067 | 4.51 | 5 | 0.57 |
| 06. Learning objectives were appropriate | 1 | 0.1% | 4 | 0.4% | 20 | 1.9% | 456 | 42.7% | 587 | 55.0% | 1068 | 4.52 | 5 | 0.57 |
| 07. Content was current and evidence-based | 2 | 0.2% | 2 | 0.2% | 25 | 2.3% | 440 | 41.2% | 598 | 56.0% | 1067 | 4.53 | 5 | 0.57 |
| 08. Syllabus was comprehensive and well-organized | 2 | 0.2% | 6 | 0.6% | 53 | 5.1% | 422 | 40.7% | 555 | 53.5% | 1038 | 4.47 | 5 | 0.64 |
| 09. I was encouraged to evaluate this session | 1 | 0.1% | 3 | 0.3% | 54 | 5.2% | 455 | 43.5% | 534 | 51.0% | 1047 | 4.45 | 5 | 0.62 |
| 10. Pace of this session was appropriate | 2 | 0.2% | 10 | 0.9% | 29 | 2.7% | 464 | 43.5% | 561 | 52.6% | 1066 | 4.47 | 5 | 0.62 |
| 11. Met the stated educational objectives | 1 | 0.1% | 8 | 0.7% | 36 | 3.4% | 446 | 41.8% | 577 | 54.0% | 1068 | 4.49 | 5 | 0.61 |
| Overall (Q12–Q22) |  |  |  |  |  |  |  |  |  |  |  | 4.35 | 4 | 0.70 |
| 12. Enhanced my current knowledge base | 1 | 0.1% | 13 | 1.2% | 30 | 2.8% | 453 | 42.5% | 570 | 53.4% | 1067 | 4.48 | 5 | 0.62 |
| 13. Helped me create/revise protocols, policies, and/or procedures | 3 | 0.3% | 14 | 1.3% | 122 | 11.7% | 447 | 42.9% | 457 | 43.8% | 1043 | 4.29 | 4 | 0.74 |
| 14. Addressed professional “practice gaps” in my knowledge base | 2 | 0.2% | 9 | 0.8% | 84 | 7.9% | 490 | 46.2% | 476 | 44.9% | 1061 | 4.35 | 4 | 0.68 |
| 15. Provided me with information/knowledge to close the practice gap | 2 | 0.2% | 11 | 1.0% | 76 | 7.2% | 494 | 46.6% | 476 | 44.9% | 1059 | 4.35 | 4 | 0.68 |
| 16. Learned a skill that I need to improve/change my practice | 2 | 0.2% | 14 | 1.3% | 104 | 9.8% | 480 | 45.3% | 460 | 43.4% | 1060 | 4.30 | 4 | 0.71 |
| 17. High-quality course that I would recommend to colleagues | 4 | 0.4% | 16 | 1.5% | 56 | 5.3% | 452 | 42.5% | 535 | 50.3% | 1063 | 4.41 | 5 | 0.70 |
| 18. Contributed to enhanced competence as a health care provider | 2 | 0.2% | 9 | 0.8% | 58 | 5.5% | 490 | 46.3% | 500 | 47.2% | 1059 | 4.39 | 4 | 0.65 |
| 19. I feel I am a better pathologist | 2 | 0.2% | 13 | 1.2% | 73 | 6.9% | 486 | 46.0% | 483 | 45.7% | 1057 | 4.36 | 4 | 0.68 |
| 20. Provided knowledge, strategies, and skills to improve efficiency | 2 | 0.2% | 19 | 1.8% | 109 | 10.4% | 457 | 43.6% | 461 | 44.0% | 1048 | 4.29 | 4 | 0.74 |
| 21. To improve safety in my practice | 2 | 0.2% | 19 | 1.8% | 124 | 11.9% | 436 | 41.9% | 460 | 44.2% | 1041 | 4.28 | 4 | 0.76 |
| 22. To improve patient outcomes and satisfaction | 1 | 0.1% | 13 | 1.2% | 85 | 8.1% | 474 | 45.0% | 480 | 45.6% | 1053 | 4.35 | 4 | 0.69 |
Results for course design, delivery, and outcomes grouped by participant role.
| | Practicing pathologist |  |  | Pathologist-in-training |  |  | p-value | Other |  |  |
|---|---|---|---|---|---|---|---|---|---|---|
| | Mean | Std | N | Mean | Std | N |  | Mean | Std | N |
| 03. Content was adequately described | 4.51 | 0.59 | 975 | 4.58 | 0.52 | 80 | 0.35 | 4.36 | 0.50 | 11 |
| 04. Target audience was clearly defined | 4.50 | 0.57 | 975 | 4.54 | 0.59 | 80 | 0.57 | 4.36 | 0.50 | 11 |
| 05. Learning objectives were clearly stated | 4.51 | 0.57 | 975 | 4.51 | 0.60 | 80 | 0.97 | 4.36 | 0.50 | 11 |
| 06. Learning objectives were appropriate | 4.52 | 0.57 | 976 | 4.59 | 0.52 | 80 | 0.29 | 4.36 | 0.50 | 11 |
| 07. Content was current and evidence-based | 4.52 | 0.58 | 976 | 4.58 | 0.57 | 79 | 0.39 | 4.45 | 0.52 | 11 |
| 08. Syllabus was comprehensive and well-organized | 4.46 | 0.64 | 946 | 4.53 | 0.59 | 80 | 0.41 | 4.27 | 0.65 | 11 |
| 09. I was encouraged to evaluate this session | 4.45 | 0.61 | 955 | 4.47 | 0.67 | 80 | 0.72 | 4.36 | 0.50 | 11 |
| 10. Pace of this session was appropriate | 4.47 | 0.62 | 974 | 4.53 | 0.64 | 80 | 0.47 | 4.36 | 0.50 | 11 |
| 11. Met the stated educational objectives | 4.49 | 0.61 | 976 | 4.53 | 0.59 | 80 | 0.59 | 4.45 | 0.52 | 11 |
| 12. Enhanced my current knowledge base | 4.47 | 0.63 | 975 | 4.54 | 0.57 | 80 | 0.39 | 4.45 | 0.52 | 11 |
| 13. Helped me create/revise protocols, policies, and/or procedures | 4.27 | 0.75 | 953 | 4.45 | 0.65 | 80 | 0.04 | 4.33 | 0.71 | 9 |
| 14. Addressed professional “practice gaps” in my knowledge base | 4.33 | 0.68 | 970 | 4.50 | 0.64 | 80 | 0.04 | 4.40 | 0.70 | 10 |
| 15. Provided me with information/knowledge to close the practice gap | 4.34 | 0.68 | 969 | 4.47 | 0.64 | 80 | 0.09 | 4.44 | 0.73 | 9 |
| 16. Learned a skill that I need to improve/change my practice | 4.29 | 0.72 | 970 | 4.45 | 0.63 | 80 | 0.05 | 4.44 | 0.73 | 9 |
| 17. High-quality course that I would recommend to colleagues | 4.40 | 0.70 | 973 | 4.52 | 0.64 | 79 | 0.14 | 4.50 | 0.53 | 10 |
| 18. Contributed to enhanced competence as a health care provider | 4.39 | 0.65 | 969 | 4.51 | 0.62 | 80 | 0.09 | 4.33 | 0.71 | 9 |
| 19. I feel I am a better pathologist | 4.35 | 0.69 | 968 | 4.46 | 0.64 | 79 | 0.19 | 4.33 | 0.71 | 9 |
| 20. Provided knowledge, strategies, and skills to improve efficiency | 4.29 | 0.73 | 958 | 4.29 | 0.84 | 80 | 0.94 | 4.33 | 0.71 | 9 |
| 21. To improve safety in my practice | 4.28 | 0.75 | 951 | 4.30 | 0.83 | 80 | 0.81 | 4.33 | 0.71 | 9 |
| 22. To improve patient outcomes and satisfaction | 4.35 | 0.68 | 963 | 4.34 | 0.79 | 80 | 0.90 | 4.44 | 0.73 | 9 |
Fig. 2 Results for USCAP annual meeting 2017 short course design, delivery, and outcomes grouped by participant role: trainee versus attending pathologist and other.
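The grouped tables above report p-values, but this excerpt does not name the statistical test used. As a hedged sketch, the snippet below recomputes an approximate two-sided p-value for item Q13 (practicing pathologists vs. pathologists-in-training) from the published summary statistics, using a Welch-style normal approximation. `approx_two_sample_p` is an illustrative helper, and its result need not match the reported p = 0.04, since the authors' exact method is not stated here.

```python
from math import sqrt
from statistics import NormalDist

def approx_two_sample_p(m1, s1, n1, m2, s2, n2):
    """Two-sided p-value for a difference in means (normal approximation).

    Uses the Welch standard error from per-group std and N; adequate
    for large samples, where the t distribution is close to normal.
    """
    se = sqrt(s1**2 / n1 + s2**2 / n2)
    z = (m1 - m2) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Q13 summary statistics from the table: mean, std, N per group.
p = approx_two_sample_p(4.27, 0.75, 953,   # practicing pathologists
                        4.45, 0.65, 80)    # pathologists-in-training
print(round(p, 3))
```

Because only summary statistics are published, this approximation cannot account for the ordinal, bounded nature of Likert data; a rank-based test on the raw responses could give a different p-value.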
Results for course design, delivery, and outcomes grouped by primary practice venue.
| | University/Medical school |  |  | Community practice |  |  | Independent laboratory |  |  | Commercial laboratory |  |  | p-value | Other |  |  |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Mean | Std | N | Mean | Std | N | Mean | Std | N | Mean | Std | N |  | Mean | Std | N |
| 03. Content was adequately described | 4.50 | 0.57 | 488 | 4.56 | 0.59 | 401 | 4.47 | 0.60 | 96 | 4.65 | 0.49 | 31 | 0.21 | 4.33 | 0.63 | 49 |
| 04. Target audience was clearly defined | 4.48 | 0.57 | 488 | 4.55 | 0.57 | 400 | 4.36 | 0.58 | 96 | 4.61 | 0.50 | 31 | 0.01 | 4.40 | 0.61 | 50 |
| 05. Learning objectives were clearly stated | 4.48 | 0.58 | 487 | 4.56 | 0.57 | 401 | 4.43 | 0.59 | 96 | 4.65 | 0.49 | 31 | 0.04 | 4.48 | 0.50 | 50 |
| 06. Learning objectives were appropriate | 4.50 | 0.55 | 488 | 4.56 | 0.57 | 401 | 4.44 | 0.59 | 96 | 4.65 | 0.49 | 31 | 0.09 | 4.44 | 0.61 | 50 |
| 07. Content was current and evidence-based | 4.51 | 0.58 | 487 | 4.57 | 0.58 | 401 | 4.41 | 0.59 | 96 | 4.65 | 0.49 | 31 | 0.05 | 4.50 | 0.54 | 50 |
| 08. Syllabus was comprehensive and well-organized | 4.45 | 0.64 | 469 | 4.50 | 0.64 | 393 | 4.38 | 0.64 | 95 | 4.61 | 0.50 | 31 | 0.21 | 4.38 | 0.64 | 48 |
| 09. I was encouraged to evaluate this session | 4.50 | 0.63 | 482 | 4.50 | 0.59 | 390 | 4.33 | 0.66 | 96 | 4.65 | 0.55 | 31 | 0.02 | 4.39 | 0.49 | 46 |
| 10. Pace of this session was appropriate | 4.48 | 0.59 | 486 | 4.52 | 0.64 | 401 | 4.31 | 0.65 | 96 | 4.58 | 0.72 | 31 | 0.03 | 4.36 | 0.56 | 50 |
| 11. Met the stated educational objectives | 4.47 | 0.61 | 488 | 4.54 | 0.60 | 401 | 4.39 | 0.62 | 96 | 4.61 | 0.62 | 31 | 0.08 | 4.40 | 0.64 | 50 |
| 12. Enhanced my current knowledge base | 4.46 | 0.62 | 487 | 4.53 | 0.62 | 401 | 4.33 | 0.66 | 96 | 4.61 | 0.50 | 31 | 0.02 | 4.42 | 0.64 | 50 |
| 13. Helped me create/revise protocols, policies, and/or procedures | 4.27 | 0.74 | 479 | 4.33 | 0.74 | 394 | 4.06 | 0.80 | 93 | 4.71 | 0.46 | 31 | <0.001 | 4.23 | 0.77 | 44 |
| 14. Addressed professional “practice gaps” in my knowledge base | 4.33 | 0.67 | 485 | 4.38 | 0.68 | 401 | 4.18 | 0.71 | 93 | 4.71 | 0.46 | 31 | <0.001 | 4.33 | 0.69 | 49 |
| 15. Provided me with information/knowledge to close the practice gap | 4.33 | 0.66 | 485 | 4.39 | 0.68 | 401 | 4.17 | 0.73 | 93 | 4.71 | 0.46 | 31 | <0.001 | 4.34 | 0.70 | 47 |
| 16. Learned a skill that I need to improve/change my practice | 4.29 | 0.71 | 485 | 4.35 | 0.71 | 401 | 4.08 | 0.77 | 93 | 4.68 | 0.48 | 31 | <0.001 | 4.27 | 0.68 | 48 |
| 17. High-quality course that I would recommend to colleagues | 4.40 | 0.67 | 487 | 4.46 | 0.70 | 400 | 4.14 | 0.80 | 94 | 4.65 | 0.61 | 31 | <0.001 | 4.41 | 0.57 | 49 |
| 18. Contributed to enhanced competence as a health care provider | 4.37 | 0.66 | 485 | 4.45 | 0.63 | 401 | 4.25 | 0.71 | 92 | 4.65 | 0.49 | 31 | 0.01 | 4.31 | 0.62 | 48 |
| 19. I feel I am a better pathologist | 4.32 | 0.69 | 481 | 4.41 | 0.68 | 401 | 4.23 | 0.72 | 94 | 4.65 | 0.49 | 31 | 0.01 | 4.33 | 0.66 | 48 |
| 20. Provided knowledge, strategies, and skills to improve efficiency | 4.27 | 0.75 | 479 | 4.33 | 0.72 | 399 | 4.26 | 0.77 | 92 | 4.52 | 0.81 | 31 | 0.23 | 4.18 | 0.75 | 45 |
| 21. To improve safety in my practice | 4.25 | 0.77 | 476 | 4.31 | 0.74 | 399 | 4.27 | 0.74 | 88 | 4.45 | 0.93 | 31 | 0.43 | 4.29 | 0.69 | 45 |
| 22. To improve patient outcomes and satisfaction | 4.31 | 0.72 | 481 | 4.41 | 0.65 | 400 | 4.27 | 0.66 | 92 | 4.58 | 0.67 | 31 | 0.03 | 4.28 | 0.71 | 47 |
Fig. 3 Results for USCAP annual meeting 2017 short course design, delivery, and outcomes grouped by respondent primary practice venue.
Results for learner perception of overall faculty performance.
| | Strongly Disagree = 1 | % | Disagree = 2 | % | Neutral = 3 | % | Agree = 4 | % | Strongly Agree = 5 | % | N | Mean | Median | Std |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Overall (Q23–Q29) |  |  |  |  |  |  |  |  |  |  |  | 4.47 | 5 | 0.66 |
| 23. This faculty member disclosed (before the presentation began) relevant financial relationships | 1 | 0.0% | 3 | 0.1% | 49 | 1.8% | 1067 | 38.7% | 1637 | 59.4% | 2757 | 4.57 | 5 | 0.54 |
| 24. This faculty member clearly had expertise in the subject area | 1 | 0.0% | 3 | 0.1% | 59 | 2.1% | 1039 | 37.2% | 1688 | 60.5% | 2790 | 4.58 | 5 | 0.54 |
| 25. Slides and other educational materials were useful, effective, and appropriate | 3 | 0.1% | 12 | 0.4% | 99 | 3.6% | 1154 | 41.5% | 1513 | 54.4% | 2781 | 4.50 | 5 | 0.60 |
| 26. The instructor utilized non-conventional techniques (e.g., audience response systems) | 18 | 0.7% | 110 | 4.4% | 414 | 16.4% | 883 | 34.9% | 1102 | 43.6% | 2527 | 4.16 | 4 | 0.90 |
| 27. This faculty member maintained the audience's attention | 5 | 0.2% | 23 | 0.8% | 102 | 3.7% | 1124 | 40.4% | 1528 | 54.9% | 2782 | 4.49 | 5 | 0.63 |
| 28. This faculty member was enthusiastic | 4 | 0.1% | 13 | 0.5% | 113 | 4.1% | 1082 | 38.9% | 1570 | 56.4% | 2782 | 4.51 | 5 | 0.61 |
| 29. Overall, this faculty member is an effective teacher | 7 | 0.3% | 14 | 0.5% | 108 | 3.9% | 1104 | 39.8% | 1540 | 55.5% | 2773 | 4.50 | 5 | 0.62 |
Fig. 4 Results for learner perception of faculty performance at USCAP 2017 annual meeting short courses, visualized in boxplot and violin charts.
Fig. 5 Scatter chart of total average learner perception of faculty performance score versus average course content score.
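A scatter chart like Fig. 5 relates per-course faculty scores to course-content scores, and a Pearson correlation coefficient is one natural way to quantify such a relationship. The per-course averages below are hypothetical stand-ins (not the study's data), and `pearson` is an illustrative helper, not the authors' method.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Hypothetical per-course averages: (faculty score, content score) pairs.
faculty = [4.6, 4.4, 4.5, 4.2, 4.7, 4.3]
content = [4.5, 4.3, 4.5, 4.1, 4.6, 4.4]
r = pearson(faculty, content)
print(round(r, 2))  # values near 1 mean the two scores track together
```

A strong positive correlation in such a plot would suggest learners rate course content and faculty performance similarly, which is one reason the authors separate the two dimensions when diagnosing underperforming courses.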
Average quantified scores by course status and course topic for USCAP annual meeting 2017 short courses.
| | | Course topic | | |
|---|---|---|---|---|
| | | Molecular topics | Other topics | Overall |
| Course status | New (1st year) | 4.28 | 4.47 | 4.44 |
| | Continued | 4.47 | 4.44 | 4.45 |
| | Overall | 4.42 | 4.45 | |
Fig. 6 USCAP Education short course educational product oversight organizational chart. The Education Committee (EC) oversees USCAP annual meeting short course content, faculty, quality, and integrity.
High reliability organization principles as applied to USCAP educational product improvement.
| HRO Principle | Definition | Conventional model | HRO-driven model | Outcome |
|---|---|---|---|---|
| Preoccupation with failure | Identify and analyze failures, rather than ignoring minor (or major) weaknesses | SCs are generally well-received and renewed, barring major concerns | Analyze specific data outliers against the broader trend of positive reception | Molecular pathology (especially new courses) and quality and safety identified as areas requiring further action |
| Reluctance to simplify | Embrace complexity; conduct root cause analyses; challenge long-held beliefs | A generally well-received meeting means the educational product is acceptable; a poorly performing SC is attributed to a bad faculty lecturer | Analyze subgroups of learners; analyze course content quality versus teaching faculty quality; conduct root cause analysis of “events” with the educational product | Aggregate data on the true root causes of poor educational product performance and act on that information |
| Sensitivity to operations | Frontline voices are prioritized over those of leadership | Senior leaders audit portions of SCs to vouch for quality | Learners are directly surveyed on their experiences and their feedback is analyzed | Accurate and meaningful information on the quality of the educational product from the learner perspective is collected and acted upon going forward |
| Commitment to resilience | Anticipating areas of trouble and recovering quickly | Feedback shared directly with teaching faculty with no discussion or action unless extenuating circumstances, SC (educational product) completes 3-yr cycle, then either extended or discontinued based on evaluations and faculty availability | Course planners partner with all faculty to make improvements annually using data gathered and root causes of challenges identified | Poorly performing educational products are continuously improved (before the following meeting) based on real, granular, actionable data directly obtained from learners |
| Deference to expertise | Expertise is prioritized over authority | Committee of faculty pathologists review evaluations and make recommendations for educational product and faculty members, audit courses, and make sure faculty receive comments from learners | Multidisciplinary committee comprised of pathologists, medical education consultants, business development staff, chief operating officer, quality and safety analysts, and others analyze educational product performance data, and provide actionable feedback to faculty, strategize about future courses | Future goal: continuous data feedback leads to thoughtful and deliberate designing of educational product with metrics for evaluation of outcomes on practice, continuous improvement of product delivered to learners |