
A prioritization framework for the analysis of near misses in radiation oncology.

Brian Liszewski

Abstract

INTRODUCTION: The term near miss implies the aversion of a harm event, but often there is a lack of evidence establishing a link between a failure in process and potential harm. The focus of this study was to use reported incident data to inform a prioritization framework for the triage of near miss events in a radiation therapy program.
MATERIALS AND METHODS: Actual and near miss events during the study period were categorized using thematic analysis based on incident types. Near miss events were characterized based upon their potential to result in harm to the patient using the concepts of failure modes and Analytic Hierarchy Process (AHP) theory. Near miss events were assessed for likelihood of occurrence, probability of detection and potential impact, and then assigned a summative normalized score reflecting prioritization recommendations, the normalized 10 point score (NTPS).
RESULTS: 107 events were reported within the study timeframe. 65% of event type categories (n = 20) were attributed to near misses. All 107 events were analyzed using the framework, with a maximum NTPS of 4 achieved across all event types. Of the 47 actual events, 100% received an NTPS of 3 or greater. Of the 60 near miss events, 47% received an NTPS less than or equal to 1. Finally, 15% of near miss events received an NTPS of 3 or greater.
CONCLUSIONS: Near miss events provide a unique opportunity for learning; however, they can yield a great deal of data, potentially limiting the resources for effective incident learning. An FMEA and AHP based prioritization framework for the triage of near miss events, incorporating the likelihood of occurrence, the probability of the event going undetected and the potential impact if the incident did occur, allows for the optimal focus of programmatic resources in the analysis of these events.
© 2020 The Author.


Keywords:  Failure modes; Incident learning; Near miss; Quality; Radiation therapy; Safety

Year:  2020        PMID: 32566768      PMCID: PMC7296427          DOI: 10.1016/j.tipsro.2020.04.001

Source DB:  PubMed          Journal:  Tech Innov Patient Support Radiat Oncol        ISSN: 2405-6324


Introduction

Failure modes and effects analysis (FMEA) is a prospective methodology to assess the robustness of a system. The methodology maps the process and identifies steps at which potential failures can occur [1]. These failure modes are characterized by several features: the likelihood that the failure would occur, the potential impact if it did occur, and the probability that the failure would go undetected [2]. Each characterization is assigned a measure which collectively provides insight into the processes most vulnerable to failure and opportunities for improvement [3]. To evaluate the multiple criteria described in this methodology, a systematic process to describe the contribution of each element is necessary. Analytic Hierarchy Process (AHP) theory, described by Thomas L. Saaty, is a methodology to develop priority scales based on expert input [4]. AHP is a tool used to break down the measures that contribute to a decision into a number of easily comprehended sub-criteria, using pairwise comparison of measured or subjective inputs [5]. Incident learning is a retrospective process for building a robust system. Reporting and recognizing the opportunities for programmatic improvement through iterative learning from adverse events and near misses is key to a successful incident learning program [5]. A near miss, as defined by Kaplan, is a condition in which "the potential for harm may have been present, but unwanted consequences were prevented because some recovery action was taken" [6]. Variations on this definition exist within all major national and international incident reporting systems (European Radiation Oncology Safety Education Information System: ROSEIS, American Radiation Oncology Incident Learning System: RO-ILS and Canadian National System for Incident Reporting in Radiation Therapy: NSIR-RT). In 1931 Heinrich was the first author to correlate the pathways of near misses and actual incidents [7].
The term near miss implies the prevention of a harm event, but often there is a lack of evidence establishing a link between a failure in process and potential harm. The understanding that mitigating the cause of near misses will in turn prevent actual events is, in the health care domain, based primarily on anecdotal evidence [8]. In addition, near miss events are much more common than adverse events, evidenced as being 7-100 times more frequent [9]. Therefore, near miss reporting can yield a great deal of data, potentially limiting the programmatic resources available for incident learning. To effectively analyze the data in a manner that informs the priorities of the program, a methodology is required to identify the near miss events which, if mitigated, may in turn prevent potential harm in the future. Near miss events share similar characteristics with those used in the evaluation of FMEA: the likelihood that the failure would occur, the potential impact if it did occur and the probability that the failure would go undetected can all be used to describe a near miss event. The focus of this paper is to describe an incident prioritization framework for the triage of radiation therapy incidents, leveraging the concepts of FMEA and AHP, and to evaluate the framework through its application to a radiation therapy program's reported incident dataset. The prioritization framework identifies the events of greatest risk, on which the program can focus resources for analysis and mitigation.

Methods

Analytic Hierarchy Process (AHP) theory was used to characterize the failure modes (criteria) and sub-criteria that contribute to the FMEA decision-making process: the likelihood that a failure would occur, the potential impact if it did occur, and the probability that the failure would go undetected. Table 1 describes the sub-criteria used to characterize each failure mode and their related contributing factors.
Table 1

Criteria and sub-criteria for analytic hierarchy process prioritization.

Criteria | Failure May Occur | Potential Impact | Failure Would Go Undetected
Sub-Criteria 1 | 0–15% of program incidents | No harm: patient is asymptomatic and no treatment is required. | Event was detected because it was undeliverable
Sub-Criteria 2 | 16–30% of program incidents | Mild: symptoms, if present, are mild; no or minimal intervention (observation, investigation, minor treatment) is required; harm or loss of function is minimal or intermediate but short term. | Event was detected at 1st QA
Sub-Criteria 3 | 31–45% of program incidents | Moderate: patient is symptomatic, requiring intervention (additional treatment or operative procedure) or a prolonged hospital stay; long-term or permanent harm or loss of function. | Event was detected at pre-treatment QA
Sub-Criteria 4 | 46–60% of program incidents | Severe: patient is symptomatic, requiring life-saving intervention or a major surgical/medical intervention; shortened life expectancy or major long-term or permanent harm or loss of function. | Event was detected at on-treatment QA
Sub-Criteria 5 | 61–75% of program incidents | Death: on the balance of probabilities, death was caused or brought forward in the short term by the incident. | Event was detected at treatment delivery
Sub-Criteria 6 | 76–90% of program incidents | | Event was detected at sporadic QA
Sub-Criteria 7 | 91–100% of program incidents | |
When describing an event, the likelihood that the failure would occur was expressed as a percentage of program incidents. The potential impact was described using the acute harm scale as defined by the NSIR-RT minimum dataset [10]. The probability that the failure would go undetected was described using the points throughout the planning-treatment process at which an event could be detected. Expert opinion was used to describe the relative importance of each criterion with respect to the others; for example, the potential impact was identified as six times more important than the likelihood of occurrence and twice as important as the probability of detection. The same process was completed for each set of criteria and sub-criteria described in Appendix I. Using eigenvector and eigenvalue calculations, described in Appendix II, relative weights for each criterion and sub-criterion were derived [4]. Reported incident events from a large metropolitan cancer program were reviewed over a one-year period. Actual and near miss events during the study period were categorized using thematic analysis based on incident types. Near miss events were extracted for further review and characterized based upon their potential to result in harm to the patient using the concepts of failure modes. Near miss events were assessed for their likelihood of occurrence, probability of going undetected and potential impact if the incident did occur. Likelihood of occurrence was assessed using descriptive statistics of near miss event types during the study period. To assess the probability of the failure going undetected, near miss events were characterized based on the barrier step at which the event was detected. Finally, the potential impact if the incident did occur was described through a comparison with actual events extracted for the study period.
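The weight derivation described here can be sketched in code. The following is a minimal sketch, assuming the standard AHP column-normalization approximation of the principal eigenvector; the 3x3 matrix is the criteria comparison reported in the appendix:

```python
# Approximate AHP weights: normalize each column of the pairwise
# comparison matrix by its column sum, then average across rows to
# obtain the normalized principal eigenvector (the criterion weights).
def ahp_weights(matrix):
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Criteria comparison: potential impact judged 6x more important than
# likelihood of occurrence and 2x more important than detection.
criteria = [
    [1, 1/6, 1/4],  # failure may occur
    [6, 1,   2],    # potential impact
    [4, 1/2, 1],    # failure would go undetected
]
weights = ahp_weights(criteria)
# Reproduces the appendix weights: ~8.93%, 58.69% and 32.38%
```

The same function applies unchanged to the 5x5 impact, 6x6 detection and 7x7 occurrence sub-criteria matrices in the appendix.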
Near miss events that shared incident type classifications with actual incidents were assigned the same acute toxicity outcomes with respect to the potential for impact to the patient. Based on the criteria above, each incident was assigned a percentage of program incidents, an acute harm scale value and a point along the planning-treatment process at which the event could be detected. Applying the relative weights for each sub-criterion, as determined by the AHP analysis, a score for each criterion (likelihood of occurrence, probability of going undetected and potential impact) was derived. The summative normalized scores were used to develop prioritization recommendations with respect to the incident classification types best suited for further investigation.
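The paper does not spell out the NTPS arithmetic, but the Table 3 values are consistent with one simple reading: form the weighted sum of the AHP sub-criterion weights, divide by the worst-case score and round onto a 10 point scale. The sketch below uses that assumed reading; the dictionary values are the rounded weights from the appendix, and the final rounding rule is an assumption rather than a stated formula:

```python
# Hypothetical reconstruction of the normalized 10 point score (NTPS).
# Criteria and sub-criteria weights are the AHP values from the appendix;
# the rounding rule at the end is an assumption, not stated in the paper.
CRITERIA = {"occur": 0.0893, "impact": 0.5869, "detect": 0.3238}
OCCUR = {"0-15%": 0.0203, "16-30%": 0.0367, "91-100%": 0.3980}
IMPACT = {"No Harm": 0.0378, "Mild": 0.0779, "Moderate": 0.1435, "Death": 0.4865}
DETECT = {"Pre-Treatment": 0.11, "Treatment Delivery": 0.239, "Sporadic QA": 0.365}

def raw_score(occur, impact, detect):
    return (CRITERIA["occur"] * OCCUR[occur]
            + CRITERIA["impact"] * IMPACT[impact]
            + CRITERIA["detect"] * DETECT[detect])

# Worst case: most common incident, potential impact of death,
# caught only via an undefined sporadic QA process.
WORST = raw_score("91-100%", "Death", "Sporadic QA")

def ntps(occur, impact, detect):
    return round(10 * raw_score(occur, impact, detect) / WORST)

# "Wrong Setup" near miss from Table 3: rare, moderate potential harm,
# not detected until treatment delivery.
wrong_setup = ntps("0-15%", "Moderate", "Treatment Delivery")  # 4, matching Table 3
```

Under this reading, the reconstruction also reproduces the NTPS of 4 for the actual events caught at sporadic QA and the NTPS of 1 for the low-risk "Mislabeled" near misses in Table 3.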

Results

Incidence

107 events were reported within the study timeframe, of which 60 were identified as near miss events. A summary of the near miss and actual incident event thematic analysis classification is shown in Fig. 1. 65% of event type categories were attributed to near misses. The most common near miss event type was "referencing", referring to the referencing of tattoos in documentation (i.e. setup point is 0.5 cm superior to tattoo). The most common actual incident, "bolus omitted", referred to the omission of the accessory during treatment.
Fig. 1

Thematic analysis: Near miss and actual event classification.


Detection

The barrier step at which the event was detected was described using six sub-criteria. The number of near miss events stratified by barrier step is described in Table 2. 20 events were deemed undeliverable and therefore could not have resulted in patient harm. The remaining 40 events were detected along the planning-treatment trajectory. No near miss events were identified via sporadic Quality Assurance (QA), i.e. detection outside a defined QA process.
Table 2

Near miss events by barrier step.

Barrier Step | # of Events
Undeliverable | 20
1st QA | 9
Pre-Treatment QA | 16
On-Treatment QA | 11
Treatment Delivery | 4
Sporadic QA | 0

Potential impact

Of the 60 near miss events identified, 29 were identified as having no potential impact with respect to acute medical harm, 30 as having the potential to cause mild harm and one as having the potential for moderate harm.

Analytic Hierarchy process analysis

All 107 events, near miss and actual incidents, were evaluated using the AHP tool. The normalized 10 point score (NTPS) represents the score given to an incident relative to the worst case scenario (the most common incident, with a potential impact of death, caught via an undefined sporadic QA process). Table 3 describes the events with the highest and lowest NTPS. These represent the event types with the greatest and least programmatic risk. The event type category "Wrong Setup", one of the least common event types, was assigned an NTPS of 4: the near miss was not detected until treatment delivery, and similar "Wrong Setup" treatment incidents have resulted in moderate patient harm. Conversely, the event type category "Mislabeled" was assigned a score of 1. This event type was the second most common; however, it has not been associated with patient harm and was identified within the first QA barrier.
Table 3

Event classification by criteria and normalized 10 point score.

Event | Event Type | Failure May Occur | Potential Impact | Failure Would Go Undetected | Normalized 10 Point Score
Actual | Fractionation Issues | 0–15% | Mild | Sporadic QA | 4
Actual | Accessory Issue | 0–15% | Mild | Sporadic QA | 4
Actual | Protocol Not Followed | 0–15% | Mild | Sporadic QA | 4
Actual | Tx Technique Not Completed | 0–15% | Mild | Sporadic QA | 4
Near Miss | Wrong Setup | 0–15% | Moderate | Treatment Delivery | 4
Near Miss | F/S Error | 0–15% | Mild | 1st QA | 1
Near Miss | Referencing | 46–60% | Mild | Undeliverable | 1
Near Miss | Mislabeled | 16–30% | No Harm | Pre-Treatment | 1
Near Miss | Accessory | 0–15% | No Harm | Pre-Treatment | 1
Near Miss | Patient Related Circumstances | 0–15% | Mild | Undeliverable | 1
Near Miss | Mislabeled | 16–30% | No Harm | 1st QA | 1
Near Miss | Mislabeled | 16–30% | No Harm | Undeliverable | 1
Near Miss | Incorrect Iso Slice Marked | 0–15% | No Harm | Undeliverable | 1
Near Miss | Sites Omitted | 0–15% | No Harm | Undeliverable | 1
Near Miss | Accessory | 0–15% | No Harm | Undeliverable | 1
Near Miss | Undeliverable | 0–15% | No Harm | Undeliverable | 1
Near Miss | Tattoo Cannot be Found | 0–15% | No Harm | Undeliverable | 1
Near Miss | Reference Image Omitted | 0–15% | No Harm | Undeliverable | 1
Near Miss | F/S Error | 0–15% | Mild | 1st QA | 1
Near Miss | Referencing | 46–60% | Mild | Undeliverable | 1
Near Miss | Mislabeled | 16–30% | No Harm | Pre-Treatment | 1
Near Miss | Accessory | 0–15% | No Harm | Pre-Treatment | 1

Analytic hierarchy process analysis

All 107 events were analyzed using the framework; a maximum normalized 10 point score of 4 was assigned. Of the 47 actual events, 100% received a normalized 10 point score of 3 or greater. Of the 60 near miss events, 47% received an NTPS less than or equal to 1. Finally, 15% of near miss events received an NTPS of 3 or greater.

Discussion

Van der Schaaf et al. (1991) sought to distinguish the goals of near miss learning from traditional incident analysis, identifying three goals: a qualitative, a quantitative and an intangible aspect of reporting. Qualitatively, near miss reporting provides insight into how errors develop and might potentially lead to adverse events. Quantitatively, near miss reporting allows a program to build a database of the root causes of near misses; trending this database provides a way to target the most prominent factors as possible targets for error reduction. Finally, near miss reporting serves as a reminder that safety risks continue to exist and that continued staff vigilance is needed [9]. It is evident that near miss events provide a unique opportunity for learning and should be leveraged for programmatic gain. Lam et al. examined radiation therapy near miss and actual incidents based on incident type and stage of origin. They found that near miss events exhibited different characteristics from actual events, concluding that the traditional practice of analyzing and managing these events in a similar manner is not the optimal approach to managing risk in radiotherapy. Furthermore, the authors noted that near miss events reflected different failure modes than actual incidents, and that utilizing different approaches for analysis could glean valuable information for learning [11]. Another study, conducted by Bates et al., examined near miss and actual events in the medication setting. It was evidenced that the characteristics of near misses were different from those of actual events. Although both incident types had the same underlying causes, near misses involving a modest overdose were more likely to result in harm than errors involving massive overdoses, since the near miss actions were more likely to be carried out [12]. Wright et al. reviewed 1351 reported events within a radiation oncology program, classifying the events as either workflow or near miss events.
In addition, the authors assigned a Risk Priority Number (RPN), a numeric assessment of risk assigned to a process as part of Failure Modes and Effects Analysis (FMEA), to near miss events. It was found that events that had originated and were detected in the treatment delivery stage had the greatest mean overall RPN and were therefore associated with the greatest risk [13]. Although near miss events provide a unique opportunity for learning, how these events should be used and best analyzed continues to be explored. In the study conducted by Wright et al., of the 1351 reported events 51 (3.8%) were categorized as proper near miss events, the remainder being workflow related events [13]. As previously mentioned, near misses have a frequency of 7 to 100 times the incidence of actual incidents [9]. To fulfill the goals described by Van der Schaaf et al., near miss events should be captured in the incident learning system (available for analysis and learning). How the near misses are used to support program decisions regarding how and where to invest quality and safety improvements requires consideration. Near miss events should be linked to the causal continuum, providing an association between the event and its evidenced potential for harm. The application of the framework described in this paper provides a mechanism to objectively prioritize events, including near misses, by their evidenced potential for harm, probability of detection and likelihood of occurrence.

Conclusion

Near miss events provide a unique opportunity for learning and should be leveraged for programmatic gain. Near miss reporting, however, can yield a great deal of data, potentially limiting the resources available to effectively analyze the data in a manner that informs programmatic priorities. Failure modes and effects analysis (FMEA) identifies the steps at which potential failures can occur. Analytic Hierarchy Process (AHP) theory synthesizes multiple objective and subjective inputs to develop evaluation scales. Together these tools create a prioritization framework capable of integrating the many factors associated with incident reporting. As described, this framework objectively evaluates events using the criteria of likelihood of occurrence, probability of detection and potential impact to assign events a relative priority. This priority can be further evaluated by programs to assess the allocation of resources for incident learning and mitigation.

Declaration of Competing Interest

The authors declare that there is no conflict of interest regarding the publication of this article.
Appendix I. Pairwise comparison matrices and derived weights.

Criteria pairwise comparison:

Item | Failure May Occur | Potential Impact | Failure Would Go Undetected
Failure May Occur | 1 | 0.166666667 | 0.25
Potential Impact | 6 | 1 | 2
Failure Would Go Undetected | 4 | 0.5 | 1
SUM | 11 | 1.666666667 | 3.25

Normalized criteria matrix and weights:

Item | Failure May Occur | Potential Impact | Failure Would Go Undetected | Weight
Failure May Occur | 0.090909091 | 0.1 | 0.076923077 | 8.93%
Potential Impact | 0.545454545 | 0.6 | 0.615384615 | 58.69%
Failure Would Go Undetected | 0.363636364 | 0.3 | 0.307692308 | 32.38%
"Failure May Occur" sub-criteria pairwise comparison:

Item | 0–15% | 16–30% | 31–45% | 46–60% | 61–75% | 76–90% | 91–100%
0–15% | 1 | 0.25 | 0.2 | 0.166667 | 0.142857 | 0.125 | 0.111111
16–30% | 4 | 1 | 0.25 | 0.2 | 0.166667 | 0.142857 | 0.125
31–45% | 5 | 4 | 1 | 0.25 | 0.2 | 0.166667 | 0.142857
46–60% | 6 | 5 | 4 | 1 | 0.25 | 0.2 | 0.166667
61–75% | 7 | 6 | 5 | 4 | 1 | 0.25 | 0.2
76–90% | 8 | 7 | 6 | 5 | 4 | 1 | 0.25
91–100% | 9 | 8 | 7 | 6 | 5 | 4 | 1
SUM | 40 | 31.25 | 23.45 | 16.61667 | 10.75952 | 5.884524 | 1.995635

Normalized "Failure May Occur" matrix and weights:

Item | 0–15% | 16–30% | 31–45% | 46–60% | 61–75% | 76–90% | 91–100% | Weight
0–15% | 0.025 | 0.01 | 0.00853 | 0.01003 | 0.01327 | 0.02124 | 0.05568 | 2.03%
16–30% | 0.1 | 0.03 | 0.01066 | 0.01203 | 0.01549 | 0.02428 | 0.06263 | 3.67%
31–45% | 0.125 | 0.12 | 0.04264 | 0.01505 | 0.01859 | 0.02832 | 0.07159 | 6.13%
46–60% | 0.15 | 0.16 | 0.17057 | 0.06018 | 0.02324 | 0.03399 | 0.08351 | 9.74%
61–75% | 0.175 | 0.19 | 0.2132 | 0.24072 | 0.09294 | 0.04248 | 0.10022 | 15.09%
76–90% | 0.2 | 0.22 | 0.25586 | 0.30090 | 0.37176 | 0.16993 | 0.12527 | 23.54%
91–100% | 0.225 | 0.25 | 0.29850 | 0.36108 | 0.46470 | 0.67975 | 0.50109 | 39.80%
"Potential Impact" sub-criteria pairwise comparison:

Item | No Harm | Mild | Moderate | Severe | Death
No Harm | 1 | 0.25 | 0.2 | 0.166667 | 0.142857
Mild | 4 | 1 | 0.25 | 0.2 | 0.166667
Moderate | 5 | 4 | 1 | 0.25 | 0.2
Severe | 6 | 5 | 4 | 1 | 0.25
Death | 7 | 6 | 5 | 4 | 1
SUM | 23 | 16.25 | 10.45 | 5.616667 | 1.759524

Normalized "Potential Impact" matrix and weights:

Item | No Harm | Mild | Moderate | Severe | Death | Weight
No Harm | 0.043478 | 0.015385 | 0.019139 | 0.029674 | 0.081191 | 3.78%
Mild | 0.173913 | 0.061538 | 0.023923 | 0.035608 | 0.094723 | 7.79%
Moderate | 0.217391 | 0.246154 | 0.095694 | 0.04451 | 0.113667 | 14.35%
Severe | 0.26087 | 0.307692 | 0.382775 | 0.178042 | 0.142084 | 25.43%
Death | 0.304348 | 0.369231 | 0.478469 | 0.712166 | 0.568336 | 48.65%
"Failure Would Go Undetected" sub-criteria pairwise comparison:

Item | Undeliverable | 1st QA | Pre-Treatment QA | On-Treatment QA | Treatment Delivery | Sporadic QA
Undeliverable | 1 | 0.25 | 0.2 | 0.166667 | 0.2 | 0.166667
1st QA | 4 | 1 | 0.25 | 0.2 | 0.166667 | 0.142857
Pre-Treatment QA | 5 | 4 | 1 | 0.25 | 0.333333 | 0.25
On-Treatment QA | 6 | 5 | 4 | 1 | 0.5 | 0.333333
Treatment Delivery | 5 | 6 | 3 | 2 | 1 | 0.5
Sporadic QA | 6 | 7 | 4 | 3 | 2 | 1
SUM | 27 | 23.25 | 12.45 | 6.616667 | 4.2 | 2.392857

Normalized "Failure Would Go Undetected" matrix and weights:

Item | Undeliverable | 1st QA | Pre-Treatment QA | On-Treatment QA | Treatment Delivery | Sporadic QA | Weight
Undeliverable | 0.03703 | 0.01075 | 0.01606 | 0.02519 | 0.04762 | 0.06965 | 3.44%
1st QA | 0.14815 | 0.043011 | 0.02008 | 0.03022 | 0.03968 | 0.05970 | 5.68%
Pre-Treatment QA | 0.18519 | 0.172043 | 0.08032 | 0.03778 | 0.07937 | 0.10448 | 11%
On-Treatment QA | 0.22222 | 0.21505 | 0.32129 | 0.15113 | 0.11904 | 0.1393 | 19.5%
Treatment Delivery | 0.18519 | 0.25807 | 0.24096 | 0.30227 | 0.2381 | 0.20896 | 23.9%
Sporadic QA | 0.22222 | 0.30108 | 0.32129 | 0.4534 | 0.4762 | 0.4179 | 36.5%
Appendix II. Eigenvector calculation and consistency check.

1. Sum the columns of the pairwise comparison matrix, normalize each entry by its column sum, and calculate the first normalized principal eigenvector x1 as the row means:

   N = [ 1/Σcol1, a/Σcol2, b/Σcol3 ; (1/a)/Σcol1, 1/Σcol2, c/Σcol3 ; (1/b)/Σcol1, (1/c)/Σcol2, 1/Σcol3 ]

   x1 = ( Σrow1/n, Σrow2/n, Σrow3/n )

2. Calculate the largest eigenvalue λ:

   λ = Σcol1·x1 + Σcol2·x2 + Σcol3·x3

3. Calculate the consistency index:

   CI = (λ − n) / (n − 1)

4. Verify the consistency ratio is below 10%:

   CR = CI / RI

Random index (RI) by matrix size n:

n | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15
RI | 0 | 0 | 0.58 | 0.9 | 1.12 | 1.24 | 1.32 | 1.41 | 1.45 | 1.49 | 1.51 | 1.54 | 1.56 | 1.57 | 1.59
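The consistency check can be sketched in Python. This is a minimal sketch using the column-normalization eigenvector approximation, applied here to the study's 3x3 criteria comparison matrix:

```python
# AHP consistency check: approximate the largest eigenvalue lambda,
# compute the consistency index CI = (lambda - n) / (n - 1) and the
# consistency ratio CR = CI / RI, which should stay below 10%.
RI = [0, 0, 0.58, 0.9, 1.12, 1.24, 1.32, 1.41,
      1.45, 1.49, 1.51, 1.54, 1.56, 1.57, 1.59]  # random index for n = 1..15

def consistency_ratio(matrix):
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    # Normalized principal eigenvector via column normalization + row means.
    x = [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]
    lam = sum(col_sums[j] * x[j] for j in range(n))  # largest eigenvalue
    ci = (lam - n) / (n - 1)                         # consistency index
    return ci / RI[n - 1]                            # consistency ratio

# Criteria pairwise comparison matrix from this study.
criteria = [[1, 1/6, 1/4], [6, 1, 2], [4, 1/2, 1]]
cr = consistency_ratio(criteria)  # ~0.011, well below the 10% threshold
```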
1. Wright L, van der Schaaf T. Accident versus near miss causation: a critical review of the literature, an empirical test in the UK railway domain, and their implications for other sectors. J Hazard Mater. 2004.

2. IOM report: patient safety - achieving a new standard for care. Acad Emerg Med. 2005.

3. Gillan C, Davis C-A, Moran K, French J, Liszewski B. The Quest for Quality: Principles to Guide Medical Radiation Technology Practice. J Med Imaging Radiat Sci. 2015.

4. Wright JL, Parekh A, Rhieu B-H, Opris V, Souranis A, Choflet A, Viswanathan AN, DeWeese TL, McNutt T, Terezakis SA. Real-time management of incident learning reports in a radiation oncology department. Pract Radiat Oncol. 2018.

5. Bates DW, Boyle DL, Vander Vliet MB, Schneider J, Leape L. Relationship between medication errors and adverse drug events. J Gen Intern Med. 1995.
