
Systematic review of team performance in minimally invasive abdominal surgery.

W J van der Vliet1, S M Haenen1, M Solis-Velasco2, C H C Dejong1, U P Neumann1,3, A J Moser2, R M van Dam1.   

Abstract

Background: Adverse events in the operating theatre related to non-technical skills and teamwork remain common. The influence of minimally invasive techniques on team performance, and the subsequent impact on patient safety, remains unclear. The aim of this review was to assess the methodology used to objectify and rate team performance in minimally invasive abdominal surgery.
Methods: A systematic literature search was conducted according to the PRISMA guidelines. Studies on assessment of surgical team performance or non-technical skills of the surgical team in the setting of minimally invasive abdominal surgery were included. Study aim, methodology, results and conclusion were extracted for qualitative synthesis.
Results: Sixteen studies involving 677 surgical procedures were included. All studies consisted of observational case series that used heterogeneous methodologies to assess team performance and were of low methodological quality. The most commonly used team performance objectification tools were 'construct'- and 'incident'-based tools. Evidence of validity for the assessed outcome was spread widely across objectification tools, ranging from low to high. Diverse and poorly defined outcomes were reported.
Conclusion: Team demands for minimally invasive approaches to abdominal procedures remain unclear. The current literature consists of studies with heterogeneous methodology and poorly defined outcomes.


Year:  2019        PMID: 31183440      PMCID: PMC6551413          DOI: 10.1002/bjs5.50133

Source DB:  PubMed          Journal:  BJS Open        ISSN: 2474-9842


Introduction

A substantial contribution to morbidity among surgical patients can be attributed to adverse events occurring in the operating theatre1. Increasing evidence shows that a considerable portion of these adverse events cannot be attributed solely to deficient technical skills2, 3, 4. Adverse events related to non‐technical skills and team performance are common and estimated to be twice as frequent as errors in surgical technique5. Poor teamwork and lack of vigilance appear to be essential factors influencing procedural flow and increasing error rates6. Surgical teams demand specific infrastructure, resources and competencies to perform effectively and maintain patient safety7. Effective team performance depends on physical and social interactions, including back‐up behaviour and leadership8. These demands, competencies and interactions encompassing effective team performance create a domain that is difficult to objectify and quantify7, 9.

In recent years, minimally invasive techniques have become the benchmark for a large number of abdominal surgical procedures10. These approaches introduce complex equipment, increased numbers of instrument changes and larger teams to the operative environment, resulting in increased demands on coordination, anticipation, planning and communication11. The impact of this variation in procedural approach on team demands, error rates and patient safety remains unclear. In highly complex abdominal procedures, associated with learning curves for surgical technique, minimally invasive approaches could also have a significant impact on team performance and non‐technical skills12.

Recent studies have used a variety of methodologies to observe and objectify team performance. Consensus on the most efficient and methodologically sound way of analysing and rating effective team performance has yet to be reached.
The development of benchmarks for team observations and assessment of team performance will allow an accurate comparison of demands relative to surgical techniques. This will facilitate the development of effective, evidence‐based training programmes for surgical teams, directed to increase team performance, decrease error rates and increase patient safety. The aim of this systematic review was to assess the methodology used to objectify surgical team performance in minimally invasive abdominal surgery and explore team demands in relation to non‐technical skills.

Methods

This study was performed according to the PRISMA guidelines13. Two researchers were involved in the search, inclusion, critical appraisal and data extraction of the articles selected for this study.

Eligibility criteria

Studies on assessment of surgical team performance and non‐technical skills of the entire surgical team (including surgeons, anaesthesia and nursing staff) in the setting of minimally invasive abdominal surgery were included. Abdominal surgery was defined as any urological, gynaecological or general surgical procedure performed intra‐abdominally. Minimally invasive techniques consisted of minimal‐access approaches to the abdominal cavity, including laparoscopic, video‐ or robot‐assisted methods. Exclusion criteria consisted of non‐original research, research performed in a simulated environment or non‐human subject research, and language of publication other than English.

Study selection

Two authors performed a systematic literature search using PubMed, Embase, the Cochrane Library and Google Scholar to identify articles published before 11 October 2017. Search terms were based on subject (‘teamwork’, ‘team learning’, ‘team efficiency’, ‘non‐technical skills’) and setting (‘minimally invasive abdominal surgery’, ‘laparoscopic surgery’, ‘robotic surgery’). After the initial search, duplicates and non‐English studies were removed. Articles were screened for eligibility by title, abstract and then full text. Reference lists and citations of the included studies were screened for missed articles. Discrepancies in study selection between the two authors were discussed with other review team members until consensus was reached.

Critical appraisal

The methodological quality of the included studies was assessed using the Oxford Centre for Evidence‐Based Medicine levels of evidence, ranging from 1 (systematic review of RCTs) to 5 (expert opinion)14. Evidence of validity of the tools used to objectify and rate operative team performance was assessed using Messick's framework15, in which a test is rated for construct validity in each specific context in which it is employed. The framework defines five sources of evidence (content, response process, internal structure, relations to other variables, and consequences), each rated on a three‐point scale16. Evidence of validity was classified as low (total score 0–5), moderate (6–10) or high (11–15).
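The scoring described above can be sketched as a small helper. This is an illustrative reconstruction of the classification used in this review (five sources summed to a 0–15 total, banded low/moderate/high), not code from the study; the function name and per-source rating range (0–3, as the scores in Table 1 suggest) are assumptions.

```python
# Illustrative sketch (not from the study): sum Messick's five sources of
# validity evidence and apply the review's classification bands.
SOURCES = (
    "content",
    "response process",
    "internal structure",
    "relations to other variables",
    "consequences",
)

def validity_evidence(scores):
    """scores maps each Messick source to a rating (assumed 0-3, per Table 1).

    Returns the total (0-15) and the review's band:
    low (0-5), moderate (6-10) or high (11-15).
    """
    assert set(scores) == set(SOURCES), "all five sources must be rated"
    assert all(0 <= s <= 3 for s in scores.values())
    total = sum(scores.values())
    band = "low" if total <= 5 else "moderate" if total <= 10 else "high"
    return total, band
```

For example, the flow-disruption tool as rated for Catchpole et al.24 (3, 3, 2, 2, 2 in Table 1) totals 12 and falls in the high band.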

Data collection

The following data were extracted from the included studies: aim, design, setting, studied procedures, observational method, observer characteristics, outcomes, conclusion, and assessment of team performance and non‐technical skills.

Results

The systematic search identified 2591 manuscripts, from which 69 duplicates and 193 non‐English publications were removed. Based on screening of title and abstract, a further 2198 articles were excluded. The remaining 138 full‐text publications were reviewed, resulting in the selection of seven studies. Subsequent review of citations and reference lists led to the inclusion of a further nine articles, so that a total of 16 studies were finally included in this systematic review (Fig. 1).
Figure 1

Flow chart showing selection of articles for review

Study aims, designs and settings

All included studies consisted of single‐centre observational series, with the exception of one dual‐centre series17 of four robot‐assisted procedures. Two studies18, 19 investigated the influence of a non‐technical skills training intervention on surgical team performance. Four studies19, 20, 21, 22 consisted of subanalyses of results from observed cohorts published in previous work (Table  S1, supporting information). According to the Oxford Centre for Evidence‐Based Medicine, all studies provided level 4 evidence (case series, poor‐quality cohort studies, case–control studies). No subgroup analysis or meta‐analysis of outcomes data was attempted due to heterogeneous methodologies and outcome measurements across the included studies. All studies observed complete surgical teams (including surgery, anaesthesia and nursing staff) with the goal of evaluating team performance through surgical workflow analysis and evaluation of disruptive events (9 studies)20, 23, 24, 25, 26, 27, 28, 29, 30, the relationship between team performance and technical outcomes (3 studies)18, 21, 22, the relationship between anticipation of surgical steps and team efficiency (1 study)31 or novel tools to rate team performance (3 studies)17, 19, 32. Of the 16 studies, 11 focused exclusively on minimally invasive procedures, whereas five included both open and minimally invasive approaches to abdominal surgical procedures. Ten studies investigated laparoscopic techniques and six a robot‐assisted approach. A total of 677 procedures (281 laparoscopic, 236 robot‐assisted and 160 open) were observed across the included studies, with a mean of 42.3 operations per study (Table  S1, supporting information).

Observational methodology

The majority of studies (12) observed team performance directly; the remaining four performed a postoperative review of audiovisual recordings. Most (13) used multiple observers to evaluate team performance, 14 reported on methodological training of observers before the study, and five included experts trained in human factors assessment or psychologists in their observing teams. Nine studies quantified interobserver reliability using a variety of methodologies; reliability was deemed good to excellent (Table S1, supporting information). Most studies (14) observed team performance for the entire time the patient was present in the operating theatre. Six studies subdivided the procedure into preoperative, intraoperative and postoperative phases, of which four also defined a robot‐docking phase.
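The review does not name the statistics these studies used to quantify interobserver reliability. One widely used chance-corrected agreement measure for categorical ratings (such as two observers labelling the same flow disruptions) is Cohen's kappa; a minimal, purely illustrative sketch:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical labels.

    Returns 1.0 for perfect agreement and about 0 for chance-level agreement.
    (Undefined when expected agreement is 1, i.e. both raters use a single label.)
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal label frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / n**2
    return (p_o - p_e) / (1 - p_e)
```

Two observers assigning identical disruption categories to every event would score kappa = 1.0; values above roughly 0.6 are conventionally read as good agreement.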

Assessment of team performance

Seven studies used 'construct‐based' team performance assessment tools, which rate a number of behavioural constructs to create an overall score at the end of a case. The construct‐based tools used were the Oxford Non‐Technical Skills (NOTECHS) scale19 (4 studies) and the Observational Teamwork Assessment for Surgery (OTAS)33 (3 studies). These tools showed moderate evidence of validity for the assessed outcome according to Messick's framework, ranging from 8 to 10 of 15 (Table 1). Nine studies used an 'incident‐based' team performance assessment methodology, classifying non‐technical procedural errors or disruptions of surgical flow in order of causation34. The evidence of validity for these tools was spread widely, ranging from low to high (4–12 of 15) (Table 1).
Table 1

Outcome assessment tool validity according to Messick's framework of validity

Outcome assessment tool           C   RP  IS  RV  Cq  Total
Construct
 NOTECHS
  Catchpole et al.21              3   3   0   2   1    9
  Mishra et al.22                 3   2   1   1   1    8
  McCulloch et al.18              3   3   1   1   1    9
  Mishra et al.19,*               3   3   1   2   1   10
 OTAS
  Mishra et al.19,*               3   1   1   2   2    9
  Healey et al.25                 3   2   0   1   2    8
  Undre et al.28                  3   2   0   1   2    8
Incident
 NOPE
  McCulloch et al.18              3   0   1   1   1    6
 OR distraction assessment form
  Healey et al.25                 2   2   0   1   1    6
 Flow disruptions
  Catchpole et al.24              3   3   2   2   2   12
  Catchpole et al.20,†            3   3   2   0   2   10
  Jain et al.27                   3   3   2   1   2   11
  Zheng et al.30                  1   1   1   1   0    4
  Allers et al.23                 1   1   1   1   0    4
  Weigl et al.29                  3   2   0   1   0    6
 Interference assessment form
  Healey et al.26                 2   2   1   1   1    7
Technical
 OCHRA
  Catchpole et al.21              3   2   0   2   1    8
  Mishra et al.22                 3   2   1   1   2    9
  Mishra et al.19,*               3   2   0   2   1    8
 OTE
  McCulloch et al.18              0   1   0   1   0    2
Workload
 NASA‐TLX
  Allers et al.23                 0   1   0   1   0    2
  Sexton et al.31                 3   3   3   0   1   10
 SURG‐TLX
  Weigl et al.29                  3   3   0   1   1    8

C, content; RP, response process; IS, internal structure; RV, relation to other variables; Cq, consequence.
*Subanalysis of observational data from McCulloch et al.18; †subanalysis of observational data from Catchpole et al.24.
NOTECHS, Oxford Non‐Technical Skills; OTAS, Observational Teamwork Assessment for Surgery; NOPE, non‐operative procedural error; OCHRA, Observational Clinical Human Reliability Assessment; OTE, Operative Technical Errors; NASA‐TLX, National Aeronautics and Space Administration – Task Load Index; SURG‐TLX, Surgery Task Load Index.

No study used the same categories of surgical flow disruption. The most frequent flow disruption categories defined across the nine studies were related to equipment (8 studies), external factors (8), communication (6), supervision/training (5), environment (5) and procedure (5) (Tables 2 and 3).
Table 2

Categories of flow disruption (studies assessed: McCulloch et al.18, Healey et al.25, Catchpole et al.24, Catchpole et al.20,*, Jain et al.27, Zheng et al.30, Allers et al.23, Healey et al.26, Weigl et al.29)

Category                           No. of studies
Absence                                  1
Communication                            6
Case‐irrelevant communication            4
Coordination                             4
Supervision/training                     5
Psychomotor error                        4
Resource management                      1
Procedural                               5
Planning problem                         1
Surgeon decision‐making                  2
Surgeon's position change                1
External factors                         8
External staff                           3
Environment                              5
Duty shift of nurses                     1
Interference of video monitors           3
External resource                        2
Equipment                                8
Instrument changes                       3
Robot switch                             1
Patient factors                          3
Safety consciousness                     1
Vigilance/awareness                      1

*Subanalysis of observational data from Catchpole et al.24.

Table 3

Explanation of flow disruption categories

Absence: team member not present. Example: circulating nurse out of theatre when needed.
Psychomotor error: task execution error. Example: sterile instrument dropped on floor.
Resource management: misjudgement of team members' ability. Example: surgeon leaves assistant to finish without confirming ability to do so.
Procedural: events intrinsic to the case work. Example: arterial clamp time not recorded.
Planning problem: known difficulty not taken into account. Example: difficult intubation anticipated but consequences not prepared for.
Surgeon decision‐making: technical procedural planning. Example: pause to determine next surgical step.
External factors: distraction from outside the operating theatre. Example: pager causing distraction.
External staff: disruption caused from outside the surgical team. Example: medical student interference.
Environment: room conditions impacting flow. Example: incorrect room temperature.
External resource problem: organization outside the operating theatre. Example: essential instrument missing from standard set.
Equipment: equipment malfunction. Example: energy device not working.
Robot switch: robotic instrument change. Example: switch in controls on the robotic console.
Safety consciousness: failure to comply with safety protocols. Example: team member not wearing face mask.
Vigilance/awareness: failure to notice impending danger or difficulties. Example: failure to note significant drop in arterial pressure.
Three studies included workload assessments in their methodology, using the National Aeronautics and Space Administration – Task Load Index (NASA‐TLX)35 (2) or the Surgery Task Load Index (SURG‐TLX)36 (1). These tools contained low to moderate evidence of validity (2–10 of 15) for the assessed outcome. Four studies also assessed technical performance, using the Observational Clinical Human Reliability Assessment (OCHRA)37 (3) and Operative Technical Errors (OTE)18 (1) tools, which both have low to moderate evidence of validity (2–9 of 15).

Team performance relative to surgical approach

Three studies26, 28, 29 compared team demands and/or performance between laparoscopic and open approaches to abdominal procedures, but reported differing results. No study compared robot‐assisted techniques with other approaches. Five studies investigated robot‐assisted techniques and found that this approach to abdominal procedures increases team demands24, which surgical teams were not always able to address effectively20, resulting in increased operating times27. The identification and analysis of flow disruptions can provide an evidence base for improving the efficiency and safety of robot‐assisted procedures23, 31 (Table S1, supporting information).

Discussion

The primary determinants of surgical outcome are generally perceived to be the patient's condition and the performance of the individual surgeon. Once corrected for patient risk factors, surgeons' technical skills are held accountable for variation in outcome. However, a number of different factors are important in achieving safe and effective surgical care, including infrastructure, equipment and surgical team performance38. A minimally invasive surgical procedure is conducted in a sophisticated environment combining patient factors, complex equipment and a large number of individuals set to perform both independent and team‐based tasks.

This systematic review included 16 studies objectifying and rating surgical team performance during minimally invasive abdominal procedures. The studies were of low methodological quality and heterogeneous design, and used a number of different tools to objectify team performance.

In four studies, data were obtained via audiovisual recordings of the surgical environment; despite the apparent benefits of reviewing such recordings, the majority of studies collected their data through direct observation in the operating theatre. Audiovisual recordings can be assessed by multiple, independent observers, and incidents can be reviewed several times, increasing the validity and reliability of findings39. During direct observation, by contrast, the focus of the observer may wane owing to fatigue, potentially resulting in failure to record important events40, and findings may be affected by the Hawthorne effect (change of behaviour in response to the awareness of being observed)41, 42. A prerequisite for accurate evaluation through review of audiovisual recordings, however, is the quality of the recordings; ethical considerations, potential hazards to team privacy and liability issues in case of adverse events may also limit their use43.
Another source of variation across the reviewed studies was the number and type of observers collecting observational data. Interobserver reliability should be quantified to guarantee the quality of observations and the objectivity of the rating tools used44.

This systematic review has demonstrated that, in the current literature, construct‐ and incident‐based team performance objectification tools are used most commonly. Construct‐based tools, including the OTAS and NOTECHS, rate a number of behavioural constructs on set Likert scales. These tools were developed for conventional approaches to surgery, providing global ratings for set constructs, and need to be validated for the identification of non‐technical skills in minimally invasive surgery. Although a number of studies shared similar aims, they used different categories of flow disruption, with a broad range of validity. Some of this variation in validity can be related to the nature of Messick's framework. According to Reason's organizational accident model, an adverse event is preceded by a chain of individually unimportant errors and/or latent threats that in sequence lead to an adverse event or a breach of patient safety45. Incident‐based team performance objectification methodology can provide valuable insight into these patterns and into the interplay with complex minimally invasive surgical equipment.

None of the included studies was able to relate team performance to patient outcomes. This could be caused by insufficient power of the individual studies; moreover, team performance as a determinant of morbidity and mortality is heavily confounded by patient factors and technical performance. Surrogate markers for team performance could include operating time, intraoperative adverse events, and the number and duration of procedural flow disruptions. Larger, well designed studies are needed to establish the true influence of minimally invasive techniques on team performance.
The major limitation of this review was the number and quality of available studies, providing insufficient data for a subgroup analysis or meta‐analysis of outcomes. The majority of studies examined a heterogeneous group of operations, providing limited validity for the identification of unique non‐technical skills related to specific procedures. Future studies should therefore analyse multiple approaches (open, laparoscopic, robot‐assisted) in relation to a single procedure, use multiple trained observers to collect data, preferably from audiovisual recordings of the surgical environment, quantify interobserver reliability, objectify team performance using incident‐based methodology with a predefined outcome set including causation and consequences of procedural flow disruptions, and analyse team performance in relation to direct (operating time, intraoperative adverse events) and indirect (patient morbidity and mortality) performance metrics. Such well designed studies are needed to gain insight into team performance demands unique to minimally invasive surgery in order to develop structured, evidence‐based training programmes that enhance patient safety and procedural flow.

Disclosure

The authors declare no conflict of interest.

Supporting information: Table S1 – Characteristics of included studies.