OBJECTIVE: Human factors and teamwork are major contributors to sentinel events. A major limitation to improving human factors and teamwork is the paucity of objective, validated measurement tools. Our goal was to develop a brief tool that could be used to objectively evaluate teamwork in the field during short clinical team simulations and in everyday clinical care.
STUDY DESIGN: A pilot validation study. Standardized videos were created demonstrating poor, average, and excellent teamwork among an obstetric team in a common clinical scenario (shoulder dystocia). Three evaluators, all trained in Crew Resource Management and unaware of the assigned teamwork level, independently reviewed the videos and evaluated teamwork using the Clinical Teamwork Scale (CTS). Statistical analysis included calculation of the Kappa statistic and the Kendall coefficient of concordance to evaluate agreement and score concordance among raters, and the Intraclass Correlation Coefficient (ICC) to evaluate interrater reliability. The reliability of the tool was further evaluated by estimating the variance of each component of the tool based on generalizability theory.
RESULTS: There was substantial agreement (Kappa 0.78) and score concordance (Kendall coefficient 0.95) among raters, and excellent interrater reliability (ICC 0.98). The largest share of score variance among raters was attributable to rater-item interaction.
CONCLUSION: The CTS was developed to efficiently measure key clinical teamwork skills during simulation exercises and in everyday clinical care. It contains 15 questions in 5 clinical teamwork domains (communication, situational awareness, decision-making, role responsibility, and patient friendliness). It is easy to use and demonstrated construct validity, with median ratings consistently corresponding to the intended teamwork level. The CTS is a brief, straightforward, valid, reliable, and easy-to-use tool to measure key factors in teamwork in simulated and clinical settings.
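The agreement statistics named in the abstract can be illustrated with a minimal pure-Python sketch of two of them: Cohen's kappa (pairwise chance-corrected agreement) and Kendall's W (rank concordance across raters). This is illustrative only, not the authors' analysis code; note the study used three raters, for which Fleiss' kappa or a similar multi-rater generalization of kappa would apply, and the ICC was estimated separately.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters over categorical labels."""
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    p_exp = sum(c1[k] * c2[k] for k in c1) / (n * n)         # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

def kendalls_w(ratings):
    """Kendall's coefficient of concordance for m raters ranking n items.

    `ratings` is a list of per-rater rank lists (ranks 1..n, no ties).
    W ranges from 0 (no concordance) to 1 (perfect concordance)."""
    m, n = len(ratings), len(ratings[0])
    totals = [sum(r[i] for r in ratings) for i in range(n)]  # rank sum per item
    mean_total = sum(totals) / n
    s = sum((t - mean_total) ** 2 for t in totals)           # spread of rank sums
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Hypothetical ratings of three videos ("poor", "average", "excellent"):
kappa = cohens_kappa(["poor", "average", "excellent"],
                     ["poor", "average", "excellent"])        # perfect agreement -> 1.0
w = kendalls_w([[1, 2, 3], [1, 2, 3], [1, 2, 3]])             # perfect concordance -> 1.0
```

Both functions operate on raw rating lists, so the same sketch extends naturally to any set of simulated scenarios scored by multiple raters.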