Anna Holdgate1, Natasha Ching, Lara Angonese. 1. Department of Emergency Medicine, Emergency Medicine Research Unit, Liverpool Hospital, Liverpool BC, NSW, Australia. anna.holdgate@swsahs.nsw.gov.au
Abstract
OBJECTIVE: To assess the interrater reliability of the Glasgow Coma Scale (GCS) between nurses and senior doctors in the ED. METHODS: This was a prospective observational study with a convenience sample of patients aged 18 years or above who presented with a decreased level of consciousness to a tertiary hospital ED. A senior ED doctor (emergency physician or trainee) and a registered nurse each independently scored the patient's GCS in a blinded fashion within 15 min of each other. The data were then analysed to determine interrater reliability using the weighted kappa statistic, and the size and direction of differences between paired scores were examined. RESULTS: A total of 108 eligible patients were enrolled, with GCS scores ranging from 3 to 14. Interrater agreement was excellent (weighted kappa > 0.75) for verbal scores and total GCS scores, and intermediate (weighted kappa 0.4-0.75) for motor and eye scores. Total GCS scores differed by more than two points in 10 of the 108 patients. Interrater agreement did not vary substantially across the range of actual numeric GCS scores. CONCLUSIONS: Although the level of agreement for GCS scores was generally high, a significant proportion of patients had GCS scores which differed by two or more points. This degree of disagreement indicates that clinical decisions should not be based solely on single GCS scores.
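The analysis above rests on the weighted kappa statistic, which credits raters for near-misses on an ordinal scale (such as the GCS, 3-15) rather than counting only exact matches. As a rough illustration only, the following is a minimal sketch of a linearly weighted Cohen's kappa; the abstract does not state which weighting scheme (linear or quadratic) the authors used, and the paired-score data here are invented for demonstration.

```python
from collections import Counter

def weighted_kappa(rater_a, rater_b, categories):
    """Linearly weighted Cohen's kappa for two raters on an ordinal scale.

    rater_a, rater_b: equal-length sequences of paired scores.
    categories: the full ordered list of possible scores (e.g. GCS 3..15).
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)

    # Agreement weights: 1 on the diagonal, decreasing linearly with
    # the distance between the two raters' categories.
    def w(i, j):
        return 1 - abs(i - j) / (k - 1)

    # Observed weighted agreement from the joint frequencies.
    joint = Counter(zip(rater_a, rater_b))
    po = sum(w(idx[a], idx[b]) * c / n for (a, b), c in joint.items())

    # Expected weighted agreement from the two marginal distributions.
    pa, pb = Counter(rater_a), Counter(rater_b)
    pe = sum(w(idx[a], idx[b]) * (pa[a] / n) * (pb[b] / n)
             for a in categories for b in categories)

    return (po - pe) / (1 - pe)

# Hypothetical paired GCS scores (doctor vs. nurse), not data from the study.
gcs_range = list(range(3, 16))
doctor = [3, 6, 9, 12, 14, 7]
nurse = [4, 6, 9, 11, 14, 7]
print(weighted_kappa(doctor, nurse, gcs_range))
```

Perfectly matching score lists yield kappa = 1, and larger score gaps are penalised more heavily, which is why the statistic suits ordinal measures like the GCS better than unweighted kappa.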