AIM: To compare levels of agreement amongst paediatric clinicians with those amongst consultant paediatric radiologists when interpreting chest radiographs (CXRs). MATERIALS AND METHODS: Four paediatric radiologists used picture archiving and communication system (PACS) workstations to evaluate independently, in each of 30 CXRs, the presence of five radiological features of infection. The radiographs were obtained over 1 year (2008) from children with fever and signs of respiratory distress, aged 6 months to <16 years. The same CXRs were interpreted a second time by the paediatric radiologists and by 21 clinicians with varying levels of experience, using the Web 1000 viewing system and a projector. Intra- and interobserver agreement within groups, split by grade and specialty, was analysed using free-marginal multi-rater kappa. RESULTS: Normal CXRs were identified consistently amongst all 25 participants. The four paediatric radiologists showed high levels of intraobserver agreement between methods (kappa scores between 0.53 and 1.00) and interobserver agreement for each method (kappa scores between 0.67 and 0.96 for PACS assessment). The 21 clinicians showed varying levels of agreement, from 0.21 to 0.89. CONCLUSION: Paediatric radiologists showed high levels of agreement for all features. In general, the clinicians had lower levels of agreement than the radiologists. This study highlights the need for improved training in CXR interpretation for clinicians, and for timely reporting of CXRs by radiologists, to allow appropriate patient management.
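The free-marginal multi-rater kappa used in the analysis (Randolph's formulation) compares observed pairwise rater agreement against a fixed chance level of 1/q for q rating categories. A minimal Python sketch of that calculation is below; the function name, the example tallies, and the assumption of a constant number of raters per case are illustrative, not taken from the study's data.

```python
def free_marginal_kappa(counts, q):
    """Randolph's free-marginal multi-rater kappa.

    counts: per-subject category tallies, e.g. [[4, 0], [3, 1], ...],
            where each inner list sums to the number of raters n.
    q: number of rating categories (e.g. 2 for feature present/absent).
    """
    N = len(counts)          # number of subjects (e.g. radiographs)
    n = sum(counts[0])       # raters per subject (assumed constant)
    # Observed agreement: fraction of agreeing rater pairs, averaged over subjects.
    p_o = sum(sum(c * (c - 1) for c in row) for row in counts) / (N * n * (n - 1))
    p_e = 1.0 / q            # chance agreement with free marginals
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: four raters scoring a binary feature on three films.
# Unanimous agreement on film 1, a 3-1 split on film 2, a 2-2 split on film 3.
print(round(free_marginal_kappa([[4, 0], [3, 1], [2, 2]], q=2), 3))  # → 0.222
```

Because chance agreement is fixed at 1/q rather than estimated from the raters' marginal frequencies, this statistic avoids the prevalence-related paradoxes of Fleiss' kappa when raters are free to assign any number of cases to each category.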
Authors: Clarissa Valim; Rushdy Ahmad; Miguel Lanaspa; Yan Tan; Sozinho Acácio; Michael A Gillette; Katherine D Almendinger; Danny A Milner; Lola Madrid; Karell Pellé; Jaroslaw Harezlak; Jacob Silterra; Pedro L Alonso; Steven A Carr; Jill P Mesirov; Dyann F Wirth; Roger C Wiegand; Quique Bassat Journal: Am J Respir Crit Care Med Date: 2016-02-15 Impact factor: 21.405
Authors: Emma Taylor; Kathryn Haven; Peter Reed; Ange Bissielo; Dave Harvey; Colin McArthur; Cameron Bringans; Simone Freundlich; R Joan H Ingram; David Perry; Francessa Wilson; David Milne; Lucy Modahl; Q Sue Huang; Diane Gross; Marc-Alain Widdowson; Cameron C Grant Journal: BMC Med Imaging Date: 2015-12-29 Impact factor: 1.930