Silvia Bagnera1, Francesca Bisanti2, Claudia Tibaldi3, Massimo Pasquino4, Giulia Berrino5, Roberta Ferraro6, Sebastiano Patania1. 1. SC Ciriè Radiology and Senology SSD, ASL TO4, Via Cotonificio, Strambino, Italy. 2. SC Chivasso Radiology, ASL TO4, Corso G. Ferraris, Chivasso, Italy. 3. SC Radiology Ivrea, ASL TO4, Piazza Credenza, Ivrea, Italy. 4. Department of Health Physics, ASL TO4, Via Natalia Ginzburg, Ivrea, Italy. 5. SC Radiology Ciriè, ASL TO4, Via Battitore, Ciriè, Italy. 6. Senology SSD, ASL TO4, Via Cotonificio, Strambino, Turin, Italy.
Abstract
OBJECTIVES: The purpose of this study is to assess the performance of radiologists using new software, called "COVID-19 score", when reporting chest radiographs of patients potentially infected with coronavirus disease 2019 (COVID-19) pneumonia. Chest radiography (chest X-ray, CXR) and CT are important for the imaging diagnosis of COVID-19 pneumonia. Mobile CXR devices are efficient during epidemics because they reduce the risk of contagion and are easy to sanitize. MATERIALS AND METHODS: From February to April 2020, 14 radiologists retrospectively evaluated a pool of 312 chest X-ray exams to test a new software function for lung imaging analysis based on radiological features graded on a three-point scale. The tool automatically generates a cumulative score (0-18). The inter-rater agreement (evaluated with Fleiss' kappa) and the average time needed to compile the reporting banner were calculated. RESULTS: Fourteen radiologists evaluated 312 chest radiographs of patients with suspected COVID-19 pneumonia (80 males and 38 females) with an average age of 64.47 years. The inter-rater agreement showed a Fleiss' kappa of 0.53, and the intra-group agreement ranged from 0.49 to 0.59, indicating moderate agreement (taking 0.4-0.6 as the "moderate" range). Years of work experience had no significant effect. The average time to obtain the result with the automatic software ranged from 7 s (e.g., a COVID-19 score of zero) to 21 s (e.g., a COVID-19 score of 6 to 12). CONCLUSION: The use of automatic software to generate a CXR "COVID-19 score" proved simple, fast, and reproducible. Implementing this tool with scores weighted by the number of pathological lung areas could provide a useful parameter for clinical monitoring.
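The agreement statistic reported above, Fleiss' kappa, can be computed directly from a subjects-by-categories count matrix, where each row records how many of the n raters assigned that radiograph to each score category. The sketch below is illustrative only: it is not the authors' software, and the example matrices are invented, not study data.

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a subjects-by-categories count matrix.

    ratings[i][j] = number of raters who assigned subject i to category j.
    Every row must sum to the same number of raters n (n >= 2).
    """
    N = len(ratings)                 # number of subjects (radiographs)
    n = sum(ratings[0])              # number of raters per subject
    k = len(ratings[0])              # number of categories

    # Mean observed per-subject agreement P_bar
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings
    ) / N

    # Chance agreement P_e from overall category proportions
    totals = [sum(row[j] for row in ratings) for j in range(k)]
    P_e = sum((t / (N * n)) ** 2 for t in totals)

    return (P_bar - P_e) / (1 - P_e)
```

For example, with three raters in perfect agreement on two subjects (`[[3, 0], [0, 3]]`) kappa is 1.0, while a split such as `[[2, 1], [1, 2]]` yields a negative kappa, i.e. agreement below chance.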