Ernest U Ekpo(1,2), Ujong Peter Ujong(3), Claudia Mello-Thoms(1), Mark F McEntee(1). 1. Discipline of Medical Radiation Sciences, Faculty of Health Sciences, University of Sydney, M205, Cumberland Campus, 75 East St, Lidcombe, Sydney, NSW 2141, Australia. 2. Department of Radiography and Radiology, Faculty of Health Sciences, University of Calabar, Calabar, Nigeria. 3. Faculty of Medical Sciences, Cross River University of Technology, Okuku, Nigeria.
Abstract
OBJECTIVE: The objective of the present study was to assess interradiologist agreement regarding mammographic breast density assessment performed using the rating scale outlined in the fifth edition of the BI-RADS atlas of the American College of Radiology.
MATERIALS AND METHODS: Breast density assessments of 1000 cases were conducted by five radiologists from the same institution who together had recently undergone retraining in mammographic breast density classification based on the fifth edition of BI-RADS. The readers assigned breast density grades (A-D) on the basis of the BI-RADS classification scheme. Repeat assessment of 100 cases was performed by all readers 1 month after the initial assessment. A weighted kappa was used to calculate intrareader and interreader agreement.
RESULTS: Intrareader agreement ranged from a kappa value of 0.86 (95% CI, 0.77-0.93) to 0.89 (95% CI, 0.81-0.95) on a four-category scale (categories A-D) and from 0.89 (95% CI, 0.86-0.92) to 0.94 (95% CI, 0.89-0.97) on a two-category scale (category A-B vs category C-D). Interreader agreement ranged from substantial (κ = 0.76; 95% CI, 0.73-0.78) to almost perfect (κ = 0.87; 95% CI, 0.86-0.89) on a four-category scale, and the overall weighted kappa value was substantial (0.79; 95% CI, 0.78-0.83). Interreader agreement on a two-category scale ranged from a kappa value of 0.85 (95% CI, 0.83-0.86) to 0.91 (95% CI, 0.90-0.92), and the overall weighted kappa was 0.88 (95% CI, 0.87-0.89).
CONCLUSION: Overall, with regard to mammographic breast density classification, radiologists had substantial interreader agreement when a four-category scale was used and almost perfect interreader agreement when a dichotomous scale was used.
Keywords:
BI-RADS fifth edition; breast density assessment; interreader agreement; radiologists' agreement