Joanna Gullberg1, Ayman Al-Okshi2, Dalia Homar Asan1, Anita Zainea1, Daniel Sundh3, Mattias Lorentzon3,4,5, Christina Lindh1. 1. Faculty of Odontology, Malmö University, Malmö, Sweden. 2. Department of Oral Medicine and Radiology, Faculty of Dentistry, Sebha University, Sebha, Libya. 3. Sahlgrenska Osteoporosis Centre, Department of Internal Medicine and Clinical Nutrition, Institute of Medicine, University of Gothenburg, Gothenburg, Sweden. 4. Region Västra Götaland, Department of Geriatric Medicine, Sahlgrenska University Hospital, Mölndal, Sweden. 5. Mary MacKillop Institute for Health Research, Australian Catholic University, Melbourne, VIC, Australia.
Abstract
OBJECTIVES: The purpose of this study was to evaluate rater agreement and the accuracy of a semi-automated software program, and of its fully automated tool, for osteoporosis risk assessment in intraoral radiographs. METHODS: A total of 567 intraoral radiographs were selected retrospectively from women aged 75-80 years participating in a large population-based study (SUPERB) based in Gothenburg, Sweden. Five raters assessed participants' risk of osteoporosis in the intraoral radiographs using the semi-automated software. Assessments were repeated after 4 weeks on 121 radiographs (20%) randomly selected from the original 567. The radiographs were also assessed with the software's fully automated analysis tool. RESULTS: Overall interrater agreement for the five raters was 0.37 (95% CI 0.32-0.41); with the fully automated tool included as a 'sixth rater', the overall kappa was 0.34 (95% CI 0.30-0.38). Intrarater agreement varied from moderate to substantial according to the Landis and Koch interpretation scale. Diagnostic accuracy was calculated against the reference standard for osteoporosis diagnosis (T-score values for the spine, total hip, and femoral neck) and is presented as sensitivities, specificities, predictive values, likelihood ratios, and diagnostic odds ratios. The mean sensitivity across all raters, including the fully automated tool, was 40.4% (range 14.3%-57.6%); the corresponding mean specificity was 69.5% (range 59.7%-90.4%). The diagnostic odds ratios ranged between 1 and 2.7. CONCLUSION: The low diagnostic odds ratios and low agreement between raters in osteoporosis risk assessment using the software for analysis of the trabecular pattern in intraoral radiographs show that more work is needed to optimise the automation of trabecular pattern analysis in intraoral radiographs.
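The interrater and intrarater agreement reported above is a chance-corrected kappa statistic. The study uses an overall multi-rater kappa; as an illustration of the underlying idea, the following is a minimal sketch of Cohen's kappa, the pairwise analogue, computed for two hypothetical raters classifying radiographs as at-risk or not (the rating data here are invented for demonstration, not taken from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items with nominal categories."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal category frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical ratings: agreement on 3 of 4 radiographs
a = ["risk", "no risk", "no risk", "risk"]
b = ["risk", "no risk", "risk", "risk"]
print(cohens_kappa(a, b))  # → 0.5
```

On the Landis and Koch scale referenced in the abstract, a kappa of 0.5 would be interpreted as moderate agreement, while the study's interrater value of 0.37 falls in the fair range.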
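The diagnostic odds ratio summarises sensitivity and specificity in a single figure: it is the ratio of the positive to the negative likelihood ratio. A minimal sketch, applied to the mean values reported in the abstract (sensitivity 40.4%, specificity 69.5%):

```python
def diagnostic_odds_ratio(sensitivity, specificity):
    """DOR = LR+ / LR- = (sens / (1 - spec)) / ((1 - sens) / spec)."""
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
    return lr_pos / lr_neg

# Mean sensitivity and specificity from the abstract
print(round(diagnostic_odds_ratio(0.404, 0.695), 2))  # → 1.54
```

A DOR of 1 means the test does not discriminate at all between diseased and non-diseased subjects, so the reported range of 1 to 2.7 indicates weak discriminative ability.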
Keywords:
Computer-Assisted Image Analysis; Data Accuracy; Dental Digital Radiography; Osteoporosis; Risk Assessment