Gullberg J, Al-Okshi A, Homar Asan D, Zainea A, Sundh D, Lorentzon M, Lindh C.
Dentomaxillofac Radiol. 2021 Jul 29:20210175. doi: 10.1259/dmfr.20210175. Online ahead of print.
Abstract
OBJECTIVES: The purpose of this study was to evaluate rater agreement and the accuracy of a semi-automated software program and its fully automated tool for osteoporosis risk assessment in intraoral radiographs.

METHODS: A total of 567 intraoral radiographs were selected retrospectively from women aged 75-80 years participating in a large population-based study (SUPERB) based in Gothenburg, Sweden. Five raters assessed the participants’ risk of osteoporosis in the intraoral radiographs using the semi-automated software. Assessments were repeated after 4 weeks on 121 radiographs (20%) randomly selected from the original 567. The radiographs were also assessed by the software’s fully automated analysis tool.

RESULTS: Overall interrater agreement for the five raters was 0.37 (95% CI 0.32-0.41); with the fully automated tool included as a ‘sixth rater’, the overall kappa was 0.34 (0.30-0.38). Intrarater agreement varied from moderate to substantial according to the Landis and Koch interpretation scale. Diagnostic accuracy was calculated against the reference standard for osteoporosis diagnosis (T-score values for the spine, total hip and femoral neck) and is presented as sensitivities, specificities, predictive values, likelihood ratios and odds ratios. The mean sensitivity across all raters, including the fully automated tool, was 40.4% (range 14.3%-57.6%); the corresponding mean specificity was 69.5% (range 59.7%-90.4%). The diagnostic odds ratios ranged between 1 and 2.7.

CONCLUSION: The low diagnostic odds ratios and the low agreement between raters in osteoporosis risk assessment using the software for analysis of the trabecular pattern in intraoral radiographs show that more work is needed to optimise the automation of trabecular pattern analysis in intraoral radiographs.
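Note: the reported accuracy measures all derive from a 2x2 table of rater assessment versus the T-score-based reference standard; in particular, the diagnostic odds ratio equals the positive likelihood ratio divided by the negative likelihood ratio. The short Python sketch below illustrates these relationships only; it is not the authors' analysis code, and the counts used in the example are hypothetical, not taken from the study.

    # Illustrative only: how sensitivity, specificity, predictive values,
    # likelihood ratios and the diagnostic odds ratio (DOR) relate in a
    # 2x2 table of rater assessment vs. the T-score reference standard.
    def diagnostic_measures(tp: int, fp: int, fn: int, tn: int) -> dict:
        sensitivity = tp / (tp + fn)               # true positive rate
        specificity = tn / (tn + fp)               # true negative rate
        ppv = tp / (tp + fp)                       # positive predictive value
        npv = tn / (tn + fn)                       # negative predictive value
        lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
        lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
        dor = lr_pos / lr_neg                      # equals (tp*tn)/(fp*fn)
        return {"sensitivity": sensitivity, "specificity": specificity,
                "PPV": ppv, "NPV": npv, "LR+": lr_pos, "LR-": lr_neg, "DOR": dor}

    # Hypothetical counts chosen for illustration (not study data):
    # sensitivity = 0.40, specificity = 0.70, DOR ~ 1.6
    print(diagnostic_measures(tp=40, fp=60, fn=60, tn=140))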
Link/DOI: 10.1259/dmfr.20210175