Objectives: To evaluate whether deep learning (DL) can accurately differentiate green colour coding indicative of tophi from clumpy artifacts in dual-energy computed tomography (DECT) scans.
Methods: We analysed 18 704 regions of interest (ROIs) extracted from green foci in DECT scans of 47 patients with gout and 27 gout-free controls. The ROIs were categorized into three size groups: small, medium, and large. A convolutional neural network (CNN) was applied on a per-lesion basis and a support vector machine (SVM) on a per-patient basis. The models were compared on the area under the receiver operating characteristic curve, sensitivity, specificity, positive predictive value, and negative predictive value.
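The abstract does not include the authors' code; as a minimal sketch of the per-lesion evaluation described above, the reported metrics can be derived from a binary confusion matrix as follows (the counts used here are illustrative, not the study's data):

```python
# Hedged sketch, not the authors' implementation: diagnostic metrics
# named in the Methods, computed from binary confusion-matrix counts.

def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity, specificity, PPV, NPV, and accuracy from counts."""
    return {
        "sensitivity": tp / (tp + fn),          # true-positive rate
        "specificity": tn / (tn + fp),          # true-negative rate
        "ppv": tp / (tp + fp),                  # positive predictive value
        "npv": tn / (tn + fn),                  # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Example with made-up counts (hypothetical lesion classifications)
m = diagnostic_metrics(tp=90, fp=10, tn=80, fn=20)
print({k: round(v, 3) for k, v in m.items()})
```

The same formulas apply to the per-patient SVM analysis, with patients rather than lesions as the classification unit.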
Results: For small, medium, and large ROIs, the CNN model achieved sensitivities of 81.5%, 82.7%, and 91.8% and specificities of 96.1%, 96.1%, and 86.9%, respectively, with corresponding accuracies of 88.5%, 88.6%, and 91.0%. In the per-patient analysis, the SVM approach distinguished patients with gout from gout-free controls with a sensitivity of 87.2%, a specificity of 100%, and an accuracy of 91.8%.
Conclusion: Our study demonstrates the effectiveness of the DL algorithm in differentiating green colour coding indicative of crystal deposition from clumpy artifacts in DECT scans. With high sensitivity, specificity, and accuracy, the use of DL in DECT for diagnosing gout enables precise lesion classification, facilitating early-stage diagnosis and timely intervention.
Keywords: Artificial Intelligence; Computed Tomography; Dual-energy Computed Tomography; Gout Diagnostic Imaging; Sensitivity; Specificity.
© The Author(s) 2024. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For permissions, please email: journals.permissions@oup.com.