Aim: The study aimed to evaluate the reliability and usability of the CARE-Radiology checklist in assessing radiological case reports and to provide a basis for its broader adoption and optimization.
Methods: Ten randomly selected radiological case reports published in scientific journals in 2020 were evaluated using the CARE-Radiology checklist. Twenty-six experts from 10 countries were invited to independently assess all ten reports. The reliability of the checklist was measured using Fleiss' kappa and Cronbach's alpha coefficient. Usability was evaluated by recording the time taken to complete the assessments and by asking the evaluators to rate the ease of use of each item on a Likert scale.
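For readers unfamiliar with the two reliability statistics named above, the following is a minimal, hypothetical Python sketch of how Fleiss' kappa and Cronbach's alpha can be computed from a ratings matrix. It is not the study's analysis code; the function names and the example data are invented for illustration only.

```python
import numpy as np

def fleiss_kappa(counts):
    """counts: (n_subjects, n_categories) matrix; each cell is the number of
    raters who assigned that subject to that category (equal raters assumed)."""
    counts = np.asarray(counts, dtype=float)
    n_subjects = counts.shape[0]
    n_raters = counts.sum(axis=1)[0]
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)        # category proportions
    P_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), np.square(p_j).sum()              # observed vs. chance agreement
    return (P_bar - P_e) / (1 - P_e)

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()                # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)                  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical example: 5 case-report items, each judged by 4 raters as met/not met.
ratings = [[3, 1], [4, 0], [2, 2], [3, 1], [4, 0]]
print(round(fleiss_kappa(ratings), 2))
```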
Results: The median time for evaluating one radiological case report was 15 min. The overall agreement among evaluators showed moderate reliability, with a kappa value of 0.47 and a Cronbach's alpha of 0.51. The mean compliance rate across CARE-Radiology items was 61.8%, with some items exceeding 90% compliance; items related to abstracts and keywords had the lowest compliance rates. The evaluators found most items easy to understand, with a few exceptions.
Conclusions: The CARE-Radiology checklist is relatively easy for researchers to use and understand. Continuous feedback is necessary to inform future revisions and updates, enhance the effectiveness of the checklist, and improve the user experience.
Keywords: CARE; case report; radiology; reliability; usability.