In a retrospective multicenter study, researchers from French teleradiology company Imadis led by Dr. Alexandre Ben Cheikh compared the performance of a commercial AI software application with that of emergency radiologists in more than 1,200 consecutive patients with suspected PE. Although the software did not deliver a statistically significant improvement in performance, it did demonstrate its value in low-quality exams and in helping radiologists feel more comfortable in interpreting these CTPA studies.
Example of an output provided by the AI algorithm (AIDOC Medical) to detect pulmonary embolism (PE). A: A 64-year-old man presented with spontaneous unilateral pain in the lower limb, increased by palpation, and unilateral edema. The simplified revised Geneva score was 3 and the D-dimer assay was positive. A contrast-enhanced CT pulmonary angiogram (CTPA) demonstrated a PE in the left lower lobe (blue arrow). B: On the same cross-section, the AI algorithm highlighted the same location of the suspected PE through a color-encoded map. All figures courtesy of Dr. Alexandre Ben Cheikh and colleagues and European Radiology.
"Instead of replacing radiologists, AI for PE detection appears to be a safety net in emergency radiology practice due to high sensitivity and [negative predictive value], thereby increasing the self-confidence of radiologists," the authors wrote.
The researchers retrospectively applied a PE detection algorithm from Aidoc to 1,202 patients with suspected PE from September to December 2019. Of these patients, 190 (15.8%) had true PE. They then compared the AI results with the gold standard, which was determined via a retrospective review of the CTPA exam, radiological and clinical reports, AI outputs, and patient outcomes.
Table: Performance of the AI algorithm for detecting PE on CTPA exams, including positive predictive value and negative predictive value.
The algorithm identified 19 PEs that were missed by the radiologists -- a rate of approximately one misdiagnosed PE for every 63 CTPA scans. Although the AI produced higher sensitivity and negative predictive value than the emergency radiologists in the study, those differences did not reach statistical significance. The radiologists' higher specificity, positive predictive value, and accuracy, however, were statistically significant.
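The sensitivity, specificity, predictive values, and accuracy compared above are all derived from a standard 2x2 confusion matrix of reader calls against the reference standard. As a minimal sketch, with hypothetical error counts (the study's full 2x2 table is not reproduced here; only the 190-in-1,202 prevalence is taken from the article), the metrics can be computed as:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic performance metrics from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),                # true-positive rate
        "specificity": tn / (tn + fp),                # true-negative rate
        "ppv": tp / (tp + fp),                        # positive predictive value
        "npv": tn / (tn + fn),                        # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),  # overall agreement
    }

# Hypothetical split of the 190 true PEs among 1,202 exams, for illustration only.
m = diagnostic_metrics(tp=180, fp=30, fn=10, tn=982)
print({k: round(v, 3) for k, v in m.items()})
```

The asymmetry the study reports falls out of these definitions: a reader who rarely misses a PE (few false negatives) scores high on sensitivity and NPV even if extra false positives drag down specificity and PPV.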
"However, our results were more contrasted for poor-quality examinations and for the appreciation of AI tools by radiologists," the authors wrote. "Indeed, radiologists stressed the importance of AI to strengthen their conclusions, especially to confirm negative findings, or to ensure the absence of distal PE in poor-quality examinations."
In a small cohort of 67 patients with poor-quality exams, the AI algorithm yielded similar accuracy to the radiologists but higher sensitivity and negative predictive value. Although the average interpretation time for CTPA alone increased by one minute and three seconds (7%) after the adoption of AI, 57 (72.2%) of the 79 radiologists who responded to satisfaction surveys deemed the availability of AI to be positive or strongly positive for their diagnostic confidence.
Clinical examples. A: A 71-year-old patient with a medical history of cancer and recent surgery presented with a heart rate > 95 beats per minute and borderline oxygen saturation and underwent CTPA. The exam showed a segmental, subacute PE in the right lower lobe, which was missed by the emergency radiologist during on-call duty (red arrow). B: On the same cross-section, the PE was correctly identified by the AI algorithm (AIDOC Medical). C: Opposite example, in which PEs were correctly diagnosed by the emergency radiologist but not by the AI algorithm. An 85-year-old patient with a medical history of PE and recent surgery presented with a heart rate between 75 and 94 beats per minute and acute dyspnea and underwent CTPA. Two segmental PEs were correctly diagnosed by the emergency radiologist but missed by the AI algorithm (white arrows).
In addition to confirming the high diagnostic performance of AI algorithms for diagnosing PE on CTPA exams, the study demonstrated how AI can best support radiologists, according to the researchers. The algorithm showed particular value in poor-quality examinations and by increasing diagnostic confidence via its high sensitivity and negative predictive value.
"Thus, our work provides more scientific ground for the concept of 'AI-augmented' radiologists instead of supporting the theory of radiologists' replacement by AI," the authors wrote.
Copyright © 2022 AuntMinnieEurope.com