ECR: What's the follow-up when radiologists reject AI findings?

By Liz Carey, Feature Writer

Radiologists' agreement or disagreement with AI findings will play a key role in post-market surveillance and vendor collaboration, according to a 5 March discussion at ECR 2026.

Among those leading the session, Dr. Sergey Morozov, PhD, highlighted a pilot project led by Prof. Dr. Peter Mildenberger (ECR 2026 Gold Medalist) that aims to build a feedback protocol into PACS so that radiologists can reject AI findings they consider clearly wrong.

"But there should be somebody who will check what's actually rejected," Morozov stressed.

Could a disagreement with AI be used against a radiologist? Some attendees raised that concern, and at present very few radiologists raise an issue or report it to a vendor at all. Even so, radiologists reluctant to put their disagreement on record need to see the bigger picture, explained Dr. Hugh Harvey of Hardian Health.

Feedback trends

"It's not so much the single data points that are important, it's the trends," said Harvey, a radiologist and NHS consultant at Guy’s and St Thomas’ NHS Foundation Trust. "If every radiologist in your department is disagreeing consistently with the AI, then you have an AI problem. Whereas if one radiologist is an outlier and the only one disagreeing the whole time, then maybe you have a problem with that radiologist."
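Harvey's distinction between a department-wide trend and a single outlier can be sketched as a simple aggregation of rejection rates. This is a minimal illustration, not any vendor's actual method; the function name, thresholds, and data shapes are all hypothetical.

```python
def flag_feedback_trends(rejections, total_reads,
                         dept_threshold=0.5, outlier_factor=2.0):
    """Classify AI-rejection feedback as a likely AI problem, a likely
    reader problem, or no trend. Thresholds are illustrative assumptions.

    rejections:  dict mapping radiologist -> count of rejected AI findings
    total_reads: dict mapping radiologist -> count of AI-assisted reads
    """
    # Per-reader and department-wide rejection rates
    rates = {r: rejections.get(r, 0) / total_reads[r] for r in total_reads}
    dept_rate = sum(rejections.values()) / sum(total_reads.values())

    # Everyone disagreeing consistently suggests the AI is the problem
    if dept_rate > dept_threshold:
        return "AI problem: department-wide disagreement"

    # One reader rejecting far more often than peers suggests an outlier
    outliers = [r for r, rate in rates.items()
                if dept_rate > 0 and rate > outlier_factor * dept_rate]
    if outliers:
        return "review outlier readers: " + ", ".join(sorted(outliers))

    return "no trend flagged"
```

For example, a department where one reader rejects 40% of AI findings while colleagues reject 2% to 3% would flag that reader for review, whereas uniform rejection across all readers would flag the AI itself.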

In daily practice, radiologists may be confronted with an AI result they consider wrong. What to do with that feedback presents a problem for vendors. Some make reporting easy, others do not, and busy radiologists are unlikely to dig through documentation for an email address just to send a single case report, Harvey said.

"I would argue that's not a proportionate postmarket process for what are high-risk devices," Harvey said. "Vendors need to consider how they can enable direct feedback on a case-by-case basis. If that vendor is not able to provide that, they need to be partnering with the deployment platforms, either the PACS vendors or now these new AI marketplace vendors, so they can put in place a process where a radiologist from within their workflow can raise an issue."

The issue might be a simple usability issue, or it might be an adverse event where a patient has come to harm, he noted.

Feedback loops

"What's interesting is if you look at postmarket data that is actually available and published, you'd think that AI is incredibly safe in radiology because very few people are publishing these reports," Harvey said. "But I think we know, and those radiologists who use AI, they know they're seeing maybe anywhere up to 5% to 10% discrepancy rate in their reports. And not all of them matter; a lot are harmless, but they all need to be reported, and they all need to be collected."

Morozov, now an independent AI consultant based in Belgium and head of research and development at the 3R Swiss Imaging Network in Sion, Switzerland, offered guidance on radiology AI implementation. During his talk, he noted that multiple metrics are suitable for dashboards used to govern and manage AI deployments. He described a system of multiple layers designed to control the model and the data, and to monitor and act on the model's performance.

"Potentially, we can implement complex systems which rely on statistical methods to analyze data, methods to analyze concordance, to trigger alerts, involve humans for the review, or even launch automated retraining," Morozov explained. 
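One such monitoring layer could be sketched as a rolling concordance check that triggers an alert and routes cases for human review when agreement drops. This is a speculative illustration of the kind of system Morozov describes, not his implementation; the class name, window size, and threshold are assumptions.

```python
from collections import deque

class ConcordanceMonitor:
    """Hypothetical monitoring layer: track radiologist-AI concordance over
    a rolling window and trigger an alert below a threshold. The window
    size and threshold here are illustrative, not published values."""

    def __init__(self, window=200, alert_threshold=0.90):
        self.window = deque(maxlen=window)       # recent agree/disagree flags
        self.alert_threshold = alert_threshold

    def record(self, radiologist_agreed: bool):
        """Log one read: True if the radiologist accepted the AI finding."""
        self.window.append(radiologist_agreed)

    def concordance(self) -> float:
        """Fraction of recent reads where radiologist and AI agreed."""
        return sum(self.window) / len(self.window) if self.window else 1.0

    def check(self) -> str:
        """Only act once the window is full, so early noise is ignored."""
        if (len(self.window) == self.window.maxlen
                and self.concordance() < self.alert_threshold):
            return "alert: queue cases for human review"
        return "ok"
```

In a fuller system, the alert branch is where the statistical analysis, human review, or automated retraining Morozov mentions would be invoked.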

Empowered committees

"How to make it sustainable is to ensure that you have AI governance committee which is empowered to make the decisions, which is empowered to monitor the parameters, to analyze the reports, and eventually to make the decisions about decommissioning or starting a new AI project or launching, for example, a new training program or interacting with vendors," Morozov added.

Vendor connection is crucial, Morozov continued, particularly in postmarket surveillance. 

All of this leads to the question of what is the reasonable level of follow-up after the radiologist clicks the "reject AI" button.

Radiology department culture and leadership play a key role in attitudes toward contradicting opinions, Morozov added, highlighting the importance of vision, quality assurance, and quality management activities.

