As the use of low-dose chest CT imaging and AI tools increases, the future of chest x-ray and how it is reported is now in the spotlight.
Delegates at Friday’s session, “The changing art of chest x-ray reporting,” heard how the role of chest x-ray will narrow and serve as a gatekeeper for CT. Furthermore, the use of AI for interpretation and reporting will necessitate stricter training and accountability pathways, with the human ultimately responsible for AI misses.
Dr. Thomas Frauenfelder gives an overview of chest x-ray vs. chest CT in Europe. Image courtesy of the ESR and Dr. Thomas Frauenfelder.
“Chest radiography is the paradox of imaging -- it’s considered basic, and yet I think it remains one of the hardest exams to interpret well,” noted Prof. Annemiek Snoeckx, chair of radiology at Antwerp University Hospital, in her presentation “Who will be the expert teacher if radiologists are no longer the experts?”
She pointed to the two-pronged fear that AI will deskill radiologists: The first concern is that they will stop looking carefully in favor of trusting the tool, and the second is that trainees will not develop deep pattern recognition because the system flags everything, so the “art” disappears. On the flip side, AI could upskill radiologists by catching subtle disease and reducing fatigue errors, meaning that they spend less time on “normal” studies and more time on abnormal ones.
“Which one happens is not a property of AI, it’s a property of how we implement it and teach with it,” she noted.
Complacency risks
However, a new learning problem is emerging: In 10 years, trainees will be learning in a world where AI is already present. Whereas trainees previously detected abnormalities themselves and learned from errors and feedback, in the very near future they will simply confirm the AI output and move on, creating a risk of automation bias and excessive trust.
Radiologists can’t be on autopilot in the face of AI annotations, noted Prof. Annemiek Snoeckx. The AI image depicting autopilot complacency was created by Snoeckx with the aid of AI. Image courtesy of Prof. Snoeckx and the ESR.
“Expert oversight and critical appraisal will be crucial for the future. We will be experts in critically supervising a hybrid system, in integrating tools with clinical responsibility, in selecting tools, in combining imaging with context,” she said. “We will need to be experts in communication with referrers, in verification, and in knowing when the tool might be wrong, in building learning loops through audits and feedback. The expert teacher will be the one who teaches this new form of expertise.”
She also flagged the changing role of the x-ray. Twenty years ago, it was meant to make a diagnosis, but chest x-ray with AI will increasingly triage which patients need a CT, which type of CT scan is needed, and how urgent the exam is.
Yet while some centers are moving away from chest x-rays to ultralow-dose CT (ULDCT), she cited trials comparing chest x-ray with ULDCT in patients with suspected pulmonary disease, which found comparable health outcomes between the two cohorts. Furthermore, while CT might provide “better” diagnoses, it also generated more incidental findings, which did not support routine use.
Legal accountability
Bringing his unique perspective on how AI would shape chest x-ray reporting, Prof. Fergus Gleeson from Oxford University Hospitals National Health Service (NHS) Foundation Trust noted that postmarketing surveillance of algorithm performance by developers didn’t work well. Furthermore, trying to put the legal accountability for failures in the AI interpretation onto the companies was akin to trying to blame social media platforms for individuals posting bad content -- in reality, this rarely stood up in court.
In the discussion, Gleeson noted that it will probably become a legal requirement to register which algorithms are in use in the radiology department, what they are proven to do, and which you are trained to use.
“If you act on an algorithm and something goes wrong, you will be responsible, not the company that developed it,” he said.
In his presentation “Chest radiograph reporting using AI by nonradiologists/radiographers,” he underlined that thorough training was paramount when using AI for reading and reporting chest x-rays.
In terms of chest x-ray reporting with AI, there is little evidence to suggest that radiologists are better than anyone else who is well-trained and does it for a living, Gleeson noted.
“I don’t think the issue is the person doing it,” he said. “I do have an issue with how they are trained and assessed, because that seems different from radiologists. But well-trained people, whether from Mars or Venus, seem as good as everybody else.”
However, he drew attention to the unregulated and unsupervised practice of “physician associates” in the U.K., and the reported subsequent patient harm that allegedly has occurred, as featured in an article in The Sunday Times on Thursday, 5 March, “Have physician associates done more harm than good?” by Dr. Phil Whitaker.
“Now let’s remove [the term] ‘physician associates’ and put ‘other people using AI’ -- without a national training body and without a national training exam, audit or quality control, you can see how you can end up with a problem,” Gleeson told delegates.
The session was chaired by Dr. Nigel Howarth, radiologist at Affidea Carouge in Switzerland, with Prof. Dr. Thomas Frauenfelder from Zürich University Hospital presenting an overview of chest radiograph reporting across Europe.
Our full coverage of ECR 2026 can be found here.