Instead of using one-off AI solutions for different systems and modalities, the Utrecht team built a vendor-neutral infrastructure that supports the entire imaging workflow chain, from patient selection and image planning through acquisition, reconstruction, and analysis to reporting and prognosis, according to Dr. Tim Leiner, professor of radiology and chair of cardiovascular imaging at the department of radiology and nuclear medicine at Utrecht UMC. Their pioneering AI system can "listen in" to hospital information streams and then automatically select an appropriate algorithm to be used.
The IMAGR AI infrastructure took three years to build, finally going operational earlier this year: the first application ran in January, and the system has been in routine use since August. With initial funding of 150,000 euros from Utrecht UMC, the project also relied on many volunteer hours. More recently, it received a grant of 620,000 euros from the university.
A neural network on Utrecht UMC's IMAGR AI system reveals white-matter hyperintensity for dementia questions from brain MRI. Image courtesy of Olivier Middendorp - NRC.
"Companies are trying to step into this field by building cloud solutions, but they aren't vendor neutral. As far as we know, Utrecht's IMAGR infrastructure is the first of its kind as an onsite solution for either commercial or in-house algorithms that are integrated with the PACS, the RIS, and the hospital information system (HIS)," Leiner told AuntMinnieEurope.com ahead of EuSoMII 2019, where he will discuss Utrecht's experience in his keynote lecture, "How to Bring AI to the Clinic," on the afternoon of Saturday, 19 October.
Radiology departments should carefully think about how they deploy AI, he noted.
"Nobody wants to be locked into the solution of a single vendor. The full potential of AI will only be realized if algorithms are deployed in a vendor-neutral AI infrastructure."
In his presentation on 19 October, Dr. Tim Leiner will detail the advantages of vendor-neutral archives and infrastructures for AI. Images courtesy of Dr. Tim Leiner.
IMAGR listens in to information passing through the HIS and RIS and picks out what it needs to know, Leiner explained. For example, when it recognizes a brain MRI, it will select appropriate algorithms that can aid diagnosis and quantification. The infrastructure then retrieves the images for that specific patient from the PACS, including any priors, if necessary, and goes to work. Each so-called pipeline can opt to send resulting images back to the PACS, or present them in a dedicated AI viewer on the PACS station itself -- or both, he noted.
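The routing step described here -- recognizing a study from hospital information streams and dispatching matching algorithms -- can be sketched in broad strokes. The sketch below is purely illustrative: the event fields, pipeline names, and registry structure are assumptions made for the example, not details of IMAGR's actual implementation.

```python
# Illustrative sketch of study-based algorithm routing, loosely modeled on
# the workflow described above. All names and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class StudyEvent:
    """A simplified event as it might be gleaned from HIS/RIS traffic."""
    patient_id: str
    modality: str          # e.g., "MR", "CT"
    body_part: str         # e.g., "BRAIN", "CHEST"
    clinical_question: str

# Registry mapping (modality, body part) to candidate AI pipelines.
PIPELINE_REGISTRY = {
    ("MR", "BRAIN"): ["brain_segmentation", "wmh_detection"],
    ("CT", "CHEST"): ["lung_screening", "body_composition"],
}

def select_pipelines(event):
    """Return the pipelines that apply to this study, or an empty list."""
    return PIPELINE_REGISTRY.get((event.modality, event.body_part), [])

# Example: a brain MRI ordered for a dementia workup triggers two pipelines.
event = StudyEvent("12345", "MR", "BRAIN", "dementia workup")
print(select_pipelines(event))  # ['brain_segmentation', 'wmh_detection']
```

In a production system the events would of course come from parsed HIS/RIS messages rather than hand-built objects, and the selected pipelines would then fetch the study (and priors) from the PACS.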
Crucially, IMAGR makes AI fit into and improve workflow. For example, to check for dementia on the brain MRI, the infrastructure automatically starts a pipeline that applies a segmentation algorithm, before applying a white matter hyperintensity detection algorithm. When the radiologist selects the patient's MRI in the PACS, the IMAGR infrastructure automatically follows and shows the corresponding AI results.
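The dementia example, where a segmentation step must run before white-matter hyperintensity detection, amounts to a dependent two-stage pipeline. A minimal sketch of that idea follows, with trivial intensity thresholds standing in for the actual neural networks and an assumed voxel size; none of this is IMAGR code.

```python
# Illustrative two-stage pipeline: segment the brain first, then detect
# white-matter hyperintensities (WMH) inside the segmented region.
# Thresholds stand in for the real neural networks; voxel size is assumed.
import numpy as np

def segment_brain(volume):
    """Stand-in brain segmentation: a simple intensity threshold."""
    return volume > 0.2

def detect_wmh(volume, brain_mask):
    """Stand-in WMH detector: bright voxels within the brain mask."""
    return brain_mask & (volume > 0.8)

def run_dementia_pipeline(volume, voxel_ml=0.001):
    mask = segment_brain(volume)     # stage 1
    wmh = detect_wmh(volume, mask)   # stage 2 depends on stage 1's output
    return {
        "brain_volume_ml": float(mask.sum()) * voxel_ml,
        "wmh_volume_ml": float(wmh.sum()) * voxel_ml,
    }

# Example on a synthetic 4 x 4 x 4 "scan".
volume = np.zeros((4, 4, 4))
volume[1:3, 1:3, 1:3] = 0.5    # brain tissue
volume[1, 1, 1] = 0.9          # one hyperintense voxel
print(run_dementia_pipeline(volume))
```

The point of the sketch is the dependency: the second algorithm consumes the first one's output, which is what makes automatic orchestration by the infrastructure valuable.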
Here the IMAGR interface (left monitor) shows the results of the white-matter hyperintensity detection and quantification algorithm. The middle monitor shows the PACS interface. The right monitor shows the electronic health record (EHR). Image courtesy of Olivier Middendorp - NRC.
The department is continuing to feed algorithms into the system, according to Dr. Wouter Veldhuis, associate professor of radiology at Utrecht UMC and initiator of IMAGR, together with Drs. Edwin Bennink and Christian Mol.
Also speaking to AuntMinnieEurope.com before the EuSoMII meeting, Veldhuis pointed to the many algorithms that can run in parallel, such as a lung cancer screening AI program on chest CTs.
Furthermore, in this patient population, sarcopenia assessment -- a labor-intensive and time-consuming task when performed manually -- takes just seconds via another neural network on IMAGR that automatically performs a whole-body segmentation to determine fat and muscle content.
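Once a whole-body segmentation exists, the fat and muscle quantification itself reduces to counting labeled voxels. A minimal sketch, assuming hypothetical label values for the two tissue classes:

```python
# Sketch: fat/muscle quantification from a whole-body segmentation label map.
# The label values (1 = fat, 2 = muscle) are assumptions for the example.
import numpy as np

FAT, MUSCLE = 1, 2  # hypothetical label values

def body_composition(labels, voxel_ml):
    """Convert voxel counts per tissue label into volumes in milliliters."""
    return {
        "fat_ml": float((labels == FAT).sum()) * voxel_ml,
        "muscle_ml": float((labels == MUSCLE).sum()) * voxel_ml,
    }

# Example: a tiny 2 x 2 "label map" with one fat and two muscle voxels.
labels = np.array([[0, 1], [2, 2]])
print(body_composition(labels, voxel_ml=1.0))  # {'fat_ml': 1.0, 'muscle_ml': 2.0}
```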
"By the time the patient is off the scanner, the radiologist can open the case and read it together with the AI results, because IMAGR has already automatically done the work," Veldhuis said. "Our target is to have ready-to-read reports within 24 hours, which is perfect for outpatients."
The AI in the system can also automatically select spine MRI scans and convert them into synthetic CT images, which brings the advantages of CT to orthopedic presurgical planning without having to irradiate patients, he noted, adding that this was an example of a commercial solution (MRIGuidance) running within the infrastructure. The AI-created CT can also be used to generate and display attenuation-correction maps.
Quality and ethics
"With IMAGR up and running we can now deploy any AI algorithm directly into the clinical workflow at Utrecht UMC," Leiner said. "However, we are continuously assessing which ones we want, and which ones we don't, and how deep the level of clinical integration can be."
Dr. Tim Leiner. Image courtesy of Stefan Heijdendael.
Radiologists need to learn how to deal with different types of AI errors and how to present data gleaned from AI to colleagues in order to make it most useful and actionable, Leiner stated.
It is also important to determine how to decide when algorithms are good enough and how to measure their quality, as well as to discuss thoroughly the ethical implications of the results algorithms generate.
Whether imaging departments should wait for U.S. Food and Drug Administration (FDA) approval and/or CE certification for algorithms before implementing them remains a point of debate. One view, according to Leiner, is that earlier deployment is indeed possible, but only with careful monitoring.
Part of the solution lies in understanding an algorithm's generalizability, he noted. For example, if an AI program predicts a 30% risk of myocardial infarction based on a coronary calcium score, there needs to be clarity on how reliable this prediction is, how much it can vary, and whether it holds in exceptional cases. He pointed to the need for thorough algorithm testing on large patient databases before deployment.
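One standard way to quantify how reliable such a risk prediction is, is a calibration check: bin the predicted risks and compare each bin's average prediction with the observed event rate. The sketch below illustrates the idea on toy data; a real assessment would use the kind of large patient databases Leiner mentions.

```python
# Sketch of a calibration check: compare predicted risks against observed
# event rates in bins. The data here are toy values, not clinical results.
import numpy as np

def calibration_table(pred_risk, observed, n_bins=5):
    """Per predicted-risk bin: (lo, hi, mean predicted, observed rate, n)."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (pred_risk >= lo) & (pred_risk < hi)
        if in_bin.any():
            rows.append((float(lo), float(hi),
                         float(pred_risk[in_bin].mean()),
                         float(observed[in_bin].mean()),
                         int(in_bin.sum())))
    return rows

# For a well-calibrated model, a ~30% predicted risk should correspond to
# roughly 30% observed events in that bin.
pred = np.array([0.1, 0.1, 0.3, 0.3, 0.3, 0.9, 0.9])
obs = np.array([0, 0, 0, 1, 0, 1, 1])
for row in calibration_table(pred, obs):
    print(row)
```

Large gaps between the predicted and observed columns in an external cohort would signal exactly the generalizability problem described above.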
"I would like to make this presentation a call to action on quality standards to ensure that our neural networks demonstrate reproducibility, applicability, and generalizability in other patient groups," Leiner stated.
This means defining quality standards as an imaging community before large-scale rollout. However, different algorithms need different standards, so this process will require a case-by-case approach, he noted, in tandem with regulatory requirements coming into play.
It is also vital that radiologists who have so far been resistant to AI understand its potential power, Leiner continued.
"Those that use AI will outperform those that don't," he said.
Copyright © 2019 AuntMinnieEurope.com