Much to gain from AI in hybrid imaging, but mind the gap


Artificial intelligence (AI) can boost image quality and analysis and prediction in hybrid imaging, but clinicians must remain cautious with algorithms because they are not always reliable in clinical practice, a French expert in translational imaging has warned.

Prof. Irène Buvat.

AI keeps growing in nuclear medicine and hybrid imaging, Prof. Irène Buvat, director of the Translational Imaging in Oncology Laboratory (U1288 Inserm) at the Curie Institute in Paris, told delegates at the Conference on Hybrid Imaging Live (CHILI), held on 11 November and organized by the European Society for Hybrid, Molecular, and Translational Imaging (ESHIMT).

"Now what is new is that we have academic methods and products offered by vendors for these three applications," she said. "This should accelerate the investigations that can be performed, both for basic research using molecular imaging and for clinical research."

But the medical community should "mind the gap," because having these AI methods available does not mean they necessarily bring added clinical value. "Yet having them makes it possible to perform extensive clinical evaluation of these methods, so we can build on that to enhance them," she told attendees at the virtual meeting.

Image quality and interpretation

A very popular application of AI is to recover high-quality images from low-count data, since image quality decreases as the number of counts used for the acquisition is reduced.

"Now we can train algorithms so that they can automatically filter these images and we can recover decent images even from low count acquisitions," Buvat noted.

Practice has shown that image quality is higher when images reconstructed from low-count data are filtered by an AI algorithm than when a Gaussian filter is used.

Yet when the quantitative content of the images is examined, the standardized uptake value (SUV) is not well recovered after AI filtering. Worse, the underestimation of SUV grows larger as the lesions get smaller, Buvat pointed out.

"These algorithms don't do a good job at recovering the quantitative values in the filtered images. So this is somehow concerning and we have to pay attention to that, especially when using SUV measurements in patient follow-up," she said.
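The size dependence of this bias can be illustrated with a toy example. The sketch below uses a plain Gaussian filter on a synthetic 1D profile, not any of the AI algorithms under discussion: smoothing pulls down the maximum voxel value, and the smaller the "lesion," the larger the SUVmax underestimation. All numbers (lesion widths, SUV levels, filter width) are made up for illustration.

```python
import numpy as np

def gaussian_kernel(sigma, radius=8):
    """Discrete 1D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def suv_max_after_smoothing(lesion_width, sigma=2.0, suv_lesion=8.0, suv_bg=1.0):
    """SUVmax of a box-shaped 'lesion' on a flat background after smoothing."""
    profile = np.full(101, suv_bg)
    center, half = 50, lesion_width // 2
    profile[center - half:center + half + 1] = suv_lesion
    smoothed = np.convolve(profile, gaussian_kernel(sigma), mode="same")
    return smoothed.max()

for width in (21, 9, 3):  # large, medium, small lesion (in voxels)
    print(f"lesion width {width:2d} voxels -> SUVmax {suv_max_after_smoothing(width):.2f}")
```

A lesion much wider than the filter keeps its true SUVmax of 8.0, while the narrowest one loses a large fraction of it, which is the same qualitative behavior as the bias Buvat describes.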

A successful segmentation by AI of head and neck tumor lesions from PET/CT images. The AI segmentation is almost identical to that of an expert. Courtesy of Thibault Escobar, doctoral candidate, Laboratory of Translational Imaging in Oncology (LITO), Institute Curie, Paris.

Another popular application of AI is the reconstruction of attenuation-corrected PET images without CT. For example, an AI algorithm is trained to recover attenuation-corrected images from the nonattenuation-corrected PET images.
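As a reminder of what the CT normally supplies here, this minimal sketch shows how an attenuation correction factor is computed along a single line of response from a map of attenuation coefficients; the AI approach aims to produce images equivalent to applying this correction without the CT-derived map. The coefficients and counts below are illustrative values, not from any study.

```python
import numpy as np

# Toy 1D line of response (LOR): attenuation coefficients mu (cm^-1) per 1-cm step.
# ~0.096 cm^-1 is roughly soft tissue at 511 keV; 0.17 stands in for denser tissue.
mu = np.array([0.0, 0.096, 0.096, 0.17, 0.096, 0.0])
step_cm = 1.0

# Fraction of 511-keV photon pairs surviving along this LOR (Beer-Lambert law):
survival = np.exp(-np.sum(mu * step_cm))
acf = 1.0 / survival  # attenuation correction factor applied to the measured counts

measured_counts = 1000.0
corrected_counts = measured_counts * acf
print(f"survival fraction: {survival:.3f}, ACF: {acf:.2f}")
```

In CT-based correction the mu map comes from the CT scan; the AI variant must effectively infer this information from the nonattenuation-corrected PET image alone, which is why verification is difficult.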

There have been encouraging studies in that area, and AI has been shown to be capable of recovering attenuation-corrected images similar to those obtained when the attenuation correction is performed with CT.

But here again, caution must prevail, she insisted. "In a study on follow-up, the image reconstruction using AI attenuation correction missed a small lesion visible on CT in the first scan and the follow-up scan ... This suggests we have to be quite careful when using AI because it may not always be reliable, and there is currently no way to verify it," she said.

If these algorithms worked reliably, benefits would include faster scanning, with higher patient throughput and fewer motion artifacts, and the possibility of performing low-dose scanning in the pediatric population and in patient follow-up using PET and SPECT.

Another major gain could be novel applications in nuclear medicine, such as wider use of nuclear imaging in populations like children and possibly pregnant women, added Buvat, who presented images of a 19-week-old fetus obtained with FDG-PET/MRI. "We can see FDG uptake in the bladder, kidney, and myocardium but not in the fetus' brain," she said.

There has been huge progress in AI-assisted image interpretation, particularly in segmentation of molecular or hybrid images. "Out-of-the-box solutions work quite well," she said. "For example, the no-new-Net offers a well-designed preprocessing, which makes it possible to successfully segment a huge variety of images. This is extremely relevant in the context of hybrid imaging."

The tool makes it possible to automatically segment cancer lesions with results comparable to those obtained by experts and, for whole-body tumor segmentation, to automatically segment all tumor sites.
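Agreement between an AI segmentation and an expert contour is commonly summarized with the Dice similarity coefficient. The sketch below computes it on two hypothetical binary masks (the mask shapes and the one-voxel shift are illustrative, not taken from the work presented).

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks (1.0 = identical)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * intersection / total if total else 1.0

# Hypothetical 2D masks: an expert contour and an AI contour shifted by one voxel.
expert = np.zeros((10, 10), dtype=int)
expert[3:7, 3:7] = 1
ai = np.zeros((10, 10), dtype=int)
ai[3:7, 4:8] = 1

print(f"Dice(AI, expert) = {dice(ai, expert):.2f}")
```

A Dice value close to 1 corresponds to the "almost identical to an expert" agreement shown in the head and neck example above.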

A Viennese team recently managed to segment every organ from a total body PET/CT scan. "This could be interesting to automatically locate tumors or characterize the metabolism in every organ to see how metabolism is actually perturbed by the presence of a disease. That's a very promising area," she said.

If these tools worked reliably, they would ease the automated and systematic calculation of biomarkers. "AI could also enable novel applications and considerably facilitate dosimetry studies because it makes it possible to automatically delineate the organs."

The challenge will be to make sure that these models successfully adapt to external data. Another issue is what Buvat called "the glass ceiling effect."

"The images can only offer what they contain, and sometimes their content is not sufficient to answer the question we ask. We have to include some clinical, pathological, and possibly genomic data, and then we may be able to exceed the performance that we observe in radiomics-only models," she concluded.
