Machine learning can have an impact on mammographic interpretation, noted Dr. Anton Becker, a radiology registrar at the University Hospital of Zürich. Asked what motivated the increasing interest in developing neural networks to aid reporting and interpreting scans in radiology, he pointed out that radiology workload was increasing, particularly for emergency radiology and mammography. Despite the growing workload, the number of radiologists has not increased proportionately.
"Emotionally, as a radiologist, if you miss something, it is a devastating experience because you feel for the patient," Becker said. "Neural networks might help in this respect as a safety net."
The number of false positives generated by mammography continues to cause concern for clinicians and patients alike. Aside from the clinical importance, mammography data are easy to feed into a neural network, and deep learning performs well here because mammograms are 2D projection images with a very high resolution, providing many pixels to work with, according to Becker.
"Even experienced radiologists perform with an area under the curve below 90%. Also, while false negatives are obviously a disaster for the patient, we know that with false positives we aren't doing them a favor either. Even after the cancer is ruled out, these patients have a higher incidence of sleep disturbance, depressive disorders, and so on," he said, highlighting the need to be as specific as possible in making a diagnosis.
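The "area under the curve" Becker cites is the area under the ROC curve: the probability that a randomly chosen cancer case receives a higher suspicion score than a randomly chosen normal case. As a minimal sketch (the scores and labels below are invented for illustration, not from any study), it can be computed directly by counting correctly ranked positive/negative pairs:

```python
import numpy as np

def auc(scores, labels):
    """Probability that a randomly chosen positive case receives a
    higher suspicion score than a randomly chosen negative case
    (equivalent to the area under the ROC curve)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # Count positive/negative pairs ranked correctly; ties count half.
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical reader scores for 4 cancers (label 1) and 4 normals (label 0).
scores = [0.9, 0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.1]
labels = [1,   1,   1,   1,   0,   0,   0,   0]
print(auc(scores, labels))  # → 0.875: one normal (0.7) outranks two cancers
```

An AUC of 1.0 would mean every cancer outranks every normal; the sub-0.90 figure Becker mentions reflects how often even experienced readers rank cases imperfectly.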
Artificial intelligence (AI) is an overarching term, and with respect to radiology, current AI procedures are more akin to very complex regression analysis, Becker continued. In the near future, AI-related products used in medical imaging are likely to be a narrow form of AI that applies to a specific area such as mammography.
"It will not be comprehensive or truly intelligent, at least not for a long time," he remarked, adding that machine learning is a more appropriate term. "We provide the machine with the algorithm, which it uses to learn, and the type of algorithm depends on the problem under investigation."
Neural networks are one subdivision of machine learning, and those currently in use are usually deep convolutional networks that perform both the feature extraction and the evaluation, said Becker, noting that today's convolutional networks consist of a variable number of layers. In the early layers, the features are simple (differences in black and white, edge detection); in the deeper layers, these patterns are combined into more complex ones -- for example, microcalcifications or spiculated borders. Finally, the network compresses and reduces the features to those most important for the problem at hand.
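The simple edge detection Becker describes in the early layers can be sketched with a single convolution. The toy image patch and the hand-crafted kernel below are purely illustrative; in a trained network the kernels are learned from data rather than specified by hand:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution (no padding), as computed by one filter
    in a convolutional layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A toy "image" patch: dark background with a bright band on the right.
patch = np.zeros((6, 6))
patch[:, 3:] = 1.0

# A hand-crafted vertical-edge kernel: responds only where brightness
# changes from left to right.
edge_kernel = np.array([[1.0, -1.0],
                        [1.0, -1.0]])

response = conv2d(patch, edge_kernel)
# The response is nonzero only along the dark-to-bright boundary;
# deeper layers would combine many such feature maps into complex patterns.
```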
Reflecting on the current capability of these networks, Becker said that despite their appeal right now, the algorithms may change in the future.
"I think another reason for their appeal, aside from the good performance, is that they have a biological 'feel' to them. But actually, during training, information is sent backward through the network, which is the opposite of a human neuron that only sends information in a forward direction," he noted.
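The backward flow Becker refers to is the training step, in which the error is propagated back through the network to adjust the weights. A minimal sketch for a single artificial neuron (all numbers are invented for illustration):

```python
import numpy as np

# Forward pass: a single artificial "neuron" y = w.x, scored against
# a target t with a squared-error loss.
x = np.array([0.5, -1.0])   # inputs
w = np.array([0.8, 0.3])    # weights
t = 1.0                     # target output

y = w @ x                   # forward: information flows input -> output
loss = 0.5 * (y - t) ** 2

# Backward pass: the loss gradient travels back through the same
# connections to update the weights -- the step with no counterpart
# in a biological neuron, which only fires forward.
dy = y - t                  # dL/dy
dw = dy * x                 # dL/dw, flowing backward to the weights

lr = 0.1                    # learning rate
w_new = w - lr * dw         # weight update reduces the error next time
```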
Will AI replace radiologists?
Becker does not believe that AI will replace radiologists, and he sees a clear role for human-human interaction in interpreting and communicating scan results.
"I don't think the replacement will come soon, if ever, because people still want someone to interact with them and explain their results, especially in mammography," he said.
He emphasized that the role of AI, at least in the immediate future, is to support radiology because, in the real world, every patient is different.
"Finding false positives provides a good example of why the person-to-person contact is important. We need to explain to the patient that there might be something small that is probably insignificant but needs following up and so on," he told AuntMinnieEurope.com in an interview. "AI could improve on this situation because there are many lesions where we just cannot confidently rule out cancer. Having to biopsy those lesions is unpleasant, so it might help avoid some of those unnecessary procedures."
According to some observers, the human brain is anatomically restricted to thinking certain thoughts and perceiving certain patterns, whereas an artificial brain, free of this anatomical limitation, should be able to exploit patterns and thoughts that were previously "unthinkable," Becker explained. Along these lines, it is thought that using neural networks in radiology makes it possible to see patterns in images that humans cannot see.
"This might be true -- but it is also true that you cannot find something that is not in the picture, and every radiologist is aware of this," he said. "Every modality has a physical limit, so, for example, a small invasive lobular carcinoma in a dense breast will not be picked up because it is overlaid by normal tissue. You don't find it because the background signal is so much stronger. There's a limit to everything."
Becker and his colleagues have tested an experimental deep neural network against two experienced radiologists and a medical student, the latter of whom had very little training in radiology. They found that the deep-learning software was slightly more sensitive, despite not being designed specifically for mammograms or breast ultrasound, but the radiologists were more specific.
The experience with the medical student reflected exactly how the neural network starts out, without prior knowledge in breast imaging or any other significant information on the patient, Becker said. The medical student performed slightly worse than the neural network and took just under an hour to learn from the images versus a few minutes for the neural network.
Limitations of neural networks
The limitations of using neural networks, or machine learning in general, include an inability to spot outliers never seen in the training data.
"Where other factors enter the analysis such as patient history, age of patient, or gene mutation carriers, these would limit the outcome with a neural network," said Becker, adding that other limitations relate to the signal-to-noise ratio of the image, with "certain things just drowning in the noise and even an experienced radiologist would not be able to distinguish that."
Neural networks are most likely to be used in narrow areas of medicine with a specific question to answer -- for example, brain bleeds, prostate cancer, and lung cancer.
"We've also found that neural networks perform well in detecting signs of active tuberculosis in chest x-ray examinations, a huge problem in the developing world," he said.
Becker concluded by returning to the ever-present fear that radiologists somehow would be replaced by AI. He made an analogy with flying a plane.
"Airplanes for a long time have been able to fly themselves, yet you still want a pilot there to be ultimately responsible for everything. This might be the case in radiology at some point -- even if AI is capable enough, you will want a radiologist in the driver's seat."
Editor's note: Dr. Becker has asked us to draw the attention of readers to the following citation: Deep Learning in Mammography: Diagnostic Accuracy of a Multipurpose Image Analysis Software in the Detection of Breast Cancer. Investigative Radiology, July 2017, Vol. 52:7, pp. 434-440.
Copyright © 2017 AuntMinnieEurope.com