Pattern recognition techniques boost image prefetching

Cloud-based image repositories can yield many benefits, but this archiving model can also suffer from communication latency issues due to high image volumes. A pattern recognition system can substantially reduce image retrieval times, however, according to researchers from the University of Aveiro in Portugal.

The group applied artificial neural networks to build a pattern recognition system that anticipates which images users will need soon and prefetches them from the image repository. In testing, the approach yielded high enough accuracy for use both in prefetching and in replacing images already stored in the on-site cache, according to the authors, who shared their findings in an article published online on 5 August in the International Journal of Computer Assisted Radiology and Surgery.

"Initial results show 52-68% reduction in retrieval time when applying the proposed solution in a common cache architecture," noted lead author Carlos Viana-Ferreira of the university's Institute of Electronics and Telematics Engineering.

Latency issues

Thanks to the Internet, PACS can now be deployed in a distributed and interinstitutional manner, facilitating regional archives and remote radiology reporting. This has often involved outsourcing of image storage from on-site archives using, for example, a cloud computing approach. The distributed model also enables broad collaboration between institutions, offering remote access to imaging exams and reports, and easing mobility of patients from one institution to another, according to the researchers.

Naturally, communication latency typically worsens when the archive is no longer on site.

"In fact, communication latency is one of the biggest problems in real-time remote access to medical imaging data, since the studies can reach hundreds of megabytes in some modalities," the authors wrote.

Given this problem, cache mechanisms -- local storage areas that temporarily hold studies likely to be requested in the short term -- can be employed as a means of bypassing these latency issues. The success of the cache technique relies, however, on the performance of its prefetching mechanisms.
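The basic idea -- serve requests from a local store when possible, and warm that store ahead of demand -- can be sketched minimally as below. All names here are illustrative assumptions for exposition, not the authors' implementation:

```python
# Minimal sketch of a local study cache with a prefetching hook.
# StudyCache, fetch_fn, and the eviction rule are all hypothetical.

class StudyCache:
    def __init__(self, capacity, fetch_fn):
        self.capacity = capacity  # max number of studies held locally
        self.fetch_fn = fetch_fn  # callable that retrieves a study remotely
        self.store = {}           # study_id -> study data

    def get(self, study_id):
        """Return a study, fetching from the remote archive on a miss."""
        if study_id not in self.store:
            self._admit(study_id, self.fetch_fn(study_id))
        return self.store[study_id]

    def prefetch(self, predicted_ids):
        """Warm the cache with studies a predictor expects to be requested."""
        for study_id in predicted_ids:
            if study_id not in self.store:
                self._admit(study_id, self.fetch_fn(study_id))

    def _admit(self, study_id, data):
        if len(self.store) >= self.capacity:
            # Naive eviction (oldest insertion); a real system would use
            # something smarter, which is exactly what the paper addresses.
            self.store.pop(next(iter(self.store)))
        self.store[study_id] = data
```

If the predictor is accurate, a later `get` for a prefetched study never pays the remote round-trip -- that is the latency the authors are trying to hide.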

Traditional prefetching methods in imaging have relied on static rules.

"However, these approaches are not suitable for more dynamic scenarios, resulting in degradation of service quality or even in denial of service, if the cache is filled with unnecessary data," they wrote. "This could arise, for example, from sporadic usages such as a request for all the studies produced within a large time interval, which would fully populate the cache with undesirable data and, therefore, compromise the regular clinical practice."

Pattern recognition

To improve on this situation, the Portuguese group applied artificial neural networks to develop a prefetching mechanism that determines -- based on its knowledge and classification of user behavior -- which images are likely to be needed from the image repository and that should be transferred to the local cache. The team's system also offers intelligence on when to move data out of the cache that is no longer likely to be needed.

"It gives information to the prefetching and cache replacement agents about the most likely set of requests the user will do, allowing those agents to better discriminate the set of objects that will be needed, and those that will not," they stated.

The system is resistant to user errors and continuously adjusts itself to user patterns, software, and institutional workflow, according to the group. It also identifies distinct usage patterns, enabling better quality of service for higher-priority use-case scenarios.

"For instance, if it is predicted that a user is only accessing data for data auditing, the cache will be able to give less priority to that user's requests than to another user that is attending patients," the authors wrote.

In testing, the approach yielded accuracy of 65% on a real-world dataset and 70% on a synthesized dataset, "which is an indicator that it may be used for cache replacement and prefetching," the authors wrote. Retrieval times dropped 52% to 68% when the method was applied in a common cache architecture, demonstrating a significant reduction in communication latency.

"We are currently working on a cache replacement and prefetching policies that use the pattern recognition outputs in association with rules like [Least Recently Used] in order to efficiently manage the cache repository capacity," the authors noted.
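A hybrid policy of the kind the authors describe -- LRU eviction informed by the pattern recognizer's predictions -- might look roughly like the following. This is a sketch under assumptions (the class, the `is_predicted` callback, and the victim-selection rule are hypothetical), not the authors' actual policy:

```python
from collections import OrderedDict

class PredictiveLRUCache:
    """Sketch: LRU eviction, except that studies the predictor still
    expects to be requested are skipped when choosing a victim."""

    def __init__(self, capacity, is_predicted):
        self.capacity = capacity
        self.is_predicted = is_predicted  # study_id -> bool, from the recognizer
        self.store = OrderedDict()        # least recently used first

    def put(self, study_id, data):
        if study_id in self.store:
            self.store.move_to_end(study_id)
        elif len(self.store) >= self.capacity:
            self._evict()
        self.store[study_id] = data

    def get(self, study_id):
        if study_id in self.store:
            self.store.move_to_end(study_id)  # mark as recently used
            return self.store[study_id]
        return None

    def _evict(self):
        # Prefer the least recently used study that is NOT predicted to be
        # needed again; fall back to plain LRU if everything is predicted.
        victim = next(
            (sid for sid in self.store if not self.is_predicted(sid)), None
        )
        if victim is None:
            self.store.popitem(last=False)
        else:
            self.store.pop(victim)
```

The design point is the one the quote makes: the recency rule manages capacity, while the recognizer's output keeps likely-needed studies from being evicted by sporadic traffic.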
