PACS software benefits from real-world feedback


Radiologists frequently discover usability issues on workstations during clinical use, so it's important for vendors to continue evaluating their software after deployment, according to new research from the University Medical Center Groningen in the Netherlands.

The Dutch team conducted a usability evaluation of workstation software after it had been placed into clinical use and found that more than 90 issues were identified by the 12 users participating in the study. However, a predeployment evaluation had not anticipated many of these issues.

"Given the limitations of predeployment usability evaluation in radiology, which were confirmed by our finding that the results of a predeployment usability evaluation of this workstation had limited generalizability to clinical practice, it is vital that radiology workstation vendors devote significant resources to usability engineering efforts before deployment of their workstation, and to continue these efforts after the workstation is deployed in a hospital," wrote the team led by Wiard Jorritsma in the January issue of the International Journal of Medical Informatics.

Real-world experience

While performing usability evaluation of software prior to deployment can prevent issues after installation, feedback from real users is also valuable, according to the group. That's especially important in radiology, where radiologists often work in different ways. It's also difficult to construct a representative testing environment, given the complex nature of radiological workflow, the differences between hospitals, and the varying types of connected information systems and imaging modalities in use, according to the group.

Using radiology workstation software that had received the highest usability rating in a previous predeployment evaluation, the researchers set out to assess the usability issues encountered by radiologists during clinical practice. They also wanted to determine how well the predeployment usability evaluation compared with experience from clinical practice (Int J Med Inform, January 2016, Vol. 85:1, pp. 28-35).

Fourteen months after the workstation had been installed at their institution, the researchers performed semistructured interviews and recorded audio/video observations of 12 participants (10 radiologists and two radiology residents) while they used the workstation in clinical practice. They noted usability issues as well as positive findings. For each usability issue, the team assigned a severity rating and determined a root cause.

The workstation in the study consisted of three components: a PACS image viewer, a workflow manager, and a report editor with speech recognition capability. The researchers noted that the vendor had applied usability engineering methods throughout the workstation's development.

Over the course of the study, 92 usability issues were identified, "ranging from issues that cause minor frustration or delay, to issues that cause significant delays, prevent users from completing tasks, or even pose a potential threat to patient safety."

The researchers then met with application specialists from the workstation vendor, as well as with users who had received extensive training on the workstation, to go over the results. However, no changes to the severity ratings or other findings were suggested, according to the researchers.

The issues were grouped into five categories (image arrangement/display, image interaction, workflow, report dictation, and miscellaneous) and were also classified by root cause: interface design, configuration, technical, functional, or external.

Moderately satisfied

With a mean score of 6.7 out of 10 (range: 6-8), the users had a moderate level of satisfaction with the workstation. They noted 15 positive usability findings, which included useful functionalities and interface elements that enabled fast workflow patterns. However, they also cited 92 usability issues, with a mean of 15.4 per participant (range: 6-22).

The researchers then compared the findings with the predeployment evaluation, which had concluded that users could efficiently and effectively perform image retrieval tasks thanks to the workstation's effective display protocols. The predeployment evaluation had also determined that users could efficiently perform measurement tasks due to features such as the ability to measure in images without having to click on them first, and the fact that measurements stayed active until another tool was selected.

While the postdeployment evaluation judged the display protocols to be effective in some cases, it also found them to be ineffective in others due to several severe usability issues, according to the researchers.

"None of these issues were due to the implementation of the workstation itself, but were caused by poor configuration of the workstation and external processes," the authors wrote.

Interestingly, keeping the measurement tool active until another tool was selected -- judged to be a benefit in the predeployment evaluation -- was found to be a usability issue in practice. Users were frustrated at having to deactivate the measurement tool after each measurement, and they often accidentally performed measurements when clicking on an image, not realizing that the tool was still active, according to the group.

Need for usability engineering

The results highlight the need for effective usability engineering in radiology, according to the researchers. The authors also noted that most of the issues were attributed to interface design or technical problems.

"Addressing these issues requires changes to the implementation of the workstation," they wrote. "A substantial number of issues on the other hand were not due to the implementation of the workstation, but to poor configuration of the workstation in our hospital or external processes that negatively affected the workstation's usability. Addressing these issues would significantly improve the workstation's usability without the need to make any changes to its implementation."

The study results show that predeployment usability evaluation can have limited generalizability to clinical practice, the team concluded.

"This does not mean that they do not provide meaningful insights into the workstation's usability, but they paint an incomplete picture that has to be supplemented with data from actual use in a real-world environment," the authors wrote. "It is therefore important that radiology workstation vendors devote resources to postdeployment usability evaluations, in addition to predeployment evaluations, in order to identify usability issues that slip through the predeployment phase."

In a companion study, also published in the January issue of the International Journal of Medical Informatics, the researchers reported that while pattern mining of PACS user interaction logs provided some useful insights for a postdeployment usability evaluation, it revealed far fewer usability issues than a field study. As a result, it should not be used as the sole method for evaluating usability, they said.
