Should radiologists understand the analytical processes by which computer-aided detection (CAD) software works? Not according to a new study in the Journal of Digital Imaging, which found that readers really want only the CAD results.
In a laboratory test that simulated a clinical radiology reading room environment, an academic research team measured how seven radiology residents reacted to two versions of BoneXpert (version 1.1.4, Visiana), a CAD program that determines the bone age of children. The study, by radiologists and medical informatics specialists at RWTH Aachen University in Aachen, Germany, compared two types of CAD-PACS integration to assess their effect on usability in the context of radiologists' workflow.
Estimates of skeletal age are required to track endocrine disorders and other pediatric syndromes, as well as for forensic age assessment. BoneXpert suggests a bone age based on automated analysis of a radiograph of the patient's left hand, speeding up a manual process that can take a radiologist as long as 15 minutes.
Two variations of the software were integrated into the PACS: the BoneXpert Plugin, which requires manual loading of images into the PACS from a temporary folder on a computer's hard disk, and the BoneXpert Intray, which uses a DICOM protocol to load images for analysis. The radiology residents were asked to evaluate as many hand radiographs as possible in half an hour, using the BoneXpert software to provide a second opinion. After a short training session, each resident started with one version of the software and switched to the other after completing eight analyses.
The residents were asked to comment on what they were thinking and how they were reacting as they used the software and performed their analyses. These comments were recorded, and the research group observed the residents' actions. After the 30-minute evaluation session, the residents completed a detailed questionnaire. Comments, observations, and questionnaire responses were then analyzed in detail with respect to the level of data integration, the accessibility of functions and services from the reading workstation, the consistency of information presentation and the user interface, and the integration of content so that a task needed to be performed only once within the same workflow.
The residents said they understood the terminology and results produced by BoneXpert and had no difficulty integrating the CAD module into their workflow. They also appreciated unhampered access to the PACS (iSite version 4.41, Philips Healthcare) and the constant availability of original patient data and radiographs. Overall, they slightly preferred the BoneXpert Intray version.
What they didn't want to see were the image processing steps that the software displayed as it performed them, such as outlining the bones and epiphyses and adding other annotations in different styles and colors. Although this visualization was intended to make the image analysis transparent to radiologists and enhance trust in the software, the residents called it superfluous; according to their comments, it proved distracting and interrupted their workflow.
Lead author Dr. Ina Geldermann of the department of medical informatics and colleagues wrote that the residents' express desire not to see the software's image processing visualization steps "may indicate a paradigm shift for medical image analysis." They noted that 10 years ago, the opposite was true with respect to CAD processing.
As a result of the research, a new version of BoneXpert has been developed that performs its image processing in the background, the study team wrote. BoneXpert has received CE Mark approval but has not yet been cleared by the U.S. Food and Drug Administration.