The first essential step before evaluating and purchasing artificial intelligence (AI) is to think seriously about what you need the technology to do, according to AI expert and radiologist Dr. Hugh Harvey. At a recent London workshop, he gave practical advice on how to buy AI.
"Don't just go and buy AI because it's AI. You have to have a problem to solve, and it might be, luckily, that AI is the best way to solve that problem," he said at "AI -- Does your Trust need it?", which was organized by Qure.ai and held during the Intelligent Health AI meeting.
It's vital to make sure the "intended use" of the technology matches the problem, he said. Always ask to see the vendor's intended use statement and instructions for use so that you can find out what the company has actually obtained regulatory approval to do.
"You can catch some companies out," continued Harvey, who is managing director of the clinical digital consultancy firm Hardian Health. "Think about it like a drug. You wouldn't want someone to develop a drug and have it regulatory approved for one disease and then try to sell it to you to cure another disease. That's illegal, and it's the same with AI."
Also, bear in mind that what a vendor says in its marketing -- the nice glossy brochure -- will be phrased in the best possible, most flowery terms, he added.
To find out the particular benefits and risks of AI, ask to see the vendor's own benefit-risk analysis matrix and discuss it with them. On validation, Harvey suggests not being "too clever and data-sciencey if you're not a clever, data-sciencey type of person," and trusting the relevant peer-reviewed literature instead.
Do your own research too, he said -- echoing the advice of fellow workshop speaker Dr. Amrita Kumar, AI clinical lead and consultant radiologist at Frimley Health NHS Foundation Trust, U.K., who believes the central question is: "How do I choose the right solution for my clinical problem?"
Hidden stratifications within data are becoming increasingly important, Harvey commented, emphasizing that not all stratifications are obvious or apparent to the vendor. "Often you don't find biases and errors in algorithms until you run them in your own data and you actually look at some substratifications within that data."
Most available radiology AI systems are classifiers, so the area under the receiver operating characteristic curve (AUC) is important: it measures a classifier's ability to distinguish between classes. But it is also important to consider the confidence intervals around that figure, he said.
"If you're thinking about a segmentation tool for radiotherapy delineation, AUC is the wrong metric," Harvey continued. "If you're talking about specific detection -- i.e., pointing to a lesion on an image -- you should look for free-response receiver operating characteristic (FROC) curves, and always ask about the false-positive rates on those."
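To illustrate Harvey's point about pairing AUC with confidence intervals, here is a minimal Python sketch. It computes AUC via the Mann-Whitney rank statistic and a percentile-bootstrap confidence interval; the function names and toy data are illustrative, not taken from any vendor's tooling or validation protocol.

```python
import random

def auc(labels, scores):
    """AUC as the Mann-Whitney statistic: the probability that a randomly
    chosen positive case is scored higher than a randomly chosen negative."""
    pos = [s for lab, s in zip(labels, scores) if lab == 1]
    neg = [s for lab, s in zip(labels, scores) if lab == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_auc_ci(labels, scores, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the AUC."""
    rng = random.Random(seed)
    n = len(labels)
    stats = []
    while len(stats) < n_boot:
        idx = [rng.randrange(n) for _ in range(n)]
        lab = [labels[i] for i in idx]
        if 0 < sum(lab) < n:  # resample must contain both classes
            stats.append(auc(lab, [scores[i] for i in idx]))
    stats.sort()
    low = stats[int((alpha / 2) * n_boot)]
    high = stats[int((1 - alpha / 2) * n_boot) - 1]
    return low, high

# Toy example: 4 cases repeated to give the bootstrap something to resample
labels = [0, 0, 1, 1] * 10
scores = [0.1, 0.4, 0.35, 0.8] * 10
print(auc(labels, scores))                     # point estimate
print(bootstrap_auc_ci(labels, scores, 500))   # (lower, upper) bound
```

A wide interval on a small validation set is exactly the kind of detail a glossy brochure omits, which is why asking for the intervals, not just the headline AUC, matters.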
Usability and integration are also very important, and check with the vendor on the relevant documentation here, he advised.
No single European database of AI currently exists, Harvey explained, but one is being created: the European database on medical devices (EUDAMED). This aims to provide "a living picture of the lifecycle of medical devices that are made available in the European Union," and it will integrate different electronic systems to collate and process information about medical devices and related companies (e.g. manufacturers), according to the European Commission.
He also urges end users to refer to the Evaluating Commercial AI Solutions in Radiology (ECLAIR) guidelines, which offer an easy-to-understand checklist of the 10 most important questions to ask vendors. Harvey was a co-author of this 2021 document.
Another resource he recommends is the online overview of CE-marked AI software products for clinical radiology based on vendor-supplied product specifications (www.aiforradiology.com), produced by Kicky van Leeuwen and colleagues at Radboud University Medical Center in Nijmegen, the Netherlands.
Also worth checking out is the Manufacturer and User Facility Device Experience (MAUDE) database of reports of adverse events involving medical devices. This searchable database contains 10 years of data.
"There are increasing numbers of reports of adverse events and errors involving AI," Harvey stated.
From 1 January 2023, the UK Conformity Assessed (UKCA) certification mark will become mandatory. It indicates conformity with the applicable requirements for products sold within Great Britain.