Get AI companies to provide more evidence, Harvey urges


Manufacturers of healthcare technology, including artificial intelligence (AI), are "still learning what they can and can't get away with" in terms of the claims they make about the benefits of their products, according to AI expert Dr. Hugh Harvey.

Harvey, a U.K. radiologist who is now managing director of Hardian Health, a clinical digital consultancy, had a clear message for would-be purchasers of technology, including AI: "When you hear someone mention a number, a fact or a figure, ask them for the evidence to back it up."

Dr. Hugh Harvey.

Harvey told attendees at the recent British Institute of Radiology (BIR) annual congress that the AI sector is evolving and still in a "nascent stage," and regulators are working out how to adapt oversight rules. While some vendors would make claims for technology products they couldn't substantiate, "it's still kind of up for debate who is going to regulate and police those claims," he said.

"I think a voluntary sign-up to a code of conduct would be a good start for vendors. But we're still in a bit of a 'Wild West' with the technology -- the truth is no one can police all the claims that people are making," he told BIR attendees.

Harvey sits on an advisory panel with the U.K. regulator, the Medicines and Healthcare products Regulatory Agency (MHRA), which is examining how to ensure better regulatory rigor around medical devices, including software products and AI. There was no separate regulation for AI, he emphasized.

"Algorithms are the new drugs," he said. "Drugs, when they come to market, go through four phases of clinical studies, they go through regulatory approvals, there's a code of conduct on how they can be marketed, sold and commercialised, there are mechanisms for post-market follow up and mechanisms for reporting of adverse events.

"All these things apply to software, which is the newest component of medical device regulations," he said.

The complexity of software as a medical device (SaMD) and AI as a medical device (AIaMD) -- and their use in health and social care applications -- has grown in ways that could not have been envisaged when medical device regulations were first framed.

The evolving situation in the U.K. post-Brexit

In the U.K., the MHRA launched a consultation late last year on proposed changes to the regulatory framework for medical devices. The aim is to better protect patients, support "responsible innovation" in digital health, and ensure that the regulation of SaMD is clear, effective, and proportionate to the risks such devices present.

Harvey said the evidence standards for AI as a medical device are not set out prescriptively in current medical device regulations, and a number of key regulatory issues remain to be resolved.

"What happens when you approve a medical device and it's on the market, and it slowly gets changed or adapts over time?" he asked. "This is the first time where we're having to think about how things interact in a human-AI combination."

Harvey told BIR delegates that post-market surveillance and the requirement to make clinical evidence available for public scrutiny were being strengthened. He said that, following Brexit, the U.K. could consider emulating a more centralized regulatory model like that of the U.S. FDA, and that it would be politically awkward simply to "copy and paste" European Commission framework regulations after leaving the European Union.

But this was not something for people to be worried about, he told delegates in a Q&A session.

"In some ways it's a good thing. I think for the sake of the U.K. health economy, we need to align with somebody, because we can't just stick out on our own in our post-Brexit glory, pretending everything's alright," he said.
