LLMs likely can’t be medical devices (today)
Reading: Are LLM-based ambient scribes and clinical summarisers medical devices?, Dr Hugh Harvey and Mike Pogose, 27 September 2024.
If you have software with a medical purpose, it is subject to regulation. Having a human in the loop is not a ‘get out’:
> Plenty of existing AI tools provide outputs for human review and sign off already, as seen in the explosion of radiology and pathology tools, and they are all medical devices.
The article gives insight into why an LLM might count as a medical device, drawing on guidance from various geographies.
It reminds me of a short MHRA post from 2023 looking at Large Language Models and software as a medical device:
> The MHRA remains open-minded about how best to assure LLMs but any medical device must have evidence that it is safe under normal conditions of use and performs as intended, as well as comply with other applicable requirements of medical device regulation.
They lay out the challenges:
- It’s likely LLMs “will count as software of unknown provenance (SOUP)”, especially, I’d think, if you don’t know the training data.
- Given these unknowns, it “may prove troublesome” to comply with software quality standards.
> While it may be difficult for LLM-based medical devices to comply with medical device requirements, they are not exempt from them […]