One in five GPs is using generative AI tools to help with their work, despite the risks involved and a lack of training.
GPs are using commercially available tools like ChatGPT to write documentation and even suggest alternative diagnoses for patients, a new study in the British Medical Journal shows.
It’s the largest study of its kind and shows for the first time the extent of AI use in the UK’s GP surgeries.
The study’s author, Dr Charlotte Blease, associate professor at Sweden’s Uppsala University, called the extent of AI use by doctors “surprising” because “doctors haven’t received formal training on these tools and they’re still very much a regulatory black hole”.
Tools like ChatGPT have known issues that could be harmful for patients, including their tendency to “hallucinate”, or make things up.
“Perhaps the biggest risk is with patient privacy,” said Dr Blease. “Our doctors may unintentionally be gifting patients’ highly sensitive information to these tech companies.”
“They may [also] embed biases within care so some patients may be at risk of unfair clinical judgements.
“We don’t know if [AI’s biases] are worse than what arises in ordinary human health care, but there certainly is a risk of bias.”
The team surveyed more than 1,000 doctors. Of the one in five who said they use generative AI in their work, 28% said they use it to suggest alternative diagnoses for their patients.
Another 29% said they use AI to generate documentation after patient appointments.
A majority of NHS staff support the use of AI to help with patient care, and 81% are in favour of its use for administrative tasks, according to a Health Foundation study published in July.
The NHS offers little guidance for doctors on how they should use AI, despite healthcare professionals wanting to use it more.
Instead, it asks them to use their “professional judgement” when working with the technology.
Dr Blease, who is also the author of a book on how AI can be used in healthcare, said doctors are “crying out for some concrete advice” on how to use the technology.
“There does need to be targeted training and advice being offered to doctors,” she said.