While the government cheers the growing use of AI in doctors’ surgeries, it also opens them up to potential clinical negligence claims.

The government is trumpeting the use of AI in the health service. There were 31.4 million GP appointments in March – up 6.1% on the same period a year earlier and a fifth (19.8%) higher than pre-pandemic – thanks to upgraded technology. 

“For some patients, contacting their practice using online methods can be more convenient and easier to access care for their health needs,” said Amanda Doyle, national director for primary care and community services. 

The government has highlighted Lingwell Croft Surgery in Leeds, which uses voice recognition technology as part of its telephone contact system. Voice messages are automatically and immediately transcribed so patients can be quickly triaged and directed to the right health and care professional.

Two GPs work alongside two care navigators to triage patient requests. The GPs prioritise the most complex cases, while other patients are directed to services more appropriate for their needs, such as the nurses, physios or paramedics working in the practice team.

AI in healthcare

Improving access

“Improving access to general practice is an NHS priority and GP teams are delivering 30 million appointments every month – up almost a fifth since before the pandemic,” said Bola Owolabi, a GP and NHS England’s director of healthcare inequalities. 

Certainly, the drive to use AI appears to have been accelerating. At the end of last year, Healthcare Today wrote about NHS North East London, which is using AI screening technology to identify patients at high risk of needing unplanned emergency care. A three-year programme was launched in collaboration with Health Navigator, a provider of AI-powered prediction and prevention software, and the academic health science centre UCLPartners to provide preventative care for patients with long-term conditions.

Forecasting models estimate that the programme will save 26,673 unplanned bed days in North East London hospitals across the three years of the programme, with an anticipated reduction of 13,000 A&E attendances a year.

Potential dangers

But while the cost and time advantages of using AI for services like triage are clear, there are also potential dangers for GP services, chief among them liability for clinical negligence claims arising from the use of voice recognition software.

There are also concerns about whether the software can understand different UK accents, as well as patients who speak English as a second language; the underlying worry is bias. AI-enabled products can carry a high potential for bias due to limitations in the data and processes used to train AI models.

This has been recognised by the NHS itself. In its new guidance on the use of AI-enabled ambient scribing products in health and care settings, it warns: “NHS organisations may still be liable for any claims arising out of the use of AI products particularly if it concerns a non-delegable duty of care between the practitioner and the patient”. 

To mitigate these dangers, it recommends “clear and comprehensive contracting arrangements” with suppliers that set out their roles, responsibilities and liability appropriately.