Call for Nottinghamshire County Council to keep human “empathy and fairness” during AI services trial
Artificial intelligence technology being used in Nottinghamshire council services should not replace “empathy and fairness” for residents, a committee has warned.
The authority is trialling its use to help streamline work in departments including social care and education.
Nottinghamshire County Council’s overview committee met on January 23 to examine its impact and long-term implications for residents.
The launch of applications such as ChatGPT in November 2022 saw artificial intelligence (AI) become more ingrained in people’s daily lives.
It has offered people methods of speeding up some daily tasks, such as summarising large documents, writing basic reports and helping to draft emails.
ChatGPT is an example of generative AI, which uses algorithms and models to learn patterns from large datasets and then produce new content following those patterns.
AI can ‘hallucinate’, which is when it perceives patterns that do not exist and generates incorrect information but presents it as fact.
Council documents state: “Generative AI needs to be deployed cautiously with human oversight and critical thinking to ensure the output is accurate.”
Nottinghamshire was part of a Microsoft pilot scheme in October 2023, under which the county council has been trialling Microsoft 365 Copilot, Microsoft’s AI tool embedded in the Microsoft 365 apps.
The council initially took up as many as 300 licences for the tool.
The committee heard that AI tools do not come cheaply and that it has been “challenging to identify savings” from those who took up the licences.
Since then, council staff have been able to test the AI bot for four to six months before being charged, allowing them to assess where their work may benefit from AI assistance.
The council has identified that AI could help save time in the transcription of formal meetings in adult social care and public health, and also in the production of education, health and care plans for children with special educational needs and disabilities.
The authority has already seen benefits in work efficiency during its pilot in the summarising of meetings and documents. The tool has also helped with document creation, although assessing time savings has been difficult.
Steve Carr said: “[AI] will improve efficiency, accuracy and accessibility, but not at the cost of empathy and fairness — how do we ensure that we do it and retain empathy and fairness?”
Paul Martin, head of technology and digital at the council, replied: “The way you provide services, it’s unlikely you’re going to allow it to be autonomous, you’re going to have to have some form of human oversight.
“With Education, Health and Care Plans and the provisions from that, there is human oversight that’s required as well.
“Also looking at what’s appropriate [for human oversight] — technology is here, it’s not going anywhere and it’s something we do need to start embracing in how we best use it in the council.”
Kate Foale echoed Mr Carr’s empathy concerns later in the meeting in relation to residents who may be in sensitive circumstances, such as children in care.
Mr Martin responded: “Empathy engines have been written [into AI bots], so if you’re on the phone it can tell you this person’s getting angry, upset, cheerful and it’s ever so accurate.
“At some point that will come, we’re not there — we need to think about the basics like how we support our workforce to do their jobs more efficiently.”
Mr Purdue-Horan welcomed AI use becoming more expansive at the council.
He said: “Over the last six months I’ve seen the effects of human error with the administration of SEND [Special Educational Needs and Disabilities] — this week an individual was asked to communicate with a staff member of the council who had been deceased for two to three months.
“I don’t think we ourselves should be biased towards perfect human activity — there is human error.”