Artificial intelligence systems are being used to record and summarize conversations between social workers and children in councils across England and Scotland, as reported by the Guardian. But social workers told the outlet that some AI-generated transcripts include "hallucinations," or fake claims, including references to suicidal ideation that were never discussed.
An eight-month study by the Ada Lovelace Institute found that "some potentially harmful misrepresentations of people's experiences are occurring in official care records."
What's happening?
According to the report, dozens of councils, including Croydon as well as Redcar and Cleveland, have given social workers access to AI transcription tools such as Magic Notes and Microsoft Copilot. The Ada Lovelace Institute said in its research that one social worker found the tool had incorrectly documented suicidal ideation even though the client didn't bring it up at all.
Another social worker told researchers that the AI notes sometimes referred to "fishfingers or flies or trees" when a child was describing their parents fighting. A third tried to redraft their notes in a different tone, only for the AI system to add "all these words that have not been said." The British Association of Social Workers said it is up to social workers to properly check the notes for errors, and there have been reports of disciplinary action in cases where errors went unchecked.
Why is ethical AI-use important?
Although it's used as a tool by adults in this scenario, integrating AI into children's spaces has drawn criticism from some who say it's causing developmental harm and "reshaping" their childhoods.
Children who interact with AI systems without individualized instruction may face developmental and emotional risks. So, when AI is integrated into a social service or health care setting, inaccurate information can seriously undermine the quality of care and documentation these children receive.
AI errors have also appeared in other sensitive contexts. For example, an AI-generated police report in Utah included a false claim that an officer encountered a frog, a detail that sounds harmless on its own but shows how easily fabricated content can slip into official records. The more we rely on these tools without human oversight, the more frequent and consequential such errors may become, with real effects on children's futures.
What's being done about AI tools in social work?
The research also found that these tools can free up time for social workers to focus on the client relationship. On the regulatory front, BASW is calling on regulators to issue clear guidance on how and when AI tools should be used, per the Guardian.
Seb Barker, co-founder of Beam, which operates Magic Notes, said the tool produces a first draft and automatically checks for hallucinations, meaning social workers are still expected to review and finalize the notes themselves.
Individuals who want to support ethical AI use in general (since the technology is also extremely resource-intensive) can learn how to invest in clean initiatives and how overusing AI without humans at the forefront can further contribute to critical climate issues.