UK Tech Firms and Child Safety Agencies to Examine AI's Ability to Generate Exploitation Images

Technology companies and child safety agencies will be granted permission to assess whether AI tools can generate child exploitation material under recently introduced UK legislation.

Significant Increase in AI-Generated Illegal Content

The declaration coincided with revelations from a safety monitoring body showing that cases of AI-generated CSAM have more than doubled in the last twelve months, growing from 199 in 2024 to 426 in 2025.

New Legal Structure

Under the changes, the government will allow approved AI companies and child protection groups to examine AI systems – the underlying technology for conversational AI and image generators – and verify they have adequate safeguards to prevent them from creating images of child exploitation.

The measures are "fundamentally about stopping abuse before it happens," stated Kanishka Narayan, adding: "Experts, under strict conditions, can now identify the risk in AI systems early."

Tackling Regulatory Challenges

The changes have been introduced because it is against the law to create and possess CSAM, meaning that AI developers and others cannot generate such images as part of a testing process. Until now, officials had to wait until AI-generated CSAM was uploaded online before they could act on it.

This legislation is designed to prevent that issue by helping to halt the production of those materials at source.

Legal Structure

The government is introducing the amendments as modifications to the Crime and Policing Bill, which also establishes a prohibition on possessing, producing or distributing AI models developed to create exploitative content.

Real-World Consequences

Recently, the minister toured the London base of Childline and heard a simulated call to advisers involving an account of AI-based exploitation. The call portrayed a teenager seeking help after being blackmailed with a sexualised AI-generated image of themselves.

"When I hear about young people experiencing blackmail online, it is a cause of intense frustration in me and of justified concern amongst parents," he stated.

Concerning Statistics

A prominent internet monitoring organization stated that instances of AI-generated abuse content – such as online pages that may contain numerous images – had significantly increased so far this year.

Cases of the most severe content – the gravest form of exploitation – rose from 2,621 visual files to 3,086.

  • Girls were predominantly targeted, making up 94% of illegal AI images in 2025
  • Portrayals of newborns to toddlers rose from five in 2024 to 92 in 2025

Industry Response

The law change could "constitute a vital step to ensure AI products are secure before they are launched," commented the head of the online safety foundation.

"AI tools have made it so victims can be targeted all over again with just a few clicks, giving offenders the ability to make potentially endless amounts of sophisticated, photorealistic exploitative content," she added. "Material which further exploits survivors' trauma, and renders children, particularly female children, less safe on- and offline."

Support Interaction Data

The children's helpline also published data on support interactions where AI has been referenced. AI-related risks discussed in the sessions include:

  • Using AI to rate weight, body and looks
  • Chatbots discouraging young people from talking to trusted adults about harm
  • Being bullied online with AI-generated content
  • Digital blackmail using AI-faked pictures

Between April and September this year, Childline delivered 367 counselling sessions where AI, conversational AI and associated topics were discussed, four times as many as in the same period last year.

Half of the AI references in the 2025 sessions related to psychological wellbeing, including the use of AI assistants for support and AI therapy apps.

Patrick Barrett