Wubi News

Call for ban on AI apps creating naked images of children

2025-04-28 19:00:06

The children's commissioner for England is calling on the government to ban apps which use artificial intelligence (AI) to create sexually explicit images of children.

Dame Rachel de Souza said a total ban was needed on apps which allow "nudification" - where photos of real people are edited by AI to make them appear naked.

She said the government was allowing such apps to "go unchecked with extreme real-world consequences".

A government spokesperson said child sexual abuse material was illegal and that there were plans for further offences for creating, possessing or distributing AI tools designed to create such content.

In February the Internet Watch Foundation (IWF) - a UK-based charity partly funded by tech firms - confirmed 245 reports of AI-generated child sexual abuse imagery in 2024, up from 51 in 2023 - a 380% increase.

"We know these apps are being abused in schools, and that imagery quickly gets out of control," IWF Interim Chief Executive Derek Ray-Hill said on Monday.

A spokesperson for the Department for Science, Innovation and Technology said creating, possessing or distributing child sexual abuse material, including AI-generated images, is "abhorrent and illegal".

"Under the Online Safety Act platforms of all sizes now have to remove this kind of content, or they could face significant fines," they added.

"The UK is the first country in the world to introduce further AI child sexual abuse offences - making it illegal to possess, create or distribute AI tools designed to generate heinous child sex abuse material."

Dame Rachel also set out further recommendations for the government.

Paul Whiteman, general secretary of school leaders' union NAHT, said members shared the commissioner's concerns.

He said: "This is an area that urgently needs to be reviewed as the technology risks outpacing the law and education around it."

Media regulator Ofcom published the final version of its Children's Code on Friday, which places legal requirements on platforms hosting pornography, or content encouraging self-harm, suicide or eating disorders, to do more to prevent children from accessing it.

Websites must introduce stronger age checks or face significant fines, the regulator said.

Dame Rachel has criticised the code, saying it prioritises the "business interests of technology companies over children's safety".