ARTICLE
19 May 2025

Children's Commissioner For England Calls For More Legislation To Stop Child Nudification Apps

Lewis Silkin

This week, we cover a rather grim topic - child nudification apps. These are tools that create naked images of children and teenagers, often using existing photos from the internet - effectively deepfakes. They can also be generated through more benign-seeming apps, such as those that let you "try on" different clothes on an avatar. Generative AI, which is often free to use and easy to programme, has supercharged the growth of these tools. Although the technology is relatively new, the high risk of harm it presents to children is increasingly evident.

Making such an image is already illegal, and the Children's Commissioner for England has now issued a report calling for the actual tools to be illegal as well. She says that any individual or organisation motivated by the idea of making profit by creating a tool that supports the exploitation of a child must be held to account.

As a result, she has called on the UK government to:

  • Ban nudification apps;
  • Introduce specific legal responsibilities for the companies developing GenAI tools to screen their tools for nudifying risks to children and mitigate them;
  • Provide children with an effective route to have sexually explicit deepfake images of themselves removed from the internet; and
  • Commit to making the online world safer for girls, by recognising sexually explicit deepfake abuse - and bespoke services used to carry this out - as acts of violence against women and girls.

Current and planned law

In early 2025, the UK Government announced that it would introduce provisions in the Data Use and Access Bill to protect against the creation of sexually explicit AI content. The creation of intimate images of adults is dealt with in the Crime and Policing Bill. Both Bills are currently passing through the parliamentary process. They place criminal responsibility on individuals, but they will not make the AI models that create this material illegal.

The Online Safety Act 2023 places legal responsibility on online platforms operating in the UK to prevent UK users from encountering illegal content online, including child sexual abuse material. We wrote about Ofcom's new Children Codes last week, which will come into force in July 2025. They aim to improve the protection of children from services designed for adults and from seeing certain harmful content, and Ofcom has launched an investigation into a nudification service this week. However, the rules do not make it illegal to provide nudifying services.

What the Children's Commissioner suggests

The Commissioner has made several suggestions for action:

She says that the UK government should legislate to explicitly ban AI tools that are designed or marketed as nudification services. This could be achieved in several ways - for example, by amending the Product Safety and Metrology Bill to broaden the definition of "products" to include online content and digital replicas. This would ensure that providers of products or digital products that rely on an AI system carry out risk assessments for illegal and harmful activity and take reasonable steps to design that risk out of the product.

She also calls for the government to introduce an AI Bill in this parliamentary session. Such a Bill would make providers of GenAI models responsible for preventing their use to nudify children. It should make it a legal requirement for technology companies to test whether their products can be used to nudify children before launching them in the UK. Compliance with this should be outcomes-based - that is, providers of GenAI models that are used by individuals to nudify children would be held accountable for that.

In the meantime...

In the short term, while the necessary legislation is being prepared, the Commissioner says that Ofcom must fully enforce the Online Safety Act, including ensuring that the safety duties held by Part 5 services are fulfilled effectively. Nudification services, as providers of sexually explicit or pornographic material, are in scope of this. This will mean that nudification services will be required to verify that users are over 18 before allowing them access to content.

Ofcom should also strengthen the Children Codes to ensure the risk assessment process is proactive against emerging harms, including proactive inputs to assess the risk of children encountering harmful content, such as pornographic material.

The government should provide the necessary support for children to report an intimate image that has been shared in a public online space, including false images created using AI, and to get it removed. Ofcom should include this in the Illegal Harms Codes.

Finally, the Commissioner proposes that education about these tools should be provided on the school curriculum, and that the government should recognise the use of technology to create non-consensual sexually explicit deepfakes as an act of violence against women and girls and commit to ending it in its Tackling Violence Against Women and Girls strategy, due to be released later this year.

Ofcom has said that it will be consulting on additional measures, including on how to tackle child sexual abuse involving AI.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
