ARTICLE
9 July 2025

Ofcom Consults On Additional Online Safety Measures

Lewis Silkin

Contributor

Ofcom is consulting on proposals for a series of additional safety measures to strengthen its "first-edition" Codes of Practice on Illegal Content and the Protection of Children. The Illegal Content Codes of Practice are already in force, and the Protection of Children Codes of Practice will come into force on 25 July 2025. The additional measures include wider use of automated technologies to detect harm and additional steps to ensure that services are safer by design. Ofcom says that its proposals take account of the latest developments in harms and technologies, as well as a range of evidence captured through its ongoing engagement with service providers, civil society, law enforcement and members of the public.

Stopping illegal content going viral

Ofcom wants to ensure that platforms are doing more, earlier, to prevent illegal content from spreading. That means having protocols in place to respond to spikes in illegal content during a crisis, such as the riots that followed the Southport attacks last year. Ofcom also wants measures to ensure that users are not recommended potentially illegal material until it has been checked. In addition, it is consulting on better protections around livestreams, with improved reporting functions and human moderators available at all times. Finally, it wants to take action against users who share or upload illegal content and content harmful to children.

Tackling harms at source

Huge volumes of content appear online every day, and Ofcom says that, as a result, providers need to make better use of technology to prevent illegal material from reaching users. It proposes that providers use hash-matching technology to identify intimate image abuse and terrorism content. It also thinks that some services should go further by assessing the role that automated tools can play in detecting a wider range of content, including child abuse material, fraudulent content and content promoting suicide and self-harm, and by implementing new technology where it is available and effective.

Affording further protections to children

Ofcom acknowledges that children remain at risk of some of the most egregious harms online. It is therefore proposing wider use of highly effective age assurance to help protect children from grooming. In addition, it is recommending that users should no longer be able to interact with children's livestreams through comments and gifts. Finally, it is proposing that platforms take action to prevent individuals who share child sexual abuse material from using the service.

The consultation ends on 20 October 2025, and details of how to submit a response can be found here.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
