ARTICLE
17 October 2025

Online Safety Update: Regulatory Developments In The UK And The EU

Lewis Silkin


Over the past few weeks, the UK government has announced further steps to strengthen protections against harmful online content, while Ofcom has provided updates on its ongoing investigations and enforcement activities under the Online Safety Act. Meanwhile, the European Commission continues to advance its regulatory framework to protect children and ensure compliance with the Digital Services Act. This article describes the latest developments, highlighting key changes, enforcement actions, and the implications for online platforms and users.

Cyberflashing to become priority offence in the UK

At the end of September, the UK government announced that it will make cyberflashing a "priority offence" under the Online Safety Act. This means stricter legal requirements on tech companies to actively prevent unsolicited nude images from being shared on their platforms, rather than reacting after the fact. The government suggests that companies could tackle these images by using automated systems that pre-emptively detect and hide them, by implementing moderation tools, or by adopting stricter content policies.

Cyberflashing became a criminal offence in England and Wales in January 2024, under the Online Safety Act 2023. The law made it illegal to send unsolicited sexual images with intent to cause alarm or distress, or for sexual gratification. Prison sentences of up to two years are possible. Under the Online Safety Act, failing to proactively implement measures to protect users can lead to fines of up to 10% of a company's qualifying worldwide revenue and, potentially, the blocking of its services in the UK.

The regulations will come into force 21 days after they are made, following approval by both Houses of Parliament. The government says that it expects the statutory instrument to be laid this autumn.

Ofcom issues update on Online Safety Act investigations

This week, Ofcom issued an update on some of the investigations it has been carrying out. Since the Online Safety Act came into force, it has launched five enforcement programmes and opened 21 investigations into the providers of 69 sites and apps. The update covers 11 of those investigations, grouped into several themes.

File-sharing services now using automated tech to tackle CSAM following Ofcom enforcement

Ofcom has been assessing measures taken by file-sharing services, which are frequently exploited by offenders to distribute child sexual abuse material (CSAM) at scale. Ofcom identified serious compliance concerns with two services, both of which have now deployed perceptual hash-matching technology: an automated tool that can detect and swiftly remove CSAM before it spreads further. Ofcom will not be taking further action against these two services.
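For illustration only, the sketch below shows the general principle behind perceptual hash-matching: an uploaded image is reduced to a compact perceptual hash and compared, by Hamming distance, against hashes of known prohibited images. The imagehash library, the threshold value and the blocklist are assumptions made for the example; they do not represent Ofcom's, or any provider's, actual system.

```python
# Minimal sketch of perceptual hash-matching, assuming the open-source
# "imagehash" and Pillow libraries; the blocklist and threshold below are
# illustrative placeholders, not any provider's real configuration.
import imagehash
from PIL import Image

# Hypothetical blocklist: perceptual hashes (hex strings) of known prohibited images.
KNOWN_BAD_HASHES = [imagehash.hex_to_hash(h) for h in ["a1b2c3d4e5f60718"]]

# Maximum Hamming distance at which two hashes are treated as a match.
MATCH_THRESHOLD = 5

def is_prohibited(path: str) -> bool:
    """Return True if the image at `path` matches a known prohibited hash."""
    candidate = imagehash.phash(Image.open(path))  # 64-bit perceptual hash
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_BAD_HASHES)

if __name__ == "__main__":
    print(is_prohibited("upload.jpg"))  # hypothetical uploaded file
```

Because perceptual hashes tolerate small changes such as resizing or re-encoding, matching on distance rather than exact equality is what allows near-duplicates of known material to be caught before they spread.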

Information requests should not be ignored

Ofcom routinely issues formal information requests, and firms are required by law to respond to all such requests in an accurate, complete and timely way. One key platform has neither responded to a request for a copy of its illegal harms risk assessment, nor to a second request relating to its qualifying worldwide revenue. As a result, Ofcom has issued a fine of £20,000 and is imposing a daily penalty, which will run until the required information is provided or for sixty days, whichever is sooner.

Ofcom has also issued two provisional decisions against a file-sharing service and a pornography service provider for similar failures to respond to statutory information requests, and has provisionally decided that a provider is failing to comply with its duty to put highly effective age checks in place to protect children from encountering pornography. In addition, it has expanded the scope of its investigation into another pornography service provider to consider whether that provider failed to respond adequately to a statutory information request issued as part of Ofcom's investigation into its use of age assurance to prevent children from encountering pornographic content.

Monitoring services which take steps to stop UK users from accessing them

In response to Ofcom's enforcement action, providers of some services are now geoblocking their sites to prevent access from the UK, and Ofcom has therefore closed those cases. Ofcom also investigated an online suicide forum which has implemented a geoblock and removed messaging from its landing page that promoted ways for UK users to circumvent the block. Ofcom is clear that services that choose to block access by people in the UK must not encourage or promote ways to avoid these restrictions. It is keeping the forum on its watchlist, and its investigation remains open while it checks that the block is maintained and that the forum does not encourage or direct UK users to get around it. Since the requirements to protect children came into force in July, there has been an increase in the number of UK users turning to VPNs, and concerns that these will be used to circumvent the rules.
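As a simple illustration of what geoblocking UK traffic can look like at the application layer, the sketch below rejects requests whose origin has been resolved to the UK. Flask is used purely for brevity, and the X-Country-Code header is a hypothetical value assumed to be set by an upstream CDN or geo-IP service; real deployments typically block at the network or CDN edge rather than in application code.

```python
# Minimal sketch of application-level geoblocking, assuming an upstream
# CDN or geo-IP service sets a (hypothetical) X-Country-Code header.
from flask import Flask, abort, request

app = Flask(__name__)
BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 alpha-2 code for the United Kingdom

@app.before_request
def block_uk_traffic():
    # Reject the request before any route handler runs.
    country = request.headers.get("X-Country-Code", "").upper()
    if country in BLOCKED_COUNTRIES:
        # 451 Unavailable For Legal Reasons signals a legally motivated block.
        abort(451)

@app.route("/")
def index():
    return "Content available outside blocked regions."

if __name__ == "__main__":
    app.run()
```

Note that a block of this kind is only as reliable as the geolocation behind it; VPN use, as Ofcom's update acknowledges, can route traffic around it.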

The move to make cyberflashing a priority offence means that platforms will need to consider what proactive measures they are taking to find and remove harmful and illegal content. Ofcom's update underlines the importance of responding to information requests, and we await further updates from Ofcom on its other investigations.

EU developments

As well as the UK developments, the European Commission recently announced that it had issued information requests under the Digital Services Act regarding age verification and how certain large platforms prevent minors from accessing illegal products, such as drugs or vapes, or harmful material, such as content promoting eating disorders. The European Board for Digital Services' Working Group for the protection of minors will identify the platforms that pose the greatest risk to children and check those platforms' compliance with the high level of child safety required under the DSA. The Working Group will also develop and share common tools for investigatory and enforcement steps, with the aim of ensuring consistency across the EU.

The Commission has also issued a second version of its age verification blueprint, which allows passports and identity cards to be used as onboarding methods, in addition to eIDs, to generate a proof of age. It also introduces support for what the Commission describes as a more user-friendly proof presentation method, the Digital Credentials API, which is increasingly available in modern operating systems and browsers.

What does this mean for me?

These recent developments highlight the increasing expectations placed on digital platforms to proactively protect users, particularly children, from harmful and illegal content. The UK's move to make cyberflashing a priority offence, coupled with Ofcom's robust enforcement actions, may signal a more stringent approach to online harms. At the same time, the EU's ongoing efforts to harmonise child safety standards and improve age verification processes reflect a broader commitment to safeguarding minors across digital environments. As these regulatory frameworks continue to evolve, platforms must remain vigilant and responsive to new requirements.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
