After a quietish summer, Ofcom and the UK government have been busy making announcements about the Online Safety Act, including new rules, new consultations and more investigations.
New rules on self-harm
Earlier this month, the government announced "urgent action" to toughen the Online Safety Act by putting stricter legal requirements on tech companies to search for and remove material that encourages or assists serious self-harm. Although platforms are already required to protect children from self-harm content, the government recognises that adults battling mental health challenges are equally at risk from exposure to material that could trigger a mental health crisis or worse. The new regulations mean that content encouraging or assisting serious self-harm will be treated as a priority offence for all users.
In practical terms, this means platforms must use cutting-edge technology to actively seek out and eliminate this content before it can reach users and cause irreparable harm, rather than simply reacting after someone has already been exposed to it.
The new regulations will come into force 21 days after they are made, following approval by both Houses of Parliament. The government expects to lay them "in the autumn".
Super-complaints
The Online Safety Act and its implementing regulations allow expert organisations representing users or the public to raise a 'super-complaint' with Ofcom as the key regulator. The purpose of super-complaints is to allow such organisations to bring robust evidence and facts to Ofcom's attention about the most significant online harms and restrictions on free expression arising from regulated online services. Super-complaints should therefore focus on the material risk of significant issues caused by features of sites and apps, or by the conduct of the organisations that provide them, rather than on individual instances of harmful content. Such issues are likely to affect several, many or possibly all users of a service, or groups of members of the public, and to have an enduring effect. Similar regimes already operate elsewhere: in the past, organisations have made super-complaints to bodies such as the CMA and FCA on consumer-related issues. Ofcom is required to publish guidance on super-complaints and is currently consulting on a draft version. The consultation ends on 3 November 2025, and organisations can bring super-complaints from 31 December 2025.
Online safety fees
Ofcom has published a third consultation on implementing the online safety fees regime under the Online Safety Act 2023 and the Online Safety Act 2023 (Fees Notification) Regulations 2025. The Act requires Ofcom's operating costs for the online safety regime to be recovered through fees imposed on providers of regulated services. Providers are required to notify Ofcom in certain circumstances in relation to the fees regime, unless they are exempt. Their notifications must include details of all their regulated services and, in most cases, details of their qualifying worldwide revenue (QWR). Ofcom has proposed guidance to help providers of regulated services prepare their QWR Returns and navigate the notification process. Specifically, it covers the practicalities of how and when to notify, and the details and evidence that should be included in notifications. The consultation ends on 1 October 2025.
Ofcom expects the duty for providers to notify to come into effect in late Q4 2025, once the Secretary of State has set the qualifying worldwide revenue threshold at (or beyond) which fees become payable (the QWR threshold). This will trigger the opening of a four-month notification window for the initial 2026/2027 charging year, during which providers liable for fees must notify Ofcom of their QWR.
Retaining information about a child's activity
Ofcom is also consulting on guidance for online platforms setting out the information they will be required to retain about a child's activity if the child's death is investigated by a coroner. The Data (Use and Access) Act 2025 grants Ofcom new powers to help bereaved parents get answers when social media activity is linked to the death of their child. From 30 September 2025, Ofcom will be able to require tech firms to preserve data about a deceased child's activity on their platform, if requested by the coroner investigating the child's death. It has published proposed guidance on the information tech firms will need to retain in these situations. The consultation ends on 28 October 2025.
Media literacy
Ofcom is consulting on recommendations designed to give the public the skills and information needed to engage critically and safely with the content they see. Ofcom recognises that services vary in size and reach, so its proposed recommendations are designed with proportionality in mind. Although the recommendations are not mandatory, Ofcom considers them good practice to help shape services' approach to promoting media literacy. The consultation ends on 8 December 2025.
Investigations
On the investigatory side, Ofcom has published details of further investigations regarding age verification, as well as failures to respond to statutory information requests. We await the outcome of those investigations and any sanctions that may follow. If you need any help with your risk assessments, or advice on the types of measures you should be taking, please contact the team.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.