It has been a busy month for online safety developments, and we summarise them for you below.
New priority offence under Online Safety Act
The depiction of strangulation in pornography will be made an offence by the Crime & Policing Bill, which is currently going through Parliament, and will be designated as a priority offence under the Online Safety Act. The government says that platforms will therefore be held accountable and will have to make sure that such content does not spread, given that it can normalise harmful practices in people's private lives. Platforms will be required to take proactive steps to prevent users from seeing illegal strangulation and suffocation content. This could include using automated systems to pre-emptively detect and hide the images, deploying moderation tools, or adopting stricter content policies to prevent abusive content from circulating. This follows the recent designations of cyberflashing, and of material that encourages or assists serious self-harm, as priority offences.
New powers to scrutinise AI models
In addition, the government has announced a further amendment to the Crime & Policing Bill. While possessing and generating child sexual abuse material, whether real or synthetically produced by AI, is already illegal under UK law, improvements in AI image and video capabilities present a growing challenge. The amendment will therefore permit designated bodies, including AI developers and child protection organisations such as the Internet Watch Foundation, to scrutinise AI models and ensure that safeguards are in place to prevent them from generating or proliferating child sexual abuse material, including indecent images and videos of children. Currently, because creating and possessing this material attracts criminal liability, developers cannot carry out safety testing on AI models, and images can only be removed after they have been created and shared online. The measure aims to ensure that AI systems' safeguards can be robustly tested from the start, so as to limit production in the first place. The new laws will also mean that organisations can check that models have protections against extreme pornography and non-consensual intimate images.
Update on suicide forum investigation
Ofcom published an update on its investigation into an online suicide forum under the Online Safety Act. The BBC had also been investigating the forum and said it could be linked to over 50 suicides. On 1 July 2025, the forum implemented a voluntary block to restrict users with UK IP addresses from accessing the service. Ofcom said that it would not be taking further action but that it would be actively monitoring these restrictions to check they are maintained consistently and to make sure the service does not promote or encourage ways for UK users to avoid them. Ofcom has now said that it has reason to believe, from evidence provided to it by the Samaritans on 4 November 2025, that the service is available to UK users. As a consequence, it is now progressing its investigation as a priority, and it aims to reach a conclusion as swiftly as it can. Ofcom says that it will provide further updates on this investigation as soon as possible.
Calls for evidence about age checks and use by children of app stores
As well as this, Ofcom has issued two calls for evidence about the use and effectiveness of age checks by online services, and the use of app stores by children. Ofcom is required to produce separate reports on these two issues under the Online Safety Act. It is therefore gathering views about:
- How providers of regulated services have used age checks to comply with their online safety duties; how effective they have been; and whether there are factors that have prevented or hindered the effective use of age checks.
- The role app stores play in children encountering harmful content; how app stores currently use age checks and their effectiveness; and whether greater use of age checks or other safety measures at app store level could improve children's online safety.
The calls for evidence close on 1 December 2025, so the window for responding is short. Taking responses and other evidence into account, Ofcom will publish and submit to the UK Government a report on the use and effectiveness of age assurance by July 2026, and a report on the use of app stores by children by January 2027.
Lords Select Committee report on repealed regulations
Last month, the government repealed two Regulations that would have brought into force requirements in the Online Safety Act 2023 for user-to-user services to report child sexual exploitation and abuse content to the National Crime Agency. It was not clear at the time why the Regulations were being repealed, but the House of Lords Secondary Legislation Scrutiny Committee has now said that the pause in implementation was necessary because of delays in setting up the portal that will enable in-scope companies to submit reports. The Committee expressed its concern, saying that "whilst we understand that technology projects can meet unexpected obstacles, this is regrettable given the possible impact on the detection and reporting of serious crime and on the protection of victims and potential victims". It also criticised the lack of information about why the Regulations were repealed, saying that "the explanatory memorandum for any instrument that reverses or pauses a recent policy should, alongside reasons for the change of approach, explain how long the delay will be and what the consequences are of not implementing the policy."
Ofcom fines nudification site £50,000 for failing to introduce age checks
Ofcom has issued a £50,000 fine against the provider of a nudification site for failing to use age checks to protect children from online pornography. It has also opened new investigations under its age assurance enforcement programme into five providers which together operate 20 pornography sites. It has prioritised action against these companies based on the risk of harm posed by the services they operate, taking particular account of their user numbers, including where it has seen significant increases in user traffic since age-check laws came into force in the summer. Separately, it is expanding its ongoing investigations into Cyberitic, LLC and the provider of xgroovy.com to decide whether they have also failed to adequately respond to Ofcom's formal requests for information. These new cases take the number of sites and apps currently under investigation by Ofcom under the Online Safety Act to 76.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.