The UK government has introduced its Online Safety Bill to Parliament. The Bill was amended at the last minute to add new online offences and greater obligations on technology providers to prevent "harmful but not illegal" online activity.

Ever since the UK government first published its draft Online Safety Bill (the "Bill") in June 2021, the merits and faults of the proposed new regime have been fiercely debated on a global stage. Between that first publication and this first formal step on the legislative journey, the Bill has been strengthened by the addition of a range of further online offences and obligations on digital service providers to do more to prevent online harm.

The Bill will affect a wide range of digital service providers, including search engines and platforms hosting user-generated content (UGC) – but affected companies still lack the clarity and detail they need about what will actually be expected of them.

Officially introduced to the UK Parliament in March 2022, the Bill imposes a duty of care on certain online providers to take responsibility for the safety of their UK users. This means that digital service providers, regardless of their location, will be affected if they have a significant number of UK users. The Bill goes further than the legislation contemplated in other jurisdictions (including the EU) by including measures targeting the proliferation of "harmful" content – even if that content isn't necessarily illegal.

Overview

We first wrote about the Bill when it was published in 2021 – see our previous client alert for more detail.

The Bill will impose certain "duties of care" on two categories of online service providers – "user-to-user" (u2u) services and search services – to prevent the dissemination of illegal content and activity online. The first category covers companies that allow users to upload and share their own content. Most obviously, this includes global social media companies but, on closer analysis, the category is very broad and may include any provider that hosts UGC. The second category covers companies that enable users to search multiple websites and databases. Affected companies will have to fulfil specific duties in relation to risk assessments and mitigation – varying according to the content (illegal or harmful), who is likely to access it (children or adults) and the type of service provider. To meet the duty of care, companies will need to put in place systems, terms of service and processes to ensure user safety.

These new responsibilities will essentially force digital service providers to take a bigger role in policing content, while at the same time requiring them to balance their actions against accusations of censorship and restrictions on freedom of expression.

The Bill also appoints Ofcom (the existing regulator of the UK communications and broadcasting sector) as the regulatory enforcement authority, with the power to block sites and to levy GDPR-style fines of up to the greater of £18 million or 10% of global turnover.

New Additions

The original 2021 draft of the Bill was subject to much public scrutiny from parliamentary committees, rights groups, affected companies and campaigners. Following this feedback and pre-legislative scrutiny, new amendments have been proposed that expand the Bill in a number of ways; some are laudable and some present difficult implementation issues for digital service providers:

  • A new duty requires certain digital service providers to put in place controls in relation to paid-for fraudulent advertising. This follows active campaigning by consumer rights groups. Providers of the largest u2u services will need to prevent users from encountering fraudulent advertising and take down any such adverts of which they become aware. 
  • The UK Parliament will need to vote on the definition of harmful (but not illegal) content. This aims to prevent companies providing regulated services from taking an overzealous approach to moderating content. The Bill defines illegal content (such as terrorist content or child sexual exploitation and abuse (CSEA) material), but the types of "legal but harmful" content (which are much less clear-cut) will only be defined in secondary legislation and are open to broad definition (and possible governmental expansion). Platforms will need to explain in their terms of service how they deal with this content.
  • To combat anonymous trolls, u2u service providers will need to implement user identity verification, giving users more controls aimed at reducing online abuse (e.g., users can choose to be contacted only by verified users).
  • There are new criminal offences for users sending harmful communications (e.g., cyberflashing, threats, and harmful or false messages).
  • There will be a duty on pornography sites and other adult content providers to prevent children from accessing or viewing such content (e.g., through age verification and other means).
  • Ofcom will have new powers, including the power to recommend the use of tools for content moderation, user profiling and behaviour identification. Ofcom will also have the power to enter companies' premises to access data and equipment, request information and require companies to undergo external assessments.
  • Personal criminal sanctions can be imposed on senior managers who fail to comply with Ofcom information notices, with the offence taking effect two months after the Bill receives Royal Assent (instead of the two years originally proposed).
  • There are obligations to report child sexual exploitation and abuse content.

Digital Service Providers Operating in the UK and EU

The UK's plans differ from the EU's proposals (the forthcoming Digital Services Act and Digital Markets Act, which make up the EU's legislative duo). This is likely to add both confusion and extra regulatory compliance requirements for affected companies – which include non-EU and non-UK digital service providers.

Notably, the Digital Services Act (DSA) is broader in its scope and aims, capturing many online intermediary services such as network infrastructure. It addresses not only content but also the traceability of business users, researchers' access to data, use of personal data and malfunctioning of services. However, the Bill goes beyond the DSA by including measures to address harmful content. And although both regimes address illegal content, the definition of illegal content could vary from Member State to Member State, adding further complexity.

What's Next?

The Bill will be read in both Houses of Parliament before it receives Royal Assent, so it may be subject to further revisions. Even then, later regulations will determine whether particular digital service providers are regulated by the Bill, depending on their size and functionalities – e.g., what is a "significant" number of UK users? What exactly is harmful content?

Ofcom will also have to draft and publish the mandatory codes of practice – and these steps are likely to significantly delay the new regime coming fully into force.

Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Morrison & Foerster LLP. All rights reserved