The long-awaited, and much-delayed, Online Safety Bill has finally been published by the Government ahead of its legislative journey. Running to 141 sections and 5 schedules, it certainly is a showstopper, though hardly unexpected given the tone and content of the preceding consultative papers and attendant noise. Its objectives: to keep people safe online, to tackle illegal and harmful content, and at the same time to protect freedom of expression. But the Bill is so radical and onerous in parts that it is hard to imagine it surviving its passage intact. Were it to do so, the Internet as we know it, and social media in particular, could be hard to recognise.

We are all familiar with the key drivers for change: the appearance, with depressing regularity, of abhorrent racist and abusive content, and the exposure of children to cyberbullying and sexual grooming, to name but a few examples. The critical policy innovation to remedy this is a departure from the principle that online intermediaries are mere messengers, fixing them instead with the duties and responsibilities of publishers. Quite a challenge when you consider the volume of material which they intermediate.

As expected, the focus of the Bill is on the adequacy of systems and processes, rather than on individual content moderation decisions. Extensive duties of care are imposed on regulated services, that is, user-to-user services and search services. The duty is two-pronged: to make the Internet safer but also to safeguard free speech and journalistic content. All of this is overseen by Ofcom, which is to be given extensive new regulatory powers to ensure compliance.

To be in scope, a regulated service need only have links with the UK - that is, be directed to or accessed by users in the UK - and must either allow users to share, generate or upload content online which may be encountered by others (U2U services), or be, or include, a search engine. Email and text messaging services are specifically exempt, as are services limited to enabling online reviews or with limited U2U functionality.

Duties of Care

Beyond the general duties of care applying to regulated services, there are additional duties for U2U services likely to be accessed by children, and for higher volume/risk services known as Category 1 services, a register of which is to be maintained by Ofcom.

The Duties of Care are manifold. Services are:

  • Required to carry out an illegal content risk assessment;
  • Subject to a duty to prevent or remove illegal content;
  • Subject to duties to protect freedom of expression and privacy; and
  • Subject to further duties around reporting, redress, record keeping and review.

If the service is likely to be accessed by children, it:

  • Must carry out a children's risk assessment;
  • Is subject to a duty to protect children's online safety; and
  • Is subject to additional reporting and redress duties.

In the case of a Category 1 service, it:

  • Must carry out an adults' risk assessment;
  • Is subject to a duty to protect adults' online safety;
  • Is subject to duties to protect freedom of expression and privacy;
  • Is subject to further duties to protect content of democratic importance and journalistic content; and
  • Is subject to additional reporting and redress duties.

Illegal Content Duties

There are two categories of illegal content for these purposes: "ordinary" illegal content, which amounts to a criminal offence; and priority illegal content, which refers to priority offences to be designated in Regulations.

For ordinary illegal content the duty is to take proportionate steps to mitigate and effectively manage the risk of harm as identified in the illegal content risk assessment.

For priority illegal content, it's a duty to operate the service using proportionate systems and processes designed to minimise the presence of such content and swiftly take it down. This is a notably higher duty - though as yet we have no idea which offences it would apply to. It is clearly suggestive of automatic content filtering technology. Moreover, surely it potentially offends against the "no general monitoring" principle of the E-Commerce Directive?

Protecting Children

For content harmful to children, the duty is threefold:

  • to take proportionate steps to mitigate and effectively manage the risk of harm to children in different age groups, as identified in the children's risk assessment;
  • to take proportionate steps to mitigate the impact of harm from content that is harmful to children using the service; and
  • to prevent children from encountering harmful content - the duty is blanket in respect of primary priority content (another concept to be defined in regulations) and nuanced in respect of other harmful content, that is, according to age and susceptibility.

The definition of this "other" harmful content is not straightforward: content creating a material risk of having a significant adverse physical or psychological impact on a child of ordinary sensibilities...

There are further provisions aimed at content which is indirectly harmful, such as content targeted at a group rather than at a specific individual.

Protecting Adults

Lastly, there are the duties in respect of content harmful to adults. This was one of the most controversial aspects of the White Paper, since it expressly included content which, though harmful, is not illegal. As it turns out, the provisions of the proposed Bill are less onerous than expected, and certainly not comparable with those applying to content harmful to children. As with children, there is a twofold categorisation: harmful content, and priority harmful content which is to be defined in Regulations. The duty is merely:

  • to spell out in the provider's terms of service its approach to these types of content; and
  • to ensure that those terms of service are clear, accessible and consistently applied.

This clearly evidences a Government rethink. The contrasts with the provisions applying to content harmful to children are significant. One wonders whether in practice this will add anything to the existing policies and procedures operated by service providers.

But overall these various duties will have a very significant impact on service providers, not least in financial terms: a Government Impact Assessment has estimated a cost of £1.7bn for the requisite content moderation measures, potentially giving rise to a whole new industry in itself.

Counterbalances

Then there is the well-founded fear that this onerous new regime will encourage service providers to take a safety-first approach, thereby potentially curbing freedom of expression. Interestingly, the Bill clearly envisages this possibility and includes various safeguards for freedom of expression as a counterbalance.

All services have a general duty to have regard to the importance of protecting free speech (and privacy rights) when deciding on and implementing safety policies and procedures. For Category 1 services (see earlier), there is a duty to assess the impact of those safety policies and procedures on free speech, and to make known the steps taken to protect it.

Then there are two further sub-sets of the duty to protect freedom of expression, in relation to content of democratic importance and journalistic content; these apply only to Category 1 services.

The duty to protect content of democratic importance refers to content that is specifically intended to contribute to democratic political debate in the UK. Here the duty is to operate the service using systems and processes designed to ensure that the importance of free expression is taken into account when deciding how to treat such content and whether to take action against users sharing it.

In respect of journalistic content - that is, UK-linked content generated for the purposes of journalism, whether by citizens or journalists - the duty is more onerous: to operate the service using systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about that content or user. There is also a duty to have in place dedicated and expedited complaints procedures for dealing with take-downs or other actions against such users.

Commentators have already noted the disparity between the "duty to take account" in respect of freedom of expression content, and the stricter "safety duties" applying to harmful content. The former is a much easier burden to satisfy than the latter, which suggests a lack of balance between these competing interests.

Ofcom

Suffice to say that the Bill contains a plethora of measures around compliance and enforcement by Ofcom, and of course in the area of sanctions. Firstly, Ofcom is mandated to undertake various risk assessments and to draw up various Codes of Practice and Guidance for service providers, subject to extensive duties of prior consultation. Ofcom also determines which regulated services fall within Category 1 (the largest and most popular social media sites), Category 2A (search services) and Category 2B (other U2U services meeting certain threshold conditions). The majority of service providers are expected to fall into Category 2.

Ofcom would also have extensive enforcement powers: the issue of technology, information, penalty and enforcement notices, with attendant sanctions for non-compliance; powers of investigation and interview; potentially massive financial penalties (fines of up to £18m or 10% of worldwide turnover); service restriction - that is, denying access to ancillary services such as advertising or card payments; and ultimately access restriction, by ISP blocking.

This would give rise to an enormous expansion of Ofcom's remit and consequently of its resources, to be funded by annual fees levied on service providers.

This is a hugely ambitious Bill. It will now be subject to a process of pre-legislative scrutiny before being submitted to Parliament before the end of the year. A long, tortuous and no doubt combative legislative process beckons. Service providers were primed to assume a duty of care to protect users, but the myriad attendant duties and responsibilities are bewildering and will ultimately prove fatal either to the Bill or, quite possibly, to the Internet as we know it.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.