As the Committee Stage in the House of Lords begins for the Online Safety Bill (the "Bill") tomorrow, Ofcom has published its planned approach to risk assessments under the Bill.

Although the Bill is still in draft form, it is clear from the second reading in the House of Lords that certain service providers will have a duty to assess the risks posed by illegal content and content that is harmful to children. In anticipation of these duties, Ofcom published its planned approach to risk assessments on 14 March 2023. The announcement outlines the regulator's guidance for organisations expecting to implement new risk assessment procedures and briefly covers what service providers can expect next.

Current duties under the Bill: A recap

All regulated services – defined as user-to-user and search services which have links with the United Kingdom and which do not fall within the exemptions in Schedule 2 of the Bill (such as services offering only email or SMS functionality) – are required by the Bill in its present state to carry out 'suitable and sufficient' risk assessments for the following purposes:

  • to assess the level, nature and severity of risk of all users encountering priority and other kinds of illegal content; and
  • where the service is likely to be accessed by children, to assess the level, nature and severity of risk of children encountering primary priority, priority and non-designated content that is harmful to children.

The four-step process

Ofcom acknowledges that the scope of the Bill's regime covers a vast range of services, varying in size, content and other significant distinguishing characteristics. For this reason, the regulator has aimed to make its risk assessment guidance applicable to all types and sizes of services. Ofcom's proposed process for conducting risk assessments is as follows:

1. Establish the context

In-scope service providers should establish the risks of harm which they aim to assess. The guidance encourages service providers to consult Ofcom's risk profiles, which set out key risk factors to look out for, as well as gaps in understanding and evidence.

2. Assess the risks

In-scope service providers should review the evidence they collect about their platforms and the risks posed. This includes assessing the likelihood of harmful content appearing on the platform and evaluating the impact of potential harm. Service providers should also review their existing mitigating measures in respect of these risks.

3. Decide measures and implement

In-scope service providers should decide on the compliance measures they intend to implement and how they will put those measures into practice, and then record the outcomes of these decisions.

4. Report, review and update

In-scope service providers should report on the outcomes of risk assessments via relevant governance structures. They should also monitor the effectiveness of mitigation measures and perform regular reviews. Service providers should use this process to identify any triggers that would prompt a fresh assessment between regular review cycles.

Next steps and outlook

Whilst the UK's current video-sharing platform ("VSP") regime (which introduced new rules around protecting users from harmful content in November 2020) does not require VSPs to conduct risk assessments, Ofcom strongly recommends that these platforms put in place risk assessment and management processes similar to those required of regulated services under the Bill. The regulator also suggests that organisations proactively maintain compliance with internationally recognised standards for best practice, including ISO 31000 and the Three Lines Model. These standards embody good risk awareness and prioritisation, which should be present across organisations' teams globally.

Ofcom will provide further detail on its guidance once the Bill has received Royal Assent. Once the guidance is finalised and published, relevant services are expected to have a period of three months in which to complete their first illegal content risk assessment.

The Committee Stage of the Bill is due to start on 19 April 2023 and will involve a detailed line-by-line examination of the Bill. The Lords are likely to focus on the key themes raised at the second reading in February 2023; please refer to our blog post on the second reading here for further information.

The EU Digital Services Act ("DSA"), which came into force in November 2022, also envisages a form of risk assessment being conducted by a subset of in-scope service providers – the larger or higher-risk online platforms and search engines (so-called "Very Large Online Platforms" (VLOPs) and "Very Large Online Search Engines" (VLOSEs)). VLOPs and VLOSEs are required to diligently identify, analyse and assess the systemic risks associated with the design or functioning of their services, and to put in place reasonable, proportionate and effective measures to mitigate the systemic risks identified. The risk assessment conducted should be specific to their services and proportionate to the systemic risks. Categories of risk to consider include, for example, those affecting fundamental rights, civic discourse or electoral processes, public health, minors, physical and mental well-being, and the dissemination of illegal content. For further information on the EU Digital Services Act, please refer to our blog post here.

For those in scope of both the Bill and the DSA, it is also worth considering how any compliance programme (including risk assessments) can be carried out most efficiently – given the differences and potential overlap between the two regimes as well as the disparity in timing between them. Operationally, the most practical solution may require organisations to apply the higher of the two standards for consistency across both jurisdictions.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.