All eyes are on Europe in the race to regulate the digital ecosystem. The Digital Services Act ("DSA") was signed into law on 19 October 2022 by the President of the European Parliament and the President of the Council, clearing the final hurdle on its legislative journey. It is expected to come into force in the coming weeks. The DSA reshapes the accountability of digital service providers, marking one of the European Union's most significant updates to its legal framework for digital regulation since the adoption of the e-Commerce Directive in 2000.
This comes as, last week, the senior coroner in the UK's Molly Russell inquest issued online safety recommendations focused on children's access to social media content. The recommendations urge the government to review how internet platforms are provided to children and to consider separate sites for children and adults. The DSA's enactment also follows indications that its UK counterpart, the Online Safety Bill ("OSB"), may be subject to scope changes to strengthen protections for children and for freedom of speech. The OSB has been met with delays since the summer and, with a further cabinet re-shuffle likely now that Rishi Sunak is at the helm as prime minister, we will be closely monitoring the trajectory of the Bill under any new Secretary of State for the Department for Digital, Culture, Media and Sport.
Speedread – key takeaways:
- The DSA is an ambitious package seeking to combat online illegal content and regulate the online environment in the EU. It is backed by fines of up to 6% of the worldwide annual income or turnover of an infringing service provider.
- The package aims to harmonise the current fragmented patchwork of national measures covering digital regulation across the EU.
- The DSA is broad in scope – applying to a range of key players across the digital ecosystem, with in-scope service providers categorised as one of (i) intermediary services, (ii) hosting services, (iii) online platforms bringing together sellers and consumers, or (iv) very large online platforms / very large online search engines ("VLOPs" / "VLOSEs").
- The legislation adopts a risk-based, asymmetric approach to the obligations, with VLOPs and VLOSEs subject to enhanced regulation. This includes assessing the systemic risks associated with their services and putting in place measures to mitigate those risks.
- It has extra-territorial effect, and therefore applies not only to EU-based providers, but also those based outside the EU that offer services within the EU.
- The DSA confirms there is still no general obligation on intermediary service providers to monitor the information they transmit or store, nor actively to seek facts indicating illegal activity. It also preserves the so-called "liability shield" introduced under the e-Commerce Directive and extends it to intermediary service providers who decide to voluntarily implement measures to detect and remove illegal content.
- Other key areas covered include:
- a user-friendly notice and action mechanism for hosting service providers, allowing third parties to notify the platform of illegal content on their services, together with an internal complaint handling procedure managed by online platforms as one of the redress options;
- requirements for greater transparency around advertising, algorithmic decision making and, in particular, recommender systems. This aligns with a growing consumer protection "trend" of transparency under other EU legislation, such as the GDPR and the proposed AI Act;
- a prohibition on use of "dark patterns";
- provisions for online platform providers that allow consumers to conclude distance contracts with traders, including requirements to help identify and track down sellers of illegal goods;
- a crisis response mechanism to address "extraordinary circumstances affecting public security or public health"; and
- a new pan-EU compliance and enforcement regime involving new Digital Services Coordinators at Member State level, a new European Board for Digital Services, and enhanced supervision and enforcement by the Commission for VLOPs or VLOSEs.
- The DSA is likely to enter into force around November 2022. Whilst the majority of the provisions will apply 15 months later (from early 2024), certain transparency reporting requirements will apply much sooner, with an expected deadline of February 2023 (to help determine whether online platforms qualify as VLOPs or VLOSEs).
- The provisions relating to VLOPs or VLOSEs will also apply within four months of being notified as such by the Commission (if earlier than the application of the DSA as a whole). Given the additional obligations placed on VLOPs and VLOSEs and the potential for an earlier application date, the task at hand is a sizable one.
- In due course the Commission and European Board for Digital Services are expected to facilitate voluntary standards in a range of different areas and voluntary codes of conduct at EU level to enable the "proper application of the Regulation" taking into account the challenges of addressing different types of illegal content and systemic risks. These voluntary standards and codes of conduct will be welcomed by in-scope service providers to assist with compliance and interpretation of the legislation.
- The UK's proposed OSB is currently drafted to regulate a wider scope of content than the DSA, regulating "legal but harmful" content as well as illegal content. Now that the DSA has been finalised, it will be interesting to see whether any elements of the OSB are altered to align more closely with the EU's approach.
- For those in scope of both the UK and EU legislative packages, it is also worth considering how any compliance programme can be carried out most efficiently – given the differences and potential overlap between the two regimes as well as the disparity in timing between them. Operationally, the most practical solution may require organisations to apply the higher of the two standards for consistency across both jurisdictions.
Background
The DSA was originally announced by Ursula von der Leyen in her political guidelines in July 2019, and forms part of a legislative package for regulating the online environment in the EU and beyond. It is an element of the European Digital Strategy "Shaping Europe's digital future", and was subject to public consultation from June to September 2020. You can find out more about the public consultation in our post here.
Until now, the European Commission has sought to regulate digital services through a series of incremental and complementary legislative initiatives, aimed at reinforcing and building on some of the key principles of the e-Commerce Directive adopted in 2000 (for example, the liability rules for providers of intermediary services). The DSA aims to create a safer online environment, define clear responsibilities for digital service providers and deal with current digital challenges. The Council has described the DSA as a "world first in the field of digital regulation", and the Czech minister for industry and trade has touted it as having the potential to become the "gold standard for other regulators in the world".
In October 2020 the European Parliament overwhelmingly approved two associated legislative initiative reports, recommending to the European Commission that the DSA should also contain tougher regulation on targeted advertising and the management of illegal content, as well as powers for a new regulatory authority to oversee content hosting platforms. You can read more about those recommendations at our post here and further background is available here.
A consistent aspect of this legislative package since its inception has been the focus on those major technology companies and online platforms who are perceived to be the "gatekeepers" of the online world and the desire to introduce new "ex-ante" rules to foster competition and ensure that these companies treat their B2B customers fairly. Possibly in recognition of the fact that these competition issues are slightly separate from the more consumer-facing issues which the DSA seeks to address, the European Commission created a separate dedicated piece of legislation known as the Digital Markets Act ("DMA") to address these concerns and form the second pillar of the new package. For further information regarding the DMA please refer to our blog post here.
Ministerial policy negotiations on aspects of the DSA took place throughout 2021, culminating in the Member States' unanimous agreement on the Council's 'general approach'. The DSA was then approved by the European Parliament in April 2022 and, earlier this month, received final approval from the Council. It was signed into law on 19 October 2022, and the precise timing of its entry into force will depend on when it is published in the EU Official Journal.
An Implementing Regulation on the DSA is expected in the fourth quarter of 2022, following a consultation process. It will lay down rules on the practical procedural arrangements under Article 83 of the DSA, namely the exercise of delegated acts conferred on the Commission, hearings for VLOPs or VLOSEs relating to a non-compliance decision, fines or penalties, and the disclosure of certain information at those hearings.
The headlines: a deeper dive
The DSA is complex and wide-ranging. We highlight below some of the key headlines from the legislation, focussing on those that have attracted the most scrutiny from stakeholders:
- Broad scope and tiered obligations
The DSA is broad in reach and is intended to apply to a range of key players across the digital ecosystem. The DSA breaks these digital service providers down into four categories, each of which is a narrower subsection of the category before: (i) online intermediaries; (ii) hosting services (such as cloud and web hosting services); (iii) online platforms (bringing together sellers and consumers and disseminating information to the public at their request, such as online marketplaces, app stores, collaborative economy platforms and social media platforms); and (iv) very large online platforms and very large online search engines (VLOPs and VLOSEs). Some of the new measures apply to all four of these categories, and others apply only to some of them, with the narrowest, highest tier (VLOPs and VLOSEs) being subject to the most stringent obligations under the DSA (see below).
The DSA includes some exemptions, for example in relation to transparency reporting, for micro and small enterprises, defined as those with fewer than 50 employees and less than €10 million in annual turnover (but only if their reach and impact do not meet the criteria to qualify as a very large online platform (see below)). These exemptions are designed to ensure start-ups are not unduly burdened and are able to emerge, scale up and offer effective competition to the larger platforms.
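For providers working out where they sit in this hierarchy, the tiering can be pictured as a simple classification exercise. The TypeScript sketch below is purely illustrative, assuming a hypothetical `Provider` shape and using the thresholds described in this post (fewer than 50 employees and under €10 million turnover for the exemption; the 45 million user threshold for VLOPs/VLOSEs is discussed in the next section). In practice, VLOP/VLOSE status is formally designated by the Commission rather than self-assessed.

```typescript
// Illustrative model of the DSA's tiered scope; not a prescribed schema.
type Tier =
  | "intermediary"    // base layer: all in-scope intermediary services
  | "hosting"         // hosting services (e.g. cloud and web hosting)
  | "online_platform" // platforms disseminating information to the public
  | "vlop_or_vlose";  // very large online platforms / search engines

interface Provider {
  employees: number;
  annualTurnoverEur: number;
  avgMonthlyActiveEuUsers: number;
  hostsContent: boolean;
  disseminatesToPublic: boolean;
}

const VLOP_USER_THRESHOLD = 45_000_000; // roughly 10% of the EU population

// Micro/small exemption from certain obligations (e.g. transparency
// reporting): fewer than 50 employees and under EUR 10m annual turnover,
// unless the service's reach would qualify it as a VLOP/VLOSE.
function isExemptMicroOrSmall(p: Provider): boolean {
  const smallEnough = p.employees < 50 && p.annualTurnoverEur < 10_000_000;
  return smallEnough && p.avgMonthlyActiveEuUsers < VLOP_USER_THRESHOLD;
}

// Note: in reality VLOP/VLOSE status is designated by the Commission;
// this check merely flags providers likely to meet the threshold.
function classify(p: Provider): Tier {
  if (p.avgMonthlyActiveEuUsers >= VLOP_USER_THRESHOLD) return "vlop_or_vlose";
  if (p.hostsContent && p.disseminatesToPublic) return "online_platform";
  if (p.hostsContent) return "hosting";
  return "intermediary";
}
```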
- Very large platforms: more stringent regulation
Because larger platforms have a greater reach, and therefore a greater influence over how users obtain information and communicate online, the obligations imposed by the DSA have been designed asymmetrically. This means VLOPs and VLOSEs are subject to enhanced controls and responsibilities. VLOPs and VLOSEs are currently defined as those with more than 45 million monthly active users (roughly 10% of the EU's population) and are designated as such by the Commission.
VLOPs and VLOSEs will have to diligently identify, analyse and assess the systemic risks associated with the design or functioning of their service and put in place reasonable, proportionate and effective measures to mitigate the systemic risks identified. The risk assessment conducted should be specific to their services and proportionate to the systemic risks. Categories of risk to consider include, for example, those which have an effect on fundamental rights, civic discourse or electoral processes, public health, minors, physical and mental well-being and dissemination of illegal content. This list appears to be wide enough to include both illegal and harmful but legal content (see "Interaction with the UK Online Safety Bill" below).
This is no insignificant undertaking: VLOPs and VLOSEs are required to conduct a risk assessment at least once a year and, in any event, before deploying functionalities that are likely to have a critical impact on the risks identified in the DSA. VLOPs and VLOSEs would be well advised to consider adapting any relevant design or function of their services accordingly (for example, enhancing content moderation systems, adapting terms and conditions, or adapting algorithmic or advertising systems).
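To make the cadence of this duty concrete, the sketch below models a hypothetical risk assessment record and a due-date check in TypeScript. The category names and the `assessmentDue` helper are illustrative inventions; the DSA mandates the assessment and its timing, not any data structure.

```typescript
// Hypothetical record of a systemic risk assessment cycle; the DSA
// mandates the assessment and its timing, not this structure.
type RiskCategory =
  | "dissemination_of_illegal_content"
  | "fundamental_rights"
  | "civic_discourse_and_electoral_processes"
  | "public_health_and_minors"
  | "physical_and_mental_wellbeing";

interface RiskAssessment {
  completedOn: Date;
  risksIdentified: RiskCategory[];
  mitigations: string[]; // e.g. "enhanced content moderation", "adapted T&Cs"
}

const ONE_YEAR_MS = 365 * 24 * 60 * 60 * 1000;

// Due at least once a year and, in any event, before deploying a
// functionality likely to have a critical impact on identified risks.
function assessmentDue(
  last: RiskAssessment | undefined,
  deployingCriticalFunctionality: boolean,
  now: Date = new Date()
): boolean {
  if (!last || deployingCriticalFunctionality) return true;
  return now.getTime() - last.completedOn.getTime() >= ONE_YEAR_MS;
}
```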
Among other additional transparency requirements, VLOPs and VLOSEs are also obliged to establish a compliance function (independent from their operational functions and composed of one or more compliance officers) to ensure compliance with their obligations under the DSA, to submit themselves to independent compliance audits, and to provide the Digital Services Coordinator of their establishment with access to data in certain circumstances.
- Illegal content: notice and action mechanism
To achieve the objective of ensuring a "safe, predictable and trusted online environment" and to further combat illegal content online, all providers of hosting services (including online platforms) will be required to put in place user-friendly notice and action mechanisms, allowing third parties to notify them of illegal content on their services. The host will be required to implement a mechanism facilitating the submission of sufficiently detailed notices, which pinpoint the exact electronic location of the information (for example, by providing a URL).
The host will be required to make a decision on any notice it receives in a "timely, diligent, non-arbitrary and objective manner", and to notify the individual or entity that submitted the notice of that decision and of the redress options available. The individual or entity that originally provided the content must also be notified in the same manner.
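Although the DSA specifies what a notice mechanism must capture rather than how to build it, a minimal sketch helps illustrate the moving parts. The `IllegalContentNotice` and `NoticeDecision` shapes below are hypothetical TypeScript constructs, loosely reflecting the elements described above (an explanation, the exact electronic location, a reasoned decision, and notification of both parties).

```typescript
// Hypothetical shape of a notice submitted via a notice and action
// mechanism; the DSA describes the substance, not a wire format.
interface IllegalContentNotice {
  explanation: string;            // why the notifier considers the content illegal
  exactLocations: string[];       // e.g. the precise URL(s) of the content
  notifierName?: string;          // identity of the notifier, where required
  notifierEmail?: string;
  goodFaithConfirmation: boolean; // confirmation the notice is accurate and complete
}

interface NoticeDecision {
  noticeId: string;
  actionTaken: "removed" | "access_disabled" | "no_action";
  reasons: string;                // the decision must be non-arbitrary and reasoned
  redressOptions: string[];       // e.g. internal complaints, out-of-court settlement
}

// Both the notifier and the party that originally provided the content
// must be informed of the decision and the redress available.
function notifyParties(
  decision: NoticeDecision,
  notifierEmail: string | undefined,
  uploaderEmail: string
): void {
  const recipients = notifierEmail ? [notifierEmail, uploaderEmail] : [uploaderEmail];
  for (const to of recipients) {
    console.log(`To ${to}: ${decision.actionTaken}; reasons: ${decision.reasons}`);
  }
}
```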
Hosting service providers that are also online platforms must offer an internal complaint handling procedure as one of the redress options following a takedown decision. The DSA also envisages a regime of new out-of-court settlement bodies to deal with disputes relating to the removal of illegal content that have not been resolved through the internal complaint handling process. This is unsurprising given that, early in the legislative process, the European Parliament stated that the DSA should not oblige content hosting platforms to employ any form of fully automated ex-ante controls of content, and that the final decision regarding the legality of content must be made by an independent judiciary and not a private commercial entity.
Whilst the DSA confirms there is still no general obligation on intermediary service providers to monitor the information they transmit or store, nor actively to seek facts indicating illegal activity, the additional requirements will represent a considerable change for many online platforms, for whom compliance will be a more transparent and resource-intensive exercise going forward (see "Reinforcing the liability shield" below).
- Reinforcing the liability shield
The DSA retains the so-called liability shield originally introduced under the e-Commerce Directive. This is the regime that effectively provides a defence against liability for illegal content in respect of online intermediaries who are merely providing a "conduit" for information, are carrying out routine "caching" of information, or are "hosting" information in circumstances where they have no knowledge of the illegal content.
The DSA appears to strengthen the liability shield with the introduction of a so-called "Good Samaritan" clause. This provision means that intermediary service providers who decide, in good faith and in a diligent manner, to voluntarily implement measures to detect and remove illegal content will not be prevented from availing themselves of the liability shield. This offers providers a degree of comfort that voluntary monitoring mechanisms will not cost them the shield's protection. The Commission clearly hopes to incentivise more voluntary activity in this area; however, it remains to be seen how the rule will apply in practice.
- Strengthening online advertising transparency
The DSA seeks to achieve greater transparency around online advertising. This is a "softer" approach when compared with one of the more controversial initial recommendations from the European Parliament to phase out targeted advertising (eventually leading to a general prohibition within the EU).
All online platforms presenting advertisements online will be required to ensure that individuals using their services can identify "in a clear, concise and unambiguous manner and in real time" (illustrated in the sketch after this list):
- that the information is an advertisement (including through prominent markings);
- the natural or legal person on whose behalf the advertisement is presented (and the person who paid for it, where this differs); and
- meaningful information about the parameters used to determine to whom the advertisement is presented (which should be directly and easily accessible from the advertisement itself) and, where applicable, how to change those parameters.
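These disclosure duties are functional rather than technical, but it can help to picture them as metadata carried with each ad impression. The TypeScript sketch below is purely illustrative, assuming a hypothetical `AdDisclosure` record and `renderAdLabel` helper; the DSA prescribes the substance of the disclosures, not any particular format or API.

```typescript
// Hypothetical metadata carried with each ad impression so that the
// three disclosures can be rendered in real time; illustrative only.
interface AdDisclosure {
  isAdvertisement: true;         // surfaced via a prominent marking
  onBehalfOf: string;            // natural or legal person behind the ad
  paidBy?: string;               // the funder, where different from onBehalfOf
  targetingParameters: string[]; // main parameters used to select the recipient
  parameterControlsUrl?: string; // where recipients can change them, if applicable
}

function renderAdLabel(d: AdDisclosure): string {
  const funder =
    d.paidBy && d.paidBy !== d.onBehalfOf ? `, paid for by ${d.paidBy}` : "";
  return `Ad on behalf of ${d.onBehalfOf}${funder} (targeted on: ${d.targetingParameters.join(", ")})`;
}
```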
In reality, these requirements are likely to overlap with the growing "transparency" trend arising from other legal requirements, for example those under the GDPR (where personal data is involved) and the proposed AI Act (for example, where data is processed using AI).
VLOPs and VLOSEs will be subject to additional advertising and transparency obligations, including a requirement to compile and make publicly available, in a specific section of their online interface and through a searchable and reliable tool, a repository containing historic information on the content of advertisements, their targeting, the persons on whose behalf they are presented (and who paid for them, where this differs), and the total number of recipients reached. This information must be made available for the entire period during which an advertisement is presented, and for one year after it is presented for the last time on the platform. Whilst some of the major platforms have already implemented advertisement repositories in anticipation of this requirement, it will invariably cause headaches for major social media platforms, as well as other platforms.
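The retention element, at least, reduces to simple date arithmetic. A minimal sketch, assuming a hypothetical `mustRetain` helper:

```typescript
// Illustrative retention check for a repository entry: available while
// the ad runs and for one year after it is last presented.
const ONE_YEAR = 365 * 24 * 60 * 60 * 1000;

function mustRetain(lastPresented: Date, stillRunning: boolean, now: Date = new Date()): boolean {
  return stillRunning || now.getTime() - lastPresented.getTime() < ONE_YEAR;
}
```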
These obligations run in parallel with those set out in the DMA requiring designated "gatekeepers" to provide advertisers and publishers to whom they supply online advertising services with information about the price paid and remuneration received, as well as how these were calculated.
- Transparent recommender systems and algorithmic decision making
Recommender systems rely on previous choices a user has made (such as watching a particular film) to predict what other content the user might like, and in many cases this can be a relatively innocuous way of targeting content. However, they can also have a significant impact on the ability of recipients to retrieve and interact with information online, and play an important role in the "amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour".
These forms of automated decision-making and targeting have therefore been on the radar for regulators for some time. The DSA requires online platforms that use recommender systems to set out in their terms and conditions the main parameters used in the systems, as well as any options for the recipients to modify or influence those parameters that they may have made available (for VLOPs and VLOSEs, this should include at least one option that is not based on profiling of the recipient). The information should be set out in a clear, accessible and easily comprehensible manner, and options to modify parameters should include easily accessible functionality on their online interface.
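One way to picture this requirement is as a published catalogue of recommender options, each declaring its main parameters and whether it relies on profiling. The TypeScript below is an illustrative sketch only; the option names and the `RecommenderOption` shape are hypothetical.

```typescript
// Illustrative catalogue of recommender options a platform might publish;
// names and shape are hypothetical.
interface RecommenderOption {
  id: string;
  label: string;            // surfaced in the T&Cs and the interface
  mainParameters: string[]; // e.g. "watch history", "follows", "recency"
  usesProfiling: boolean;
}

const options: RecommenderOption[] = [
  {
    id: "personalised",
    label: "For you",
    mainParameters: ["watch history", "engagement"],
    usesProfiling: true,
  },
  {
    id: "chronological",
    label: "Latest",
    mainParameters: ["recency"],
    usesProfiling: false, // the non-profiling option required of VLOPs/VLOSEs
  },
];

// A check a VLOP/VLOSE might run over its published options.
console.assert(
  options.some((o) => !o.usesProfiling),
  "at least one option must not be based on profiling"
);
```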
More generally, providers of intermediary services are also required to include information in their terms and conditions on any restrictions that they impose on the use of their service by its recipients. This includes information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. Intermediary service providers must act in an objective and proportionate manner in applying these restrictions, including with regard to the fundamental rights and legitimate interests of all those involved.
- Ban on dark patterns
To prevent consumers from being steered into unwanted behaviours, the DSA bans online platform providers from utilising 'dark patterns' on their interfaces. Dark patterns are defined as "practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions", and essentially encompass any process or design which unreasonably biases the decision-making of consumers. The DSA gives numerous examples of dark patterns, which include:
- implementing exploitative design choices that direct users to actions that "benefit the provider of online platforms, but which may not be in the recipients' interests";
- presenting options in a biased manner;
- making the process of cancelling a service more difficult than the process of signing up; and
- making default settings cumbersome to change.
- Measures to protect minors
The DSA includes provisions dealing specifically with minors. For example, providers of online platforms accessible to minors are required to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors on their service. The Commission, after consulting the European Board for Digital Services, may issue guidelines to assist providers with this task.
Providers must also not present advertisements on their interface based on profiling using a recipient's personal data where they are aware with reasonable certainty that the recipient is a minor.
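In implementation terms, this amounts to a gate on the ad selection pipeline. A minimal sketch, assuming hypothetical age signals; the DSA does not prescribe how "reasonable certainty" is established:

```typescript
// Illustrative gate on ad selection, assuming hypothetical age signals.
interface Recipient {
  declaredAge?: number;
  estimatedAge?: number; // e.g. from an age assurance mechanism
}

function mayUseProfilingAds(r: Recipient): boolean {
  const reasonablyCertainMinor =
    (r.declaredAge !== undefined && r.declaredAge < 18) ||
    (r.estimatedAge !== undefined && r.estimatedAge < 18);
  return !reasonablyCertainMinor; // fall back to non-profiling (e.g. contextual) ads
}
```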
- Traceability of business users and illegal goods
The DSA includes various provisions that apply to online platform providers that allow consumers to conclude distance contracts with traders. These include new obligations on the traceability of business users in online marketplaces, to help identify and track down sellers of illegal goods (particularly relevant for sites such as Amazon). Such platforms will be required to vet the credentials of third party suppliers ("know your business customer") that conclude distance contracts with consumers through their platform, and traders will be required to provide certain essential information to the online platform, including a self-certification that they only offer goods or services which comply with EU law.
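The "know your business customer" duty can be pictured as an onboarding checklist. The `TraderProfile` shape below is a hypothetical TypeScript illustration of the kind of essential information described above, not a prescribed format:

```typescript
// Hypothetical "know your business customer" record; the DSA lists the
// substance of the information, not this format.
interface TraderProfile {
  name: string;
  address: string;
  email: string;
  phone: string;
  tradeRegisterNumber?: string;  // registration details, where applicable
  paymentAccountDetails: string;
  selfCertifiesEuLawCompliance: boolean; // offers only EU-law-compliant goods/services
}

// Sketch of a completeness check before the trader may conclude
// distance contracts with consumers via the platform.
function mayOnboard(t: TraderProfile): boolean {
  return Boolean(
    t.name &&
      t.address &&
      t.email &&
      t.phone &&
      t.paymentAccountDetails &&
      t.selfCertifiesEuLawCompliance
  );
}
```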
- Crisis response mechanism
In response to the situation in Ukraine and its impact on the manipulation of online information, the DSA introduces a crisis response mechanism to address "extraordinary circumstances affecting public security or public health". The Commission aims to ensure that crisis protocols include one or more of the following elements:
- displaying information on the crisis situation provided by Member States' authorities or at Union level;
- requiring online services providers to designate a specific point of contact for crisis management; and/or
- adapting the resources dedicated to compliance with the obligations of the DSA to the needs arising from the crisis.
The crisis response mechanism will enable the analysis of the impact of VLOPs and VLOSEs on the crisis in question and facilitate a decision on proportionate and effective measures to ensure that fundamental rights are upheld.
- Enforcement and fines: the sting in the tail
EU Member States will each be required to appoint a 'Digital Services Coordinator', responsible for supervising the intermediary services established in their Member State (including verifying the EU user numbers that platforms must publish at least once every six months) and for coordinating with specialist sectoral authorities. The Member State in which a service provider has its main establishment will have the power to enforce the DSA, and the Commission will also have powers to supervise and enforce the higher tier obligations placed on VLOPs and VLOSEs. In addition, a new body, the European Board for Digital Services, will be created at EU level to coordinate compliance and enforcement and to act as an advisory board. It will comprise the Digital Services Coordinators from each Member State and, unlike the GDPR's equivalent European Data Protection Board, will be chaired by the Commission.

Failure to comply with the DSA can result in GDPR-style fines of up to 6% of the worldwide annual income or turnover of the provider or platform, with the size of the fine linked to the severity of the breach as well as the duration and frequency of the violation. The Member States or the Commission may also impose fines of up to 1% of annual income or turnover for providing incorrect, incomplete or misleading information in response to a request for information, or for failure to submit to an inspection.
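The arithmetic behind these ceilings is straightforward but worth making concrete. A worked illustration in TypeScript, using a hypothetical turnover figure:

```typescript
// Worked illustration of the fine ceilings; the turnover figure is hypothetical.
const annualWorldwideTurnoverEur = 10_000_000_000; // e.g. a EUR 10bn provider

const maxComplianceFine = 0.06 * annualWorldwideTurnoverEur;  // EUR 600m ceiling
const maxInformationFine = 0.01 * annualWorldwideTurnoverEur; // EUR 100m ceiling

console.log(`Max fine for non-compliance: EUR ${maxComplianceFine.toLocaleString()}`);
console.log(`Max fine for information failures: EUR ${maxInformationFine.toLocaleString()}`);
```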
Before adopting a decision of non-compliance or imposing a fine or penalty against a VLOP or VLOSE, the Commission must inform the provider of its preliminary findings. The provider has a right to be heard through an appeals process and the CJEU also has unlimited jurisdiction to review a decision by the Commission to impose a fine or penalty, which it may cancel, reduce or increase.
Recipients of a service regulated under the DSA also have a right to lodge a complaint against an intermediary service provider with the Digital Services Coordinator of the Member State where the recipient of the service is located or established, alleging non-compliance with the DSA. The recipient is entitled to seek compensation from the intermediary service provider for damage or loss suffered due to the provider's non-compliance, in accordance with national or EU law.
When will the DSA start to apply? A staggered start
The DSA was signed by both the President of the European Parliament and the President of the Council on 19 October. It will shortly be published in the Official Journal of the European Union and will come into force 20 days after publication (likely in November 2022). The DSA takes the form of a Regulation and will therefore be directly applicable in all Member States without the need for implementation at national level.
The majority of the provisions in the DSA will apply from early 2024 (the later of 15 months after entry into force or 1 January 2024), with certain requirements applying earlier. Online service providers will have three months after the DSA enters into force to publish the average number of monthly active recipients of their service in the EU in a publicly available section of their online interface (expected to be by February 2023), which will help determine whether providers qualify as VLOPs or VLOSEs. The provisions relating to VLOPs and VLOSEs will apply within four months of the VLOP or VLOSE being notified of its designation by the Commission, if earlier than the application of the DSA as a whole.
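For planning purposes, the staggered deadlines reduce to date arithmetic from the publication date. An illustrative TypeScript sketch, assuming (pending the Official Journal) a late-October 2022 publication:

```typescript
// Illustrative timeline arithmetic; the publication date is an assumption.
const assumedPublication = new Date("2022-10-27");

const entryIntoForce = new Date(assumedPublication);
entryIntoForce.setDate(entryIntoForce.getDate() + 20); // in force 20 days after publication

const userNumbersDeadline = new Date(entryIntoForce);
userNumbersDeadline.setMonth(userNumbersDeadline.getMonth() + 3); // publish EU user numbers

const fifteenMonthsOn = new Date(entryIntoForce);
fifteenMonthsOn.setMonth(fifteenMonthsOn.getMonth() + 15);

// General application: the later of 15 months after entry into force or 1 January 2024.
const generalApplication =
  fifteenMonthsOn > new Date("2024-01-01") ? fifteenMonthsOn : new Date("2024-01-01");

console.log({ entryIntoForce, userNumbersDeadline, generalApplication });
```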
The likely impact for UK online service providers
- Extra-territorial effect
Given the global nature of digital services, the DSA also applies to service providers that are established outside the EU but offer services within the EU (as well as to EU-based providers), and therefore has extra-territorial effect. As the UK no longer forms part of the EU, this will include UK-based providers offering services in the EU. The DSA stipulates that non-EU-based providers must designate a legal representative within the EU to ensure effective oversight and compliance with the new rules, as well as to facilitate cooperation with the relevant authorities. For example, a non-EU-based provider may appoint its EU-based subsidiary or parent company as its legal representative. Importantly, these representatives may be held liable for any non-compliance with obligations under the DSA.
- Interaction with the UK Online Safety Bill
In parallel, the UK's proposed OSB is intended to regulate online activity to enhance safety, whilst preserving free expression and democratic debate. You can read more about the draft OSB in our post here. In July 2022 the OSB was due to reach the Report Stage and Third Reading in the House of Commons, but its progress was delayed following Boris Johnson's resignation, see our post here.
Whilst the DSA is intended to regulate illegal content, the proposed OSB originally set out to regulate "legal but harmful" content as well. In both cases, there are likely to be grey areas that test the limits of the legislation and cause difficulties for service providers trying to interpret their obligations; for example, some content which may not technically be illegal may nevertheless be considered extremely harmful or dangerous, and could be caught by the notice and action mechanism under the DSA.
The scope of "legal but harmful" content has been the cause of much controversy and debate in the UK, particularly in the context of freedom of speech. In her speech at the Conservative Party Conference earlier this month, the current Culture Secretary Michelle Donelan, indicated that the OSB will return to the House of Commons (possibly as early as 1 November 2022) and has signalled that changes to the Bill are likely to focus on unpicking restrictions for adults but not children. It is currently unclear whether this timetable will remain the case following the recent appointment of Rishi Sunak as prime minister. Now that the DSA has been finalised, it will be interesting to see whether the OSB is amended to align more closely with the EU's approach. Either way, the OSB is likely to continue to be a high priority for the government, particularly in the wake of the recent landmark conclusion in the Molly Russell inquest that social media had contributed to Molly's death.
The extent to which the UK will either want or be able to coordinate its own enforcement of digital regulation with the new Digital Services Coordinators across the EU and how the enforcement regime will apply in respect of UK entities remains to be seen.
How can organisations prepare?
From a practical perspective, those intermediary service providers in scope of the DSA, whether in the EU/EEA or in other territories such as the UK, should determine which tier of obligations may apply to them, keeping an eye on any guidance issued on this categorisation in due course.
Organisations within scope should also begin to review documents (such as terms and conditions), interfaces, internal processes (such as content moderation, algorithmic decision-making systems, recommender systems, take down mechanisms and complaints procedures) and governance functions, to identify any compliance gaps and put in place plans to remediate any gaps identified. This will require time and investment, both in terms of technical and human resources.
Given the additional obligations placed on VLOPs and VLOSEs and the fact that these obligations will kick in earlier than other parts of the DSA, the task at hand will be a sizable one. Those organisations that expect to be designated as such will need to swiftly plan to ensure any remediation steps are implemented in sufficient time.
The Commission and the European Board for Digital Services will support and promote the development and implementation of voluntary standards set by European and international standardisation bodies in a range of areas, such as templates, design and process standards for communicating with recipients of a service in a user-friendly manner, and technical measures to enable compliance with the transparent advertising obligations (for example, prominent marking of advertisements and commercial communications). The DSA also requires the Commission and the Board to facilitate voluntary codes of conduct at EU level to contribute to the "proper application of the Regulation", taking into account the challenges of addressing different types of illegal content and systemic risks. These voluntary standards and codes of conduct will be welcomed by in-scope service providers as aids to compliance and interpretation of the legislation.
For those in scope of both the UK and EU legislative packages, it is also worth considering how your compliance programme can be carried out most efficiently – given the differences and potential overlap between the two regimes as well as the disparity in timing between them. Operationally, the most practical solution may require organisations to apply the higher of the two standards for consistency across both jurisdictions.
Where to next for the Commission?
Whilst the DSA and DMA are two key pillars of the European Digital Strategy, there are a number of other supporting initiatives in the pipeline including, for example, the proposed AI Act, AI Liability Directive, Data Act and European Media Freedom Act (which is likely to build on the DSA). Last month, President Ursula von der Leyen also published a letter of intent indicating that, following the implementation of the DSA, the EU would continue to look at new digital opportunities and trends such as the metaverse. The letter lists the metaverse and the regulation of virtual worlds as a key new initiative for 2023.