The EU Digital Services Act ("DSA") is set to be signed by the Presidents of the European Parliament and Council on 19 October 2022. It will then be published in the Official Journal of the European Union, and will enter into force 20 days after publication. The bulk of the DSA's provisions will apply 15 months after it enters into force (i.e. in February 2024).
The DSA contains new rules to ensure greater accountability for how online intermediary service providers (who provide recipients with access to goods, services and content) moderate content, advertise, and use algorithmic processes. It gives practical effect to the principle that what is illegal offline should be illegal online. The greater the provider's size, the greater its responsibilities. In this article, we discuss what this ground-breaking new law means for such providers.
Why is the Digital Services Act so important?
The DSA (along with its sister legislation, the 'Digital Markets Act', which we previously discussed here) has been lauded as "historic" by the President of the European Commission, Ursula von der Leyen, and described as "a world first in the field of digital regulation". The importance of the DSA lies in the significant new obligations it heralds for online intermediary service providers, including social media platforms, online marketplaces, and app stores.
Who does the DSA apply to?
The DSA takes a layered approach to regulation. Whilst the most basic obligations under the DSA apply to all online intermediary service providers, additional obligations apply to providers in other categories, with the heaviest regulation applying to very large online platforms ("VLOPs") and very large online search engines ("VLOSEs"). To understand the full scope of their responsibilities and liabilities under the DSA, intermediary service providers will need to determine which category they belong to.
The four categories are:
(1) Intermediary service providers are online services which consist of a "mere conduit" service, a "caching" service, or a "hosting" service. Examples include online search engines, wireless local area networks, cloud infrastructure services, and content delivery networks.
(2) Hosting services are intermediary service providers who store information at the request of the service user. Examples include cloud services and services enabling the sharing of information and content online, including file storage and sharing.
(3) Online Platforms are hosting services which also disseminate the information they store to the public at the user's request. Examples include social media platforms, message boards, app stores, online forums, metaverse platforms, online marketplaces and travel and accommodation platforms.
(4) (a) VLOPs are online platforms with more than 45 million monthly active users in the EU (representing roughly 10% of the EU population).
(4) (b) VLOSEs are online search engines with more than 45 million monthly active users in the EU (representing roughly 10% of the EU population). A simple illustrative classification of these categories is sketched below.
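To illustrate this layered approach, the following is a minimal sketch in Python of how a provider might be slotted into the categories above. The 45 million user threshold comes from the DSA itself; the data model, field names and function name are assumptions made purely for illustration.

```python
# Illustrative sketch only: maps a provider's characteristics onto the DSA's
# layered categories. The 45 million threshold is from the DSA; the data
# model and field names are assumptions for illustration.
from dataclasses import dataclass

VLOP_VLOSE_THRESHOLD = 45_000_000  # roughly 10% of the EU population

@dataclass
class Provider:
    stores_user_content: bool      # a hosting service (category 2)?
    disseminates_to_public: bool   # an online platform (category 3)?
    is_search_engine: bool
    monthly_active_eu_users: int

def dsa_category(p: Provider) -> str:
    """Return the most demanding DSA tier the provider falls into."""
    if p.is_search_engine and p.monthly_active_eu_users > VLOP_VLOSE_THRESHOLD:
        return "VLOSE (category 4b)"
    if p.stores_user_content and p.disseminates_to_public:
        if p.monthly_active_eu_users > VLOP_VLOSE_THRESHOLD:
            return "VLOP (category 4a)"
        return "online platform (category 3)"
    if p.stores_user_content:
        return "hosting service (category 2)"
    return "intermediary service (category 1)"

# A social network with 50 million monthly active EU users would be a VLOP:
print(dsa_category(Provider(True, True, False, 50_000_000)))  # VLOP (category 4a)
```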
What is the territorial scope of the DSA?
The DSA has extra-territorial scope. It applies to the above categories of intermediary service providers who are established in the EU, and also to those providers established outside of the EU that offer services to users in the EU. When not established in the EU, intermediary service providers will have to appoint a legal representative in the EU, as many companies already do as part of their obligations under other legislation. Notably, the designated legal representative can be held liable for non-compliance with obligations under the DSA, without prejudice to the liability of the provider of the respective intermediary services.
What new obligations does the DSA introduce for intermediary service providers?
We have set out below some of the key obligations applicable to intermediary service providers under the DSA, depending on which category they fall within (see categories 1-4 above). The obligations are calibrated to each provider's role, size and impact in the online ecosystem.
Obligations applicable to all providers of intermediary services listed at Categories 1-4 above:
- Transparency Reporting: Intermediary service providers must publish annual transparency reports on their content moderation activities, including the measures they take to apply and enforce their terms and conditions.
- Terms and Conditions: Intermediary service providers must have clear terms and conditions for their content moderation practices. They must also provide easily accessible information on the right to terminate the use of their services.
- Official Orders: All intermediary service providers that receive an order to act against illegal content must inform the relevant supervisory authority of any follow up given to the order, specifying if and when they followed the order. The same obligation applies to orders to provide information.
- Points of Contact & Legal Representative: Intermediary service providers must designate a single electronic point of contact for official communication with supervisory authorities in the EU. As noted above, non-EU based providers must also appoint an EU legal representative.
Obligations applicable to all intermediary service providers listed at Categories 2-4 above:
- Notice & Action Mechanisms: Intermediary service providers must implement a notice and action mechanism for content that users consider illegal. Content targeting victims of cyber violence must be removed "immediately", and other content deemed illegal must be removed "swiftly".
- Statement of Reasons: Intermediary service providers must provide users with a statement of reasons whenever they remove or block access to a user's content for content moderation purposes. They must also provide such a statement when they restrict payments, or suspend or terminate the service or the user's account.
- Reporting Criminal Offences: If intermediary service providers suspect any serious criminal offences, they must notify national law enforcement or judicial authorities.
Obligations applicable to all intermediary service providers listed at Categories 3-4 above:
- Complaint & Redress Mechanism: Users will have new rights of redress, including a right to complain to the platform, seek out-of-court settlements, complain to their national authority in their own language, or seek compensation for any damage or loss suffered due to an infringement of the DSA. Representative organisations will also be able to defend user rights for large-scale breaches of the law.
- Trusted Flaggers: Platforms must cooperate with designated 'trusted flaggers' to identify and remove illegal content. Illegal content is defined as including any information that, in itself or by reference to an activity, is not in compliance with EU or Member State law. The recitals to the DSA provide some illustrative examples of illegal content, such as the sharing of images depicting child sexual abuse.
- Bans on Targeted Advertising to Children and Based on Special Category Data: Significant curbs on targeted advertising, including a ban on targeted advertising to children and on advertising based on special categories of personal data, such as ethnicity, political views, sexual orientation, religion, or genetic or biometric data (a simple illustrative eligibility check is sketched after this list).
- Advertising Transparency: There are also new obligations regarding advertising transparency, including a requirement to provide meaningful information on why a user was targeted with a particular advertisement.
- Recommender Systems: Intermediary service providers that use recommender systems must set out in their terms and conditions the main parameters that determine how they suggest or prioritise information for users, as well as any options for users of the service to modify or influence those main parameters.
- Protection of Minors: Intermediary service providers must put in place "appropriate and proportionate measures" to ensure a high level of privacy, safety, and security of minors on their service.
- Interface Design: A ban on so-called 'Dark Patterns'. Dark Patterns are designs used to manipulate users into choices they do not intend to make, by exploiting a cognitive bias. The ban extends to targeted advertising that nudges users to purchase certain goods, and to recommender systems (algorithms which determine the content presented to a user) that exploit human cognitive traits to present content more likely to keep a user on a platform for as long as possible.
- Traceability of Traders: If intermediary service providers enable consumers to conclude contracts with traders (e.g. online marketplaces), they must ensure traceability of those traders by collecting and assessing the veracity of basic trader information. If the platform becomes aware of an illegal product or service offered by a trader, it must inform affected consumers.
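At its core, the advertising ban described above reduces to a simple eligibility rule. The sketch below, a hypothetical illustration rather than a compliance implementation, encodes that rule; the special-category list mirrors the examples given above, and all names and structures are assumptions.

```python
# Hypothetical sketch of the DSA's targeted-advertising bans: no targeting
# of minors, and no targeting based on special categories of personal data.
# All names and structures are illustrative assumptions.

SPECIAL_CATEGORIES = {
    "ethnicity", "political_views", "sexual_orientation",
    "religion", "genetic_data", "biometric_data",
}

def targeting_permitted(is_minor: bool, targeting_criteria: set[str]) -> bool:
    """Return False if the proposed targeting would breach the DSA's bans."""
    if is_minor:
        return False  # no targeted advertising to children
    if targeting_criteria & SPECIAL_CATEGORIES:
        return False  # no targeting based on special category data
    return True

print(targeting_permitted(False, {"age_range", "interests"}))  # True
print(targeting_permitted(False, {"political_views"}))         # False
print(targeting_permitted(True, {"interests"}))                # False
```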
Additional Obligations for VLOPs and VLOSEs at Category 4 above
The DSA introduces another layer of obligations specific to VLOPs and VLOSEs due to their systemic impact in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas, including:
- Annual Risk Assessments: An obligation to carry out annual risk assessments and put in place risk mitigation measures regarding any systemic risks, including the dissemination of illegal content and negative effects for fundamental rights.
- Audits: VLOPs and VLOSEs will be subject to enhanced transparency obligations, including annual independent audits, at their own expense, to assess their compliance with their obligations under the DSA.
- Compliance function: Both VLOPs and VLOSEs must establish an independent compliance function within their organisations, which reports directly to the board and which is made up of suitably qualified professionals who are adequately trained. This is analogous to the concept of a 'Data Protection Officer' under the GDPR.
- Recommender systems: VLOPs must provide users with at least one option to choose a recommender system that is not based on profiling.
- Data access & scrutiny: VLOPs and VLOSEs must provide regulators with access to any data that is necessary for the purpose of assessing their compliance with the DSA. Upon request from the competent regulator, VLOPs and VLOSEs must also provide vetted researchers with access to certain data in order to understand how online risks evolve.
- Additional advertising transparency: VLOPs and VLOSEs must provide a repository where recipients can access information on online advertising displayed within the last year. Such information includes the content of the advertisement, its principal, the period during which it was displayed, and the target groups (a sketch of such a repository entry follows this list). These rules may pose a significant challenge to the protection of trade secrets.
- Crisis Response Cooperation: VLOPs and VLOSEs must implement a crisis response mechanism and follow directions given by the European Commission concerning specific actions on content during social and political emergencies, such as a pandemic or a war.
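As an illustration of the advertising repository obligation above, the sketch below models a single repository entry containing the information the DSA calls for (the advertisement's content, its principal, the display period and the target groups). The class name, field names and sample values are assumptions for illustration only.

```python
# Hypothetical model of one entry in a VLOP/VLOSE advertising repository.
# The fields mirror the information described above; names, types and the
# sample values are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class AdRepositoryEntry:
    content: str              # the advertisement as displayed
    principal: str            # on whose behalf the advertisement was shown
    displayed_from: date      # start of the display period
    displayed_until: date     # end of the display period
    target_groups: list[str]  # the main groups the advertisement targeted

entry = AdRepositoryEntry(
    content="Example banner for a hypothetical product",
    principal="Example Advertiser Ltd",
    displayed_from=date(2023, 3, 1),
    displayed_until=date(2023, 3, 31),
    target_groups=["adults 25-34", "interested in travel"],
)
```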
Will the DSA replace the eCommerce Directive?
The DSA will not replace the eCommerce Directive, which remains the cornerstone legal framework for all digital services. However, in order to provide greater harmonisation, the DSA incorporates the existing rules exempting online intermediaries, under certain conditions, from liability for the content they host, so that innovative services can continue to emerge and scale up in the single market.
Who will enforce the DSA?
For intermediary service providers, the supervisory authority will be the Digital Services Coordinator in the Member State in which the provider has its main establishment (or in respect of providers that do not have an establishment in the EU, but offer services in the EU, the Member State where their legal representative resides or is established). In Ireland, the Media Commission will be empowered to regulate intermediary service providers. It will have the power to impose penalties, including financial fines.
Each Member State will specify the penalties in its national laws, ensuring they are proportionate to the nature and gravity of the infringement, yet dissuasive enough to ensure compliance. The DSA specifies only that the maximum fine will be 6% of the annual worldwide turnover of the provider of intermediary services concerned in the preceding financial year. For the supply of incorrect, incomplete, or misleading information, failure to reply to or rectify such information, or failure to submit to an inspection, the maximum fine will be 1% of the annual income or worldwide turnover of the provider or person concerned in the preceding financial year. A short worked example of these caps follows.
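To make these caps concrete, the short sketch below applies the two percentages to a hypothetical provider; the 6% and 1% figures come from the DSA, while the turnover figure and function names are invented for illustration.

```python
# Worked example of the DSA's maximum fine caps. The 6% and 1% caps are from
# the DSA; the turnover figure is a hypothetical illustration.

def max_substantive_fine(worldwide_turnover: float) -> float:
    """Cap for infringements of DSA obligations: 6% of prior-year turnover."""
    return 0.06 * worldwide_turnover

def max_information_fine(worldwide_turnover: float) -> float:
    """Cap for incorrect/incomplete information or refusing an inspection: 1%."""
    return 0.01 * worldwide_turnover

turnover = 10_000_000_000  # hypothetical EUR 10bn annual worldwide turnover
print(f"Maximum fine: EUR {max_substantive_fine(turnover):,.0f}")               # 600,000,000
print(f"Maximum information fine: EUR {max_information_fine(turnover):,.0f}")   # 100,000,000
```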
A late addition to the DSA was to provide the European Commission with direct and exclusive enforcement jurisdiction over the obligations specific to VLOPs and VLOSEs, along with any 'systemic' issues concerning VLOPs or VLOSEs. This means the European Commission alone has authority to enforce these specific obligations on VLOPs and VLOSEs. The Commission will similarly have the power to impose fines of up to 6% of the annual worldwide turnover of VLOPs or VLOSEs.
The enforcement mechanism is not limited to fines: the Digital Services Coordinators and the European Commission will have the power to require immediate action where necessary to address very serious harms, and platforms may offer commitments on how they will remedy them.
In addition, individuals will have the right to seek compensation from providers of intermediary services, in respect of any damage or loss suffered due to an infringement by those providers of their obligations under the DSA.
Battles of the future?
It is unclear at present how a number of areas of the DSA will be enforced, and it appears likely that there will be some disputes and challenges concerning certain obligations imposed under the DSA. These may include:
- how to determine if a practice is a 'Dark Pattern' and how 'Dark Patterns' are to be distinguished from acceptable uses of an online platform's functionality to promote business interests;
- how to ensure machine learning algorithms stay within the parameters set by the DSA, for example by not inadvertently inferring special category data (such as someone's political opinion or health status) when presenting someone with an advertisement;
- how to ensure recommender system transparency does not allow bad actors to 'game' an online platform by using keywords they know are more likely to be promoted by the algorithm, thereby ensuring their content rises to the top of users' feeds;
- what information is sufficient to put an online platform on notice that a user is a child (and therefore prohibited from being profiled for targeted advertising); and
- how the DSA will interact with other laws which apply to the same subject-matter, such as the GDPR and online safety legislation.
When will the DSA come into force?
The DSA is a Regulation, and will be directly applicable across the EU 15 months after its entry into force. Under the anticipated timetable, businesses can expect the DSA to be published in the Official Journal of the European Union shortly after it is signed by the Presidents of the European Parliament and Council on 19 October 2022, to enter into force around mid-November 2022 (20 days after publication), and for the bulk of its provisions to take effect in mid-February 2024 (15 months later). However, the DSA may apply sooner for VLOPs and VLOSEs, as it will apply to them four months after their designation by the European Commission. Designation will take place on the basis of user numbers reported by these service providers, which they must provide within three months of the DSA's entry into force.
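The timetable above is simple date arithmetic, reproduced in the sketch below. The 20-day and 15-month rules come from the article; the publication date used is an assumption for illustration, as the actual dates depend on when the text appears in the Official Journal.

```python
# Sketch of the DSA timetable described above. The 20-day and 15-month rules
# are from the DSA; the publication date is an illustrative assumption.
from datetime import date, timedelta

publication = date(2022, 10, 27)                    # assumed publication date
entry_into_force = publication + timedelta(days=20)

# Bulk of provisions apply 15 months after entry into force.
months = entry_into_force.month - 1 + 15
bulk_applies = entry_into_force.replace(
    year=entry_into_force.year + months // 12, month=months % 12 + 1
)

# Providers must report user numbers within ~3 months of entry into force;
# VLOP/VLOSE obligations then apply 4 months after designation.
user_numbers_due = entry_into_force + timedelta(days=90)

print(entry_into_force)  # ~mid-November 2022
print(bulk_applies)      # ~mid-February 2024
print(user_numbers_due)  # ~mid-February 2023
```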
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.