15 December 2022

The Digital Services Act – "A New Sheriff In Town"

Thierry Breton, EU Commissioner for Internal Market, announced the Digital Services Act ("DSA"), which is intended to harmonise regulations on the internet at EU level, with the pictorial comparison of "a new sheriff in town".

It has been 20 years since the EU first laid down a basic legal framework for the regulation of the internet – namely the eCommerce Directive of 2000. Since then, it has been the Member States who have taken the lead on making the internet a safer, or at least better, place – however, only at national level (e.g. in Germany with the Network Enforcement Act). As a result, regulation of the internet became very inconsistent across the EU. From a business perspective, this at times created the impression of a fragmented "European Internet".

Against this background, the EU Parliament and, on October 4, 2022, the EU Council approved the DSA. Contrary to what some hoped and others feared, the DSA is not a "constitutional law for platforms". Rather, it contains basic rules for so-called intermediary services providers.

The underlying idea of the DSA is: "What is illegal offline should be illegal online". At first glance, this sounds like an obvious truism, and in fact many of the DSA's provisions merely tighten or standardise rules that already exist in many Member States. However, many new obligations have been added, the real novelty being the possibility of imposing sanctions on companies, along with a system of fines modelled on the GDPR.

To whom the DSA applies

The DSA is aimed at "intermediary services providers" who offer their services in the EU. This includes, for example, internet providers, cloud services and content sharing platforms, but also social networks, app stores and online marketplaces.

The extent of regulation depends on the respective type of intermediary service. A distinction is made between the pure transmission of data ("mere conduit"), transmission with short-term intermediate storage ("caching") and "hosting", with the special case of online platforms. The strictest regulations apply to "very large online platforms" and "very large online search engines".

Areas of regulation of the DSA – Harmonising Limited Liability Exemptions and establishing Due Diligence Obligations

First, the DSA establishes a standardised legal framework for the conditional exemption from liability of intermediary services providers for the data or content they transmit. The exemption mainly depends on whether the intermediary services provider has knowledge of the illegality of the content.

Due Diligence Obligations applying to All Intermediary Services

The DSA furthermore lists obligations of the intermediary services providers, some of which are very detailed.

  • Providers of intermediary services must act against illegal content when ordered to do so by the relevant national judicial or administrative authority. However, there is neither a general duty of intermediary services providers to monitor all information, nor are they obliged to actively search for facts indicating illegal activities without prior indications.
  • Intermediary services providers are required to establish a single point of contact for direct communication with the authorities and the Commission, as well as a point of contact enabling the users of the service to communicate directly and rapidly with the services providers.
  • Additional information will need to be provided in the terms and conditions of the services providers. This is intended to ensure that the fundamental rights of users, such as freedom of expression and the freedom and pluralism of the media, are adequately reflected in the terms and conditions. The information shall cover restrictions on content, the policies, procedures and tools used for content moderation, and the internal procedures for handling complaints.
  • Where the intermediary service is primarily directed at or used by minors, the terms and conditions must explain the conditions for and restrictions on the use of the service in a way minors can understand. For providers of intermediary services of any kind, it is advisable to review their general terms and conditions at an early stage according to these standards.
  • There are now also transparency obligations (e.g. annual reports) for all services providers. The scope of this obligation varies depending on the type of intermediation service.

Specific Obligations for Hosting Services (including Online Platforms)

Providers of hosting services must implement notice and action mechanisms. These must include a report function for illegal content which is easily accessible for users. If restrictions are imposed on user content or behaviour, the services provider must give a clear and specific statement of reasons for the restrictions to any affected recipient of the service. If a hosting provider becomes aware of any information that gives rise to the suspicion of a criminal offence involving a threat to the life or safety of a person, the provider must inform the relevant authorities.

Special Category: Online Platforms

Online platforms, such as social networks or online marketplaces, are defined as providers of hosting services that not only store information provided by the recipients of the service but also disseminate such information to the public at the recipient's request. Such online platforms will have further obligations.

  • Online platforms must establish an internal complaint procedure and out-of-court dispute resolution.
  • They shall process notices about illegal content given by "trusted flaggers" without undue delay.
  • The DSA entails detailed provisions on how to deal with users that frequently provide manifestly illegal content.
  • New transparency obligations apply for advertising on online platforms. Generally, users need to be provided with information about the advertiser and the person who paid for the advertising.
  • Online platform providers using "recommender systems" must inform users about the main parameters they use for these recommender systems and what options users have to modify or influence these parameters. This should be detailed in the platform's terms and conditions.
  • Online platforms must not present advertising based on profiling that uses "sensitive" personal data, as defined in Art. 9 of the GDPR. This may apply even where the user has consented to the processing of their personal data. Online platforms must also not present personalised advertising based on profiling to minors where they are reasonably certain that the user is a minor. Online platform providers should therefore review functions such as "recommended for you" or similar, as such algorithm-based suggestions often process user profiles for the purposes of personalised advertising within the meaning of the DSA.
  • The DSA expressly prohibits the use of "dark patterns", i.e. user interfaces that interfere with users' free decision-making, for example by displaying consent and rejection options in different sizes.
  • Where a platform allows consumers to conclude distance contracts with traders, the platform must ensure traders are traceable and must therefore collect specific information about the trader's identity.

For very large online platforms, i.e. those with an average of at least 45 million EU users per month, even more comprehensive transparency obligations apply. They must give their users the possibility to refuse recommendations based on profiling. They must establish risk management systems and meet specific compliance requirements. And they must be publicly accountable for meeting these requirements and will be subject to annual independent audits. In a crisis (such as war), very large online platforms may be subject to further obligations. These requirements also apply to very large online search engines.

Enforcement and Sanctions

Non-compliance with the DSA can be punished with heavy fines of up to six percent of the group's annual turnover. The competent national authority of the Member State in whose territorial jurisdiction the intermediary services provider falls is responsible for enforcing the DSA and imposing the respective fines. In the case of very large online platforms and very large online search engines, this responsibility lies with the Commission.

Companies will not only be subject to obligations as providers of intermediary services; as users, they will also be better able to take action against illegal content or illegal products (for example, counterfeit products).

The new regulations will probably apply from February 2024 and even earlier for very large online platforms. It remains to be seen whether and how the "new sheriff" will ensure better control and security online.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
