"The DSA marks the end of the era of large online platforms that behave as if they are too big to care," writes EU Commissioner Thierry Breton in response to the opening of proceedings by the EU Commission against X (formerly Twitter). The social media service X has been suspected several times of not taking sufficient action against fake news and hate speech on the platform. Most recently, there were investigations into the dissemination of illegal content in connection with attacks by Hamas against Israel. The first formal proceedings under the Digital Service Act (DSA) have now been opened against X for allegedly sharing "illegal content" and breaching transparency obligations under the DSA. The DSA, a new and far-reaching EU regulation on platform regulation, provides for stricter obligations for online platforms to take action against illegal and harmful content on the internet, which can result in fines of up to six per cent of the platform operator's global annual turnover in the event of violations. It goes without saying that large tech companies can be fined billions of euros. It is now being investigated whether X has violated the DSA's regulations on content moderation, risk management, advertising transparency and data access for scientists.

What is the DSA? - An overview

The DSA came into force on 16 November 2022 and applies throughout the EU. Online services have until 17 February 2024 to adapt to the new rules and complete the technical implementation. However, the rules for particularly large services such as X - so-called "Very Large Online Platforms" (VLOPs) - already apply. As a law on digital services, the DSA is an ambitious part of the EU's digital strategy, alongside the Digital Markets Act, the AI Act, the Data Act and, in particular, Art. 17 of the Copyright Directive. It aims at a secure internet free of illegal content, while promoting innovation and the competitiveness of the EU in the digital economy. At its core, the DSA is intended to ensure the transparency and accountability of digital service providers, regulate the influence of "big tech" and enforce the obligations that apply to them. Familiar obligations from the E-Commerce Directive are supplemented and partially replaced by the DSA, and the German NetzDG (Network Enforcement Act) will become largely obsolete.

The DSA is intended to regulate online content and establish extended due diligence obligations for platforms. In particular, it targets illegal content, i.e. content that does not comply with EU law or the law of a member state, which also includes hate speech. However, determining when a statement actually qualifies as hate speech (for example, because it violates personality rights) and when it is still protected by freedom of expression raises complex questions of judgement and is one of the main problems of practical implementation. Critics are particularly concerned about the negative impact on European fundamental rights, such as freedom of expression, which could suffer from the complex bureaucratic structure of the EU package.

Among other things, the DSA stipulates that providers of hosting services must introduce notice-and-action procedures and inform affected users of the reason for the removal of their (allegedly) illegal content. Hosting services include social media platforms and online marketplaces (eBay, Amazon, etc.). Whenever content is removed or blocked in the course of moderation, precise information about the facts and reasons (so-called "statements of reasons") must be provided to the affected users, together with information about the available complaint options. This applies to moderation based on notices as well as moderation based on the platforms' terms of use.
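To make this more concrete, here is a minimal sketch in Python of the kind of information such a statement of reasons has to carry. The class and field names are hypothetical illustrations of the categories the DSA describes (facts, legal or contractual grounds, degree of automation, redress options), not the official schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StatementOfReasons:
    """Hypothetical record of a moderation decision; field names are illustrative."""
    content_id: str                # identifier of the removed or restricted item
    decision: str                  # e.g. "removal", "visibility_restriction"
    facts: str                     # description of the facts of the case
    legal_ground: str | None       # provision violated, if the decision is law-based
    tos_ground: str | None         # clause of the terms of use, if ToS-based
    automated_detection: bool      # was the content flagged automatically?
    automated_decision: bool       # was the decision itself taken automatically?
    redress_options: list[str] = field(default_factory=lambda: [
        "internal complaint system",
        "out-of-court dispute settlement",
        "judicial redress",
    ])
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Example: a notice-based removal of allegedly illegal hate speech.
sor = StatementOfReasons(
    content_id="post-12345",
    decision="removal",
    facts="Post reported via the notice-and-action mechanism.",
    legal_ground="national provision on incitement to hatred",
    tos_ground=None,
    automated_detection=True,
    automated_decision=False,
)
print(sor)
```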

To increase transparency, all digital service providers must also describe their content moderation in their general terms and conditions and clearly label advertising. In doing so, they must provide information on the process and the guidelines governing the restriction of content, in particular the extent of algorithmic and human involvement in the decision-making process. Large services, for example, must report the number of notices received and the type of measures taken (e.g. whether the decision was automated). This is intended to give greater weight to users' artistic freedom and freedom of expression. The DSA is also intended to curb "dark patterns" and "nudging", i.e. misleading platform designs intended to impair the free and informed decisions of users. It is this practice, among others, that X is now accused of in the current proceedings.
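For illustration, the following Python sketch shows how a platform might aggregate its moderation log into the figures such a transparency report discloses, i.e. counts per type of measure and the share of automated decisions. The log entries and their keys are invented for the example.

```python
from collections import Counter

# Invented moderation log entries; keys are illustrative only.
decisions = [
    {"measure": "removal", "automated": True},
    {"measure": "removal", "automated": True},
    {"measure": "visibility_restriction", "automated": False},
    {"measure": "account_suspension", "automated": True},
]

# Figures a DSA-style transparency report would disclose:
# how many measures of each type were taken, and how many were automated.
by_measure = Counter(d["measure"] for d in decisions)
automated_share = sum(d["automated"] for d in decisions) / len(decisions)

print(dict(by_measure))                       # counts per type of measure
print(f"automated decisions: {automated_share:.0%}")
```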

Who is affected by the DSA?

The DSA is aimed at all digital intermediary services targeting the EU that provide users with online access to goods, services and content. There are, however, exceptions for smaller companies (fewer than 50 employees and less than 10 million euros in annual turnover). The obligations imposed on online services follow a tiered system and depend on the size and importance of the company. Starting with rules aimed at all digital intermediary services, the DSA addresses pure access providers, caching services, hosting services, social media platforms and online marketplaces, through to very large platforms and search engines (the "big tech" companies), which face the strictest rules.

What will become of the familiar liability privileges?

The DSA upholds the liability privileges of the E-Commerce Directive for neutral, purely technical intermediary services. Hosting services (in particular online content-sharing providers that transmit user-generated third-party content) are still required to remove illegal content once they become aware of it ("notice and take down") and remain free of any general obligation to proactively monitor content. Depending on the regulatory tier, however, the monitoring duties placed on providers become significantly more stringent. As with the previously enacted Copyright Directive, the EU is making platforms more accountable for illegal third-party content and pushing content moderation further into the proactive realm of pre-filtering.

The DSA Transparency Database

In September 2023, the EU Commission launched one of the most far-reaching transparency tools of the DSA: the so-called "Transparency Database". The new database collects the "statements of reasons" for the deletion and restriction of content and is the first of its kind to make data on content moderation by online platforms available to the public on this scale. It is intended to contribute to the accountability of big tech. So far, only very large online platforms (VLOPs) are required to submit data to the database; from 17 February 2024, all providers of online platforms, with the exception of micro and small enterprises, will have to submit data on their content moderation. In the database, users can view summarised statistics, search for specific statements of reasons and download the data. Just a few days after the database went live, over 10 million statements of reasons had been reported, meaning over 10 million removals or restrictions had taken place, whether due to illegality or a breach of the respective terms of use. This shows the sheer scale at which online content is removed. The database aims to increase transparency for users, but also to strengthen researchers' access to content moderation data in order to better understand misinformation and develop solutions. Since the takeover by Elon Musk, however, researchers' access to possible misinformation on X has become considerably more difficult.
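As a rough illustration of how researchers might work with such data, the following Python sketch loads a downloaded extract of statements of reasons and computes two summary statistics: the grounds platforms invoke and the share of automated decisions per platform. The file name and column names are assumptions and may differ from the database's actual export format.

```python
import pandas as pd

# Hypothetical extract downloaded from the Transparency Database's
# public interface; column names are assumptions, not the real schema.
df = pd.read_csv("statements_of_reasons_sample.csv")

# Which grounds do platforms invoke: illegality or their own terms of use?
print(df["decision_ground"].value_counts())

# Share of fully automated decisions, broken down by platform.
print(
    df.groupby("platform_name")["automated_decision"]
    .mean()
    .sort_values(ascending=False)
)
```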

What can online platforms expect?

Even though large services like X are particularly affected, the DSA applies to practically all digital companies. Companies operating in the EU should therefore examine the extent to which they are covered by the rules and prepare both technically and organisationally for implementation. Depending on a company's classification, the introduction of certain processes should be considered as part of its compliance. The specific adjustments, however, vary greatly due to the DSA's tiered system and must be tailored to the individual case. The extent to which an online shop is subject to the DSA and which obligations it must observe cannot be answered in general terms. This was also illustrated by a list published by the Commission in April 2023, which designated 17 very large online platforms, including Zalando, some of which then brought actions before the European Court of Justice against their classification as a very large online platform. Online shops in particular should therefore inform themselves by 17 February 2024 and implement the obligations that apply to them.

This includes implementing content moderation processes, setting up the necessary reporting mechanisms and adapting the general terms and conditions. Above all, hosting services must create the technical infrastructure to transmit their statements of reasons to the Commission's database. The Commission provides source code for this purpose so that the statements can be sent to the database automatically.
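What such an automated submission could look like is sketched below in Python. The endpoint URL, token and payload fields are placeholders, not the Commission's actual API specification, which is published together with the database.

```python
import requests

# Placeholders: the actual endpoint, authentication and payload schema
# are defined in the Commission's API documentation for the database.
API_URL = "https://example.invalid/api/v1/statements"  # hypothetical endpoint
API_TOKEN = "<token issued to the platform>"

statement = {
    "decision": "removal",
    "facts": "Post reported via notice; classified as illegal hate speech.",
    "ground": "illegal_content",       # as opposed to "terms_of_use"
    "automated_detection": True,
    "automated_decision": False,
}

# Submit the statement of reasons to the (hypothetical) database endpoint.
resp = requests.post(
    API_URL,
    json=statement,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
print("Submitted, database record:", resp.json())
```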

UK's Online Safety Act

With objectives similar to those of the DSA, the United Kingdom has enacted the Online Safety Act to combat illegal content online, enforced by the regulator Ofcom. There, too, the focus is on online service providers and their review obligations. Because the two regimes differ, however, providers of online services targeting users in both the EU and the UK must check which requirements of each apply to them.

The proceedings against X

As a designated very large online platform (VLOP), X is subject to the strictest requirements of the DSA. X is now accused of violating the DSA's moderation rules. Researchers at a German university have already found that X moderates significantly less content than other large platforms.

The Transparency Database also shows that platforms, as expected, mostly use automated filters for moderation; the sheer volume of notices can evidently only be handled automatically. It remains to be seen, however, how the resulting risks to artistic freedom and freedom of expression, which algorithms (so far) often cannot assess sufficiently reliably, can be resolved. The proceedings against X are particularly interesting with regard to the question of whether, and to what extent, the EU's package of measures is effective against fake news and illegal content and in favour of greater transparency and the protection of freedom of expression and information. X counters the accusation that it is not taking sufficient action against illegal content by relying on the "wisdom of crowds" instead of deletions: X's "Community Notes" enable community-based fact-checking to curb false information. The function allows users to attach context, in effect a fact check, beneath a post, image or video. The EU Commission will now examine whether this form of fact-checking is sufficient under the DSA as the primary antidote to misinformation. How the Commission will deal with this type of content moderation is still unclear. The Commission has already announced further proceedings under the DSA against other social media platforms operated by Meta and TikTok.

Outlook

Platforms therefore face a number of challenges. In addition to the issue of platform liability for copyright infringements committed by their users (Art. 17 Copyright Directive), operators must observe and implement an increasingly complex set of rules as the EU regulations gradually take effect. Yet the conditions under which content is illegal or harmful, how the technical implementation can do justice to the huge volumes of data without relying entirely on automated filters, and how freedom of expression can be safeguarded at the same time have so far remained vague and ultimately in the hands of the platforms. Platforms continue to be guided not only by the law but also by their own terms of use, meaning that the limits of freedom of expression in the digital space are still primarily set by the platforms themselves. Initial analyses of the Transparency Database have shown that some platforms remove content on a large scale while others, such as X, remove significantly less. The platforms follow their own guidelines and use different tools for moderation. Content is therefore also assessed, independently of the question of illegality, on the basis of the applicable terms of use, and is sometimes deleted even if it is lawful.

It will soon become clear whether the DSA and the new database actually make a tangible contribution to safeguarding freedom of expression on the internet and meaningfully curb the influence of big tech. Critics are concerned that the threat of high fines will encourage excessive removal and so-called chilling effects (a deterrent impact on fundamental rights such as artistic freedom and freedom of expression). It also remains to be seen how much the new Transparency Database will affect moderation by tech companies and freedom of expression online. In any event, the DSA, together with the other measures of the EU's digital strategy, will have a noticeable impact on all online service providers and the entire digital industry.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.