Summary

As part of its Digital Single Market (DSM) strategy, the European Commission has emphasized the importance of fostering a "trusting, lawful and innovation-driven ecosystem" around online platforms in the EU. It has now proposed a common approach that operators of online platforms should take when tackling illegal content, and a Code of Practice on Disinformation.

But how practical are the EU's – largely self-regulatory – proposals for online platform operators, and how well do they fit with national laws in the EU? There are already concerns that the current German approach to online platforms goes too far towards restrictive regulation, potentially at the expense of fundamental rights such as freedom of expression.

Background

In May 2015, the European Commission launched its DSM strategy, consisting of three "pillars" and 16 "key actions". We have written a number of articles following the DSM's progress: on its inception, one year in, and in 2017 following a mid-term review.

A key focus of the DSM strategy is the better regulation of online platforms – more specifically, to allow online platform ecosystems to thrive, while at the same time ensuring that their users are treated fairly, and that the spread of illegal content through such platforms is limited. We have previously written about the Commission's approach to online platforms.

Online platforms have revolutionized not only how organizations process transactions but also how information is spread and consumed. Individuals can now share their opinions and comments with the public and play an active role in the publication of content. The EU acknowledges that online platforms have encouraged diversity and fostered democratic values. But online platforms also allow illegal content and false information to be produced, widely disseminated, and promoted.

While online platforms wield enormous power in today's economy, policymakers struggle to regulate providers whose platforms have become central to how content is produced and distributed. What should regulators do to keep pace with increasingly powerful platforms and to curb the spread of illegal content and disinformation on them?

The Commission's Self-Regulatory Approach to Illegal Content

In the Commission's view, the removal of illegal content, such as posts inciting hatred, violence, or terrorism, remains a concern that has not yet been adequately addressed. Voluntary arrangements, such as the EU Internet Forum on the removal of terrorist content online, have proven insufficient.

In March 2018, the Commission published a Recommendation proposing measures to ensure more effective handling of illegal content online. It builds on an earlier Communication about the responsibilities of online service providers with respect to illegal content online, which was adopted in September 2017. The Recommendation is intended to serve as non-binding legal guidance for online platforms. The Commission emphasizes that online platforms have particular societal responsibilities to help manage illegal content disseminated through the use of their services.

With this background, the Recommendation proposes a common approach to the detection, removal, and prevention of illegal content. Platform providers should:

  • Set out clear notification systems that allow users to flag illegal content, complemented by proactive tools to detect and remove such content;
  • Put in place appropriate safeguards, including human oversight and verification; and
  • Apply simple and transparent rules for assessing and removing illegal content, including fast-track procedures for "trusted flaggers".

The Commission further encourages closer cooperation between online platforms and law enforcement authorities. It encourages operators of online platforms to cooperate and share experiences among themselves and to develop best practices and technological solutions, such as automatic detection applications.

The German Regulatory Approach to Illegal Content

While the Commission believes that platforms will take their responsibility towards society seriously, Germany has already taken legislative action against illegal content online with the Network Enforcement Act, which took full effect in January 2018. Under the Act, providers of social networks are obliged to delete or block illegal content published on their platforms. Content is illegal if it violates specified provisions of the German Criminal Code, such as those prohibiting incitement to hatred, child pornography, or intentional defamation.

Where there is a manifest violation, the content must be deleted within 24 hours of the user's complaint. If it is unclear whether the posted content violates the Criminal Code, the platform operator must reach a decision within seven days of the complaint. In addition, the law requires social networks that receive more than 100 complaints per year to publish a transparency report every six months, setting out the number of complaints received, the notification mechanisms used, and the efforts undertaken to address illegal content. The first reports will be due by August 2018. Significant infringement of the main obligations, such as the duty to report, can result in penalties of up to €50 million.

Criticism of the German approach has been harsh. The law is difficult to implement because most content can only be understood in context, making it all but impossible to provide accurate general guidance. One of the main concerns is that the law incentivizes platforms to delete more content than necessary in order to ensure compliance (and to avoid the prospect of hefty fines) – a practice known as over-blocking. Over-blocking can adversely affect fundamental rights, such as freedom of expression and freedom of the press. The German law is therefore widely perceived as a stopgap. It has nonetheless triggered a broad debate about how to combat the spread of illegal content online in a way that can realistically be implemented.

A European Action Plan to Counter Online Disinformation

In April 2018, the Commission issued a further Communication on online disinformation. Disinformation is understood as verifiably false or misleading information that is created, presented, and disseminated for economic gain or to intentionally deceive the public, and may cause public harm. In order to protect European values, democratic systems, and policy-making processes, the Commission put forward an action plan and self-regulatory tools.

The Commission's proposal includes an EU-wide "Code of Practice on Disinformation", support for an independent network of fact checkers, and tools to stimulate quality journalism. The Code of Practice, which is currently under development, is designed to:

  • Improve access to trustworthy information;
  • Ensure transparency about sponsored content related to electoral and policy-making processes; and
  • Establish clear marking systems and rules for bots to ensure that their activities cannot be confused with human interactions.

The Commission holds online platforms responsible for content governance and therefore advocates self-regulation. The network of fact checkers will, however, act independently and work to achieve the broadest possible coverage of factual corrections across the EU. The fact checkers will be selected from the EU members of the International Fact Checking Network, a unit of the Poynter Institute, a publisher and nonprofit school for journalists.

It remains to be seen whether the Commission's self-regulatory proposals or the German legislative approach can adequately address the problem of illegal content and disinformation online. The difficulty and cost of implementing the EU recommendations may prove untenable for businesses. The steps taken so far suggest that neither the EU nor Germany has yet struck the right balance as to the extent and kind of regulation. What is clear is that the fragmented nature and sheer volume of existing regulation applicable to online platforms in the EU make it difficult for operators to understand their legal position and how, in practice, they should go about complying with the EU laws to which they are subject.


Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Morrison & Foerster LLP. All rights reserved.