The detection and removal of illegal content from online platforms is an urgent challenge for today's digital society. So far, however, there is no harmonised and coherent approach across the European Union. On 28 September 2017, the European Commission ("Commission") published a communication titled "Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms" ("Communication"). The Commission calls for a more aligned approach, arguing that it would make the fight against illegal content more effective and would also benefit the development of the Digital Single Market. The Commission stresses that online platforms carry a significant societal responsibility and should therefore decisively step up their actions to address this problem.

Scope of the Communication

The Communication does not as such change the existing legal framework. Rather, it lays down a set of non-binding guidelines and principles for online platforms to step up the fight against illegal content online in cooperation with national authorities, Member States, and other relevant stakeholders: "It aims to facilitate and intensify the implementation of good practices for preventing, detecting, removing and disabling access to illegal content so as to ensure the effective removal of illegal content, increased transparency and the protection of fundamental rights online. It also aims to provide clarifications to platforms on their liability when they take proactive steps to detect, remove or disable access to illegal content (the so-called "Good Samaritan" actions)."

The Communication does not only target the detection and removal of illegal content; it also addresses issues arising from the removal of legal content ("Over-Removal"), which may impact freedom of expression and media pluralism. The Commission therefore calls for adequate safeguards to properly prevent Over-Removal.

Hosting defence

The Commission expressly acknowledges the "hosting defence" under Article 14 of the existing E-Commerce Directive 2000/31/EC ("E-Commerce Directive"), according to which hosting service providers cannot be held liable for the information stored at the request of third parties, provided that:

  1. they do not have actual knowledge of the illegal activity or information and, as regards claims for damages, are not aware of facts or circumstances from which the illegal activity or information is apparent; or
  2. upon obtaining such knowledge or awareness, they act expeditiously to remove or disable access to the information.

The Commission also refers to Article 15 of the E-Commerce Directive, which prohibits Member States from imposing on providers of services covered by Article 14 a general obligation to monitor the information which they transmit or store, or a general obligation actively to seek facts or circumstances indicating illegal activity. In the Commission's view, this prohibition does not automatically cover monitoring obligations in a specific case. In addition, voluntary measures remain permitted within the limits of the applicable rules of EU and national law, in particular those on the protection of privacy and personal data.

The Commission emphasizes that the E-Commerce Directive should "constitute the appropriate basis for the development of rapid and reliable procedures for removing and disabling access to illegal information". Within this legal framework, the Commission intends to maintain a balanced and predictable liability regime for online platforms as a key regulatory basis supporting digital innovation across the Digital Single Market.

Proposed measures

In the Communication, the Commission addresses the following proposed measures:

  • Detecting and notifying illegal content. In particular, online platforms should:

    • Cooperate closely with law enforcement and other competent authorities. This should include the appointment of effective points of contact in the EU and the development of technical interfaces that allow platforms and law enforcement authorities to cooperate more effectively across the entire content governance cycle.
    • Facilitate a privileged channel for so-called "trusted flaggers", i.e., specialized entities with specific expertise in identifying illegal content and dedicated structures for detecting and identifying such content online. Platforms should be able to fast-track notices from trusted flaggers.
    • Establish an easily accessible and user-friendly mechanism that allows their users to notify content considered to be illegal.
    • Put in place effective mechanisms to facilitate the submission of notices that are sufficiently precise and adequately substantiated. This generally requires (a) an explanation of the reason and (b) an indication of the location (e.g., URL). Where possible, notices should be able to be submitted anonymously; however, flaggers who wish to receive follow-up messages from the platform should be able to submit their contact details (a simple illustrative model of such a notice follows this list).
    • Adopt effective pro-active measures to detect and remove illegal content online, in particular by using automatic detection and filtering technologies. In the Commission's view, such pro-active measures need not deprive the online platform of the hosting exemption under Article 14 of the E-Commerce Directive, as long as the platform does "not play an active role of such a kind as to give it knowledge of, or control over, that information". This interpretation is said to be in line with the balance between the different interests at stake, which is a key rationale of the E-Commerce Directive.
  • Removing illegal content. In accordance with Article 14 of the E-Commerce Directive, platforms should remove illegal content as fast as possible. The Commission intends to further analyse the question of fixed time frames for removal in the future. At the same time, the removal of such content should not impede the prosecution of, or other follow-up to, any underlying breach of law, and robust safeguards should be available to limit the risk of removal of legal content. In particular, online platforms should:

    • Ensure the expeditious removal of illegal content and the reporting of crime to law enforcement authorities. The Commission takes the view that fully automated deletion or suspension of content may be particularly effective. Removals based on trusted flagger notices should be handled more quickly.
    • Report to law enforcement authorities whenever they are made aware of or encounter evidence of criminal or other offences. However, in doing so, they should comply with applicable legal requirements, especially the lawful grounds for processing personal data under the upcoming General Data Protection Regulation.
    • Provide a clear, easily understandable, and sufficiently detailed explanation of their content policy in their terms of service.
    • Publish transparency reports with sufficiently detailed information on the number and type of notices received and actions taken.
    • Implement safeguards against Over-Removal and abuse of the system. In particular, those who provided the content should be given the opportunity to contest the decision via a counter-notice. If it turns out that the notified activity or content is not illegal, the platform provider should restore the removed content without undue delay, or allow the user to re-upload it, without prejudice to the platform's terms of service.
  • Preventing the re-appearance of illegal content. In particular, online platforms should:

    • Take measures which dissuade users from repeatedly uploading illegal content of the same nature, and aim to effectively disrupt the dissemination of such illegal content.
    • Use and develop automatic re-upload filters. The use of such technology should be made transparent in the platform's terms of service (a minimal sketch of such a filter also follows this list).
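
For illustration only: the Communication describes what a sufficiently precise and substantiated notice should contain (an explanation of the reason, an indication of the location such as a URL, and optional contact details for follow-up), but it prescribes no technical format. The following minimal sketch in Python models those elements under our own assumptions; all names are hypothetical and not drawn from the Communication.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Notice:
        """Hypothetical model of a notice, following the Communication's criteria."""
        reason: str                    # (a) explanation of why the content is considered illegal
        location: str                  # (b) where the content is found, e.g. a URL
        contact: Optional[str] = None  # optional: flaggers may remain anonymous
        trusted_flagger: bool = False  # trusted-flagger notices may be fast-tracked

        def is_substantiated(self) -> bool:
            # Actionable only if both a reason and a location are supplied.
            return bool(self.reason.strip()) and bool(self.location.strip())

    # Usage: an anonymous but substantiated notice
    notice = Notice(reason="copyright infringement", location="https://example.com/item/123")
    assert notice.is_substantiated() and notice.contact is None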
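
Similarly, the re-upload filters referred to above are commonly built by comparing a fingerprint (hash) of newly uploaded content against a database of content previously removed as illegal. The sketch below illustrates that general technique; it is not drawn from the Communication, and the use of an exact SHA-256 hash is our own simplifying assumption (production systems typically rely on perceptual hashes that survive re-encoding).

    import hashlib

    class ReuploadFilter:
        """Minimal hash-based re-upload filter (illustrative only)."""

        def __init__(self) -> None:
            self.blocked_hashes: set[str] = set()  # fingerprints of removed content

        @staticmethod
        def fingerprint(content: bytes) -> str:
            # Assumption: exact SHA-256 as a stand-in fingerprint.
            return hashlib.sha256(content).hexdigest()

        def register_removed(self, content: bytes) -> None:
            # Record content that was removed as illegal.
            self.blocked_hashes.add(self.fingerprint(content))

        def allows_upload(self, content: bytes) -> bool:
            # Block uploads that match previously removed content.
            return self.fingerprint(content) not in self.blocked_hashes

    # Usage: a removed item cannot be re-uploaded unchanged
    f = ReuploadFilter()
    f.register_removed(b"removed-content-bytes")
    assert not f.allows_upload(b"removed-content-bytes")
    assert f.allows_upload(b"other-content-bytes")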

What's next?

The Commission expects online platforms to take swift action over the coming months, in particular in the area of terrorism and illegal hate speech. The Commission has announced that it will continue its exchanges and dialogues with online platforms and other relevant stakeholders. In particular, it plans to monitor progress and assess whether additional measures, including future legislative measures, will be needed. The Commission expects this work to be completed by May 2018.

Stakeholders have already started to raise concerns about the Commission's approach. For example, the German Association of the Internet Industry eco argues that the existing legal framework under the E-Commerce Directive already provides a sufficient legal basis for addressing the detection and removal of illegal content. The eco press release dated 28 September 2017 can be found here (in German).

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.