Case C‑18/18, Eva Glawischnig-Piesczek v Facebook Ireland Limited
On 3 October 2019 the Court of Justice of the European Union ("CJEU") provided guidance on Social Media operators' obligations to monitor their networks for "hate speech".
1 Background of the case
In 2016 Eva Glawischnig-Piesczek, a (former) Austrian politician, was the target of an offensive post on Facebook. Upon her request, the Austrian courts obliged Facebook to remove the post. However, it was disputed whether a Social Media operator like Facebook is obliged to proactively remove "equivalent" posts, even where the operator has not received any indication that allegedly unlawful posts are being spread on its network. The Austrian Supreme Court referred this question to the CJEU in light of Art 15 of the E-Commerce Directive, which prevents Member States from imposing general monitoring duties on host providers (such as Social Media operators).
2 Outcome
The CJEU's ruling essentially strikes a balance between the right to privacy and the freedom of speech. The ruling and its limiting considerations therefore need to be read together.
The CJEU ruled as follows:
- Art 15 Para 1 of the E-Commerce Directive does not prevent national courts from obliging host providers to remove content that is identical to content previously declared unlawful.
- Such an obligation may apply irrespective of who uploaded or otherwise provided the content.
- If national law provides the respective grounds, such an obligation may apply to the Social Media operator on a worldwide basis.
No less important are the CJEU's limiting considerations:
- The obligation applies only where the "initial" content has been declared unlawful by a court.
- The content to be removed must contain the same elements as the content declared unlawful in the preceding court order.
- The provider must not be obliged to autonomously assess the lawfulness of the content to be removed.
- Rather, the content to be removed needs to be so similar to the content previously declared unlawful that the provider can rely on automated means to identify and remove it.
3 Comments
This judgment is sensitive and must be read with care. On the one hand, the CJEU has clarified that an obligation on a host provider to monitor content hosted on its network does not necessarily conflict with Art 15 of the E-Commerce Directive. On the other hand, the CJEU has set sharp limits on such an obligation, not least to protect the freedom of expression. In essence, any such proactive monitoring must be limited to what a court has previously (and precisely) defined as unlawful, and, perhaps most notably, the operator must not be required to autonomously assess the lawfulness of the content to be removed. In other words, it follows from the CJEU's ruling that content which deviates in its statements, phrasing or language need not be removed, as long as it does not clearly correspond to content declared unlawful by the courts.
This decision may not have as much of an impact on Social Media providers as it seems at first glance. It does not impose general monitoring obligations, but rather seeks to ensure that information is removed where equivalent information has already been declared unlawful by a court. This rationale is certainly reasonable, but it must not be overinterpreted as establishing general proactive monitoring duties for Social Media providers.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.