On 22 June 2021, the Grand Chamber of the Court of Justice of the European Union (CJEU) issued its judgment on the liability of internet platforms for unauthorised uploads by their users in the joined cases of YouTube and Cyando (C-682/18 and C-683/18). In what will come as a relief for platforms such as YouTube, the CJEU found that internet platforms are not, in principle, liable for the unauthorised uploads of their users, unless they fail to take action to remove or block access to the infringing content.
The two sets of proceedings concerned the unauthorised uploading of copyright-protected works to YouTube, the well-known video-sharing platform, and to Uploaded, a file-hosting and file-sharing platform.
In the first case, music producer Frank Peterson brought proceedings against Google (owner of YouTube) regarding the uploading to YouTube of various sound recordings over which he claims to hold the copyright. The second case was brought by academic publisher Elsevier against Cyando (owner of Uploaded) in relation to the uploading of copyright-protected works to its platform.
The German Federal Court of Justice made a joint reference for a preliminary ruling to the CJEU. The reference primarily concerned the interpretation of the following provisions of EU law:
- The author's exclusive right of communication to the public under Article 3(1) of the InfoSoc Directive (2001/29/EC); and
- The intermediaries' exemption from liability for the content they store at the request of their users under Article 14(1) of the E-Commerce Directive (2000/31/EC).
Below, we analyse the CJEU's findings.
Communication to the Public
The CJEU, while recognising the indispensable role played by the platform operator, noted that illegal content was uploaded to the platforms by users acting autonomously with responsibility for their own actions. On both the YouTube and Uploaded platforms, it is the users who determine whether the content they upload is made available to other internet users.
As such, the users of the platforms carried out an act of communication to the public. The indispensable role of the platform operator was not decisive. A key consideration in deciding whether the intervention of the platform operator would also constitute an act of communication to the public would be to determine whether the operator acted "deliberately", with full knowledge of the consequences of its actions. The CJEU stated that the following actions of the operator are instructive in this regard:
- the operator does not put in place the appropriate technological measures that can be expected from a reasonably diligent operator to counter, credibly and effectively, copyright infringements on its platform, despite knowing, or where it ought to have known, of the presence of illegal content on that platform;
- the operator participates in selecting protected content illegally communicated to the public;
- the operator provides tools specifically intended for the illegal sharing of such content on its platform; and
- the operator knowingly promotes such sharing, for example, by adopting a financial model which encourages its users to illegally communicate protected content to the public via its platform.
The mere fact that a platform operator is aware "in a general sense" that protected content is made available illegally on its platform, or the fact that the operator has the aim of making a profit, will not in itself be sufficient. However, the CJEU confirmed that this assessment would be different if the platform fails to act expeditiously to remove or disable access to illegal content after receiving a notification from the rightsholder.
Safe Harbour Exemption
Under Article 14(1) of the E-Commerce Directive, an internet service provider is not liable for the information stored at the request of a recipient of the service, on condition that the provider does not have actual knowledge of illegal activity. Citing earlier case law, the CJEU confirmed that this safe harbour exemption is available only to a 'neutral' platform operator, that is to say, where its conduct is "merely technical, automatic and passive, which means that it has no knowledge of or control over the content it stores".
In considering the knowledge requirement, the CJEU (in a similar manner to its Article 3(1) analysis) considered that actual knowledge could not be inferred solely because the operator of a platform is aware in a general sense, or has "abstract knowledge", that its platform is used to share content which may infringe intellectual property rights. In this case, the CJEU concluded that the mere fact that a platform automatically indexes uploaded content, has a search function, and recommends videos based on users' profiles or preferences was not a sufficient ground to conclude that the platform operator has specific knowledge of illegal activities. According to the CJEU, such knowledge would be acquired where a provider is "aware of facts or circumstances from which the illegal activity or information is apparent".
The CJEU stated that this knowledge could be acquired when a provider uncovers the existence of illegal information, either as a result of its own investigations or through a notification from a rightsholder. On the issue of take-down notices from rightsholders, the CJEU appeared keen to avoid a situation whereby providers become the judges of legality. It stated that any such notice should contain sufficient information to enable the platform operator to satisfy itself, without a detailed legal examination, that the content is illegal and that removing the content is compatible with freedom of expression.
The New Copyright Directive
The CJEU makes clear that the interpretations provided in its ruling apply solely to the law as it was prior to the entry into force of Directive (EU) 2019/790 (New Copyright Directive), and "do not concern the set of rules established by Article 17" of the New Copyright Directive. It is of particular note that the CJEU, in a departure from the opinion of Advocate General (AG) Saugmandsgaard Øe, did not engage with Article 17 of the New Copyright Directive in any form.
It might be argued that this lack of engagement with Article 17 of the New Copyright Directive dilutes the precedential value of the judgment. However, Article 3(1) of the InfoSoc Directive will continue to apply to certain internet platforms, given that Article 17 applies only to "online content-sharing service providers". The New Copyright Directive defines this term as "provider[s] of information society services of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes".
We are likely to gain further insight into the CJEU's views on Article 17 in Republic of Poland v European Parliament and Council of the European Union (Case C-401/19), as Article 17(4) comes under challenge. The AG's opinion in this case is due on 15 July, and the CJEU judgment will follow sometime later.
In any event, given the lingering uncertainty about the present state of the law, it seems inevitable that further references will follow on the issue of internet platforms' liability under the Article 17 regime of the New Copyright Directive.
Co-authored by Darragh Larkin, Summer Student.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.