To help organisations stay on top of the main developments in European digital compliance, Morrison & Foerster's European Digital Regulatory Compliance team reports on some of the main topical developments in this area that have taken place in the second quarter of 2021.

  1. Dissemination of Terrorist Content Online
  2. German Court Decision Further Limits T&Cs Amendments
  3. UK Online Safety Bill: New Duties of Care for Online Service Providers
  4. Germany Declares Operation of Illegal Online Marketplaces to Be Criminal Offences
  5. New Rules and Uncertainty Under the German IT Security Act 2.0
  6. UK's Product Security and Telecommunications Infrastructure Bill
  7. eBay's New Portal: Power to the Regulator
  8. German Autonomous Driving Act Is Only the Next Step to Self-Driving Cars for Everyone
  9. Algorithms: Potential Harms
  10. Settling Smart Contracts: Digital Dispute Resolution Rules
  11. Looking Ahead
  12. Updates to Topics Covered in Q1, 2021

1. Dissemination of Terrorist Content Online

On 28 April 2021, the European Parliament formally adopted the Regulation on preventing the dissemination of terrorist content online. The Regulation targets content that incites, solicits or contributes to terrorist offences or solicits people to participate in a terrorist group. Most notably, the Regulation will require online platforms to remove terrorist content within one hour of having received a removal order from the relevant national authorities. Online platforms covered by this Regulation will include providers of social media, video, image and audio-sharing services, as well as file-sharing services and other cloud services, insofar as those services are used to make the stored information available to the public at the direct request of the content provider. These online platforms will not have a general obligation to monitor or filter content, but they will have to undertake specific measures to address the misuse of their services where national authorities have established that they have been exposed to terrorist content. We note that content uploaded for educational, journalistic, artistic or research purposes will not be considered terrorist content under the Regulation. While the rules on penalties will be set by individual Member States, the Regulation provides that companies should face fines of up to 4% of their global turnover for persistent non-compliance.

What's Next?

The requirements of the Regulation will come into effect on 7 June 2022. Online platforms should use the time until then to consider how they will robustly address this type of misuse by users, including preparing (and testing the capability of) their internal systems to respond quickly to exposure and to remove terrorist content from their servers within the one-hour timeframe. In doing so, they should also review how they moderate content more generally, both to ensure compliance with existing deletion obligations, such as the national hate speech rules under the German "NetzDG" law, and to look ahead to what may need to change when the EU's pending Digital Services Act (DSA) is adopted.

2. German Court Decision Further Limits T&Cs Amendments

Organisations that use consumer-facing terms and conditions (T&Cs) in Europe may have to think again about how they go about updating and changing those T&Cs.

The German Federal Court of Justice has found a common method of amending consumer-facing T&Cs in Germany to be unfair (and, therefore, illegal and unenforceable).

Germany's strict rules for T&Cs already make illegal most clauses that would allow businesses to make unilateral changes to ongoing agreements (e.g., subscriptions). With this new decision, which was triggered by a set of T&Cs used by German banks, the Court has now invalidated one of the remaining approaches to unilateral amendments, one currently used broadly across industries: a workaround whereby changes to T&Cs are framed not as unilateral acts by the business, but as mutually agreed amendments, with the consumer's acceptance deemed to be given so long as they do not object within a specified deadline (so-called "tacit consent"). See our full client alert for further details.

What's Next?

Based on the Court's decision, the continued use of clauses such as those now declared illegal will mean a higher risk of enforcement action by consumers, consumer groups and, possibly, government regulatory agencies. These types of amendment clauses cannot be relied on as a basis for making binding changes to ongoing agreements, and companies will need to think again about how to make such changes. T&Cs for Germany should therefore be reviewed and updated. But the problem goes wider than Germany: T&Cs are usually drafted to cover all users or consumers globally or Europe-wide, and companies do not want to adopt one approach to amendments for German consumers and a different approach for the rest of Europe. The question, then, is how to balance the risk of unenforceability in Germany against a more laborious or cumbersome approach applied Europe-wide.

The fairness and transparency of consumer T&Cs also remain on the agenda in the post-Brexit UK, with a recent court decision serving as a reminder that businesses may not always be able to rely on one-sided terms against consumers. See our full client alert for further details.

3. UK Online Safety Bill: New Duties of Care for Online Service Providers

The UK government has published a draft Online Safety Bill (the "Bill") that is designed to tackle illegal and harmful content published online.

Announced in May 2021, the Bill imposes a duty of care on certain online providers to take responsibility for the safety of their UK users. This means that online service providers, regardless of their location, will be affected if they have a significant number of UK users. It also appoints Ofcom (the existing regulator of the UK communications and broadcasting sector) as the regulator, with the power to block sites and also levy fines amounting to the higher of £18 million or 10% of global turnover.

The Bill will require providers of (1) "user-to-user" services and (2) search services to prevent the proliferation of illegal content and activity online. The first category includes companies that allow users to upload and share their own content. Most obviously, this includes global social media companies but, on closer analysis, this category is very broad and may include any provider that hosts user-generated content. The second category includes companies that enable users to search multiple websites and databases. Affected companies will also have to carry out specific assessments and fulfil additional duties, in respect of both illegal and harmful content, where their services are likely to be accessed by children. To meet the duty of care, companies will need to put in place systems and processes to ensure user safety.

Notably, the Bill goes beyond the EU's proposed Digital Services Act by introducing duties of care in relation to harmful (but not necessarily illegal) content.

What's Next?

All companies falling within the scope of the Bill will need to balance their content duties with the duty to safeguard freedom of expression and democratically important content; increased censorship is a key concern of campaigners. After being reviewed by a parliamentary committee, the Bill will be debated by MPs before it can be passed into law. See more in our client alert.

4. Germany Declares Operation of Illegal Online Marketplaces to Be Criminal Offences

In June 2021, the German parliament adopted a draft law dubbed the Anti-Darknet Act. The Act targets operators of illegal online trading platforms and their server infrastructure providers with criminal penalties of up to 10 years of imprisonment. These online platforms for drugs, weapons, stolen credit cards, counterfeit currency and more made headlines when police investigations took down DarkMarket (in 2021), Wall Street Market, Dream Market (both in 2019), Hansa and AlphaBay (both in 2017).

Crucially, the new law applies only if the trading platform is operated for the purpose of enabling or facilitating an enumerated list of criminal offences. Similarly, the infrastructure provider must intend or know that the infrastructure will be used for this purpose. This protects legitimate online trading platforms, which criminals could otherwise covertly misuse.

Legislators expect that the new rules will make it significantly easier to prosecute operators of illegal platforms because law enforcement agencies will no longer have to prove knowledge of a specific illegal act committed via the platform.

What's Next?

The new law will now enter into force before the end of 2021. Once in force, the new rules may also provide a new tool for companies susceptible to brand piracy or counterfeit products. Such companies could strengthen their fact-finding investigations with the help of the police, as the enumerated list of criminal offences covered by the new law includes not only capital offences, drug trafficking and child pornography but also (computer) fraud, hacking-related offences and infringements of trademarks, geographical indications and registered designs.

5. New Rules and Uncertainty Under the German IT Security Act 2.0

In response to increasing threats of hacker attacks against critical infrastructure systems in the financial, energy, and other key sectors, the German IT Security Act 2.0 took effect in May 2021.

The new rules expand the circle of enterprises subject to specific IT security regulation to include, among others, municipal waste management and certain "special public interest" companies. They also raise the bar on security requirements: affected companies now have to register with the German Federal Office for Information Security (BSI); operators of critical infrastructure must implement state-of-the-art "attack detection systems"; and future government decisions may prohibit the use of "critical components" from untrustworthy suppliers in such infrastructures. IT security requirements are particularly strict for 5G networks: critical components used in such networks require prior certification by a recognised body, while 5G network operators face periodic security reviews.

The tougher stance against IT security threats is backed up by an increased maximum fine for violations of IT security obligations of €20 million (previously €100,000) and by significantly expanded powers for the BSI to ensure network security and to protect consumers on IT security issues.

What's Next?

Key requirements for applying the new provisions, such as the specification of critical components or the criteria for determining "special public interest" companies, remain subject to secondary legislation and are thus uncertain. In addition, the German government has published draft legislation further extending the group of critical infrastructure operators regulated by the IT Security Act 2.0.

6. UK's Product Security and Telecommunications Infrastructure Bill

New UK legislation will impose minimum security standards on manufacturers, importers and distributors of smart devices to protect against cyber-attacks. The Product Security and Telecommunications Infrastructure Bill aims to boost the security standards of the UK's telecoms networks by placing new legal duties on communications providers to improve their security practices. It will also introduce new powers for the UK government to impose controls on communications providers' use of goods, services or facilities supplied by high-risk vendors. Alongside enforcing the new legislation, Ofcom will be given the responsibility of monitoring and assessing telecoms providers' security. Companies that fall short of the new duties or fail to follow directions on the use of high-risk vendors could face heavy fines of up to 10% of turnover or, in the case of a continuing contravention, £100,000 per day.

What's Next?

Given that the Bill is undergoing its second reading in the UK House of Lords, the key thing to look out for now is the secondary legislation, which will set out the specific security requirements that communications providers must meet. The UK government's announcement confirms an intention to engage with providers on the technical details of the secondary legislation before it is finalised later in 2021, so companies should watch for upcoming opportunities to raise concerns about the impact of these rules on their businesses.

7. eBay's New Portal: Power to the Regulator

We have reported before that both the EU (via the Digital Services Act and Digital Markets Act) and the UK (through the establishment of its Digital Markets Unit) are moving towards greater regulation of so-called "Big Tech" companies.

In what could be seen as an instance of a company trying to get ahead of this type of new European legislation, in May 2021 the global online marketplace eBay announced the successful pilot of its Regulatory Portal. This new mechanism permits selected, trusted authorities to efficiently report listings of illegal or unsafe items to be taken down from eBay, without the need to ask eBay's permission. The online portal complements both eBay's existing customer reporting facility and eBay's own efforts to remove such illegal or unsafe listings. The portal is in a beta phase and will apparently later incorporate functionalities such as seller communication. The UK's Ofcom and Office for Product Safety and Standards are among the 50+ approved regulators with access to the portal. Dangerous and counterfeit goods are the obvious targets of such measures.

The portal's development may partly be driven by the anticipation of increased digital regulation. The EU's proposed Digital Services Act includes a similar concept of "trusted flaggers": online platforms might be required to set up mechanisms to ensure that reports of illegal content by trusted flagger entities (as appointed by Member States) are prioritised and processed without delay.

8. German Autonomous Driving Act Is Only the Next Step to Self-Driving Cars for Everyone

In May 2021, Germany adopted new legislation that moves towards allowing autonomous vehicles full access to public roads. This is only a next step, because the new legislation (titled the "Act on Autonomous Driving") only permits autonomous vehicles of SAE Level 4, i.e., vehicles that are restricted to operating under certain defined conditions (e.g., "people movers" on pre-defined routes or short-range transportation of goods). Local road authorities must approve these conditions based on individual applications.

However, despite the still-limited roll-out of autonomous vehicles, the new law also lays the groundwork for further steps towards actual autonomy. Instead of drivers, autonomous vehicles will have "technical supervisors" who must be ready to intervene if necessary, which will further limit autonomy, at least for now. Owners of autonomous vehicles must ensure that certain log data is stored, e.g., in case of irregularities in the operation of the vehicle. Third parties will have a right to access this data in the case of accidents involving the autonomous vehicle. The law also regulates over-the-air updates that activate dormant autonomous driving functionalities.

What's Next?

The new rules are expected to enter into force during summer 2021. However, further legislation will be necessary to allow for fully autonomous mobility on German roads. Also, the new law does not yet regulate the question of who will be liable in case of accidents caused by autonomous vehicles.

9. Algorithms: Potential Harms

Following the publication of its research paper on harms to competition and consumers caused by algorithms in January 2021, the UK Competition and Markets Authority (CMA) ran a consultation requesting input on the paper from market participants. On 7 June 2021, the CMA released a summary of the 27 responses received, which indicated the areas of key concern for respondents, and highlighted a number of suggestions as to the CMA's role and enforcement powers.

Key Concerns

  • Recommender Systems: Some respondents appeared to be particularly concerned about recommender systems and their strong influence over consumers, and suggested that the CMA and Ofcom should interrogate recommender systems within the remit of the Digital Regulation Cooperation Forum.
  • Ranking Systems: Large platforms' use of ranking systems to restrict access to customers was another key issue identified by respondents, a practice that respondents felt would reduce the incentive for service providers to compete for user attention and loyalty.
  • Personalised Pricing: The use of algorithms to set personalised prices for consumers was also identified as a potentially problematic practice. Respondents felt that "dominant, technologically advanced firms" would be best placed to reap the rewards of such practices, raising concerns from a competition perspective.

Regulation and Suggestions for the CMA

  • Existing Legislation: Respondents generally felt that many algorithmic harms could be dealt with under existing legislation. To the extent that the existing legislation is ambiguous as to its application to algorithms, the CMA could issue clarifying guidance. Where specific harms cannot be addressed by existing legislation, respondents called for further consultations to gauge expected market standards and practices before any new legislation is introduced.
  • Disclosure of Information: Respondents suggested that the CMA could require companies to regularly publish statistics on their algorithms, bias levels, and false positive and false negative rates. The CMA could also have pre-emptive powers, granting it access to the documentation of an algorithmic system's development project for auditing, or mandating an "ethical audit" as part of the development process.
  • Guidance: Respondents called for guidance from the CMA on a range of areas, for example: how to show that bias has been found and mitigated; provision of a clear "safe" list of acceptable algorithmic systems and uses and a "no go" list of unfair or anti-competitive use cases; how harms would be assessed and enforcement pursued; and guidance on how to show that guidance had been followed!

What's Next?

The responses are intended to inform the CMA's "Analysing Algorithms Program", which was launched alongside the publication of the paper. Given the commercial value and prevalent use of algorithms, businesses will no doubt be watching this space for further consultations, guidance and papers published as part of this program in the CMA's crackdown on algorithmic harms.

10. Settling Smart Contracts: Digital Dispute Resolution Rules

New rules allowing for anonymity and the speedy resolution of disputes related to novel digital technology were published by the UK Jurisdiction Taskforce of Lawtech in April 2021. The Digital Dispute Resolution Rules are optional rules that can be used for, and incorporated into, on-chain (i.e., occurring on a blockchain) digital relationships and smart contracts. The purpose of the rules is to facilitate the swift and cost-effective resolution of commercial disputes involving cryptoassets, smart contracts, blockchain and other technologies, using arbitration or, for expert issues, expert determination.

Contracting parties can agree to incorporate the rules, usually by including them in a contract or by agreeing to arbitrate under the rules once a dispute arises. This will avoid the need for costly litigation and help parties resolve the matter outside of court. The outcome of any automatic dispute resolution process will be legally binding on the parties.

The rules also contain provisions allowing parties to retain anonymity once they have provided evidence of their identity to a tribunal, recognising a key motivation of digital asset users. The guidance published alongside the rules allows for, and suggests possible, modifications; parties will therefore be able to tailor a dispute resolution solution to their needs. The taskforce will review the rules and their application next year.

11. Looking Ahead

  • In July 2021, the German Federal Court of Justice is expected to hand down its judgment on the legal parameters for social networks to delete content and to block accounts in cases of alleged hate speech.
  • Also in July 2021, the German Federal Court of Justice is expected to hand down judgments in several cases clarifying the legal requirements around the transparency of influencer marketing.
  • According to a response to a written MEP question, the EU Commission is preparing draft legislation for 2022 to govern (and potentially restrict) end-to-end encryption in digital communications services such as WhatsApp or Signal, also in light of non-harmonised Member State legislation on this topic.

12. Updates to Topics Covered in Q1, 2021

Over the past few weeks, there have been a few developments on things that we wrote about in our Q1, 2021 tracker of Key European Digital Regulation & Compliance Developments.

  • As expected, the EU Commission published its draft for a new AI Regulation and a revamped Machinery Regulation on 25 April 2021. Both proposals are closely interlinked in their attempt to regulate emerging technologies. Public consultations are open until early August 2021. For details on the AI Proposal, see our separate Update.
  • In late May 2021, the German implementation of Article 17 of the revised EU Copyright Directive was adopted and promulgated. It will now enter into force on 1 August 2021. See our separate Update for details. Also, the EU Commission published its guidelines on the proper interpretation of Article 17.
  • The revised German anti-hate-speech bill, the Network Enforcement Act (NetzDG), was adopted in late May 2021 and the respective amendments entered into force on 28 June 2021. The new rules expand NetzDG's scope to video-sharing platforms, give users a right to object to enforcement decisions and tighten transparency obligations.
  • The German implementation of the European Electronic Communications Code was adopted in May 2021 and is expected to be promulgated shortly. It will enter into force in December 2021, almost one year after the expiry of the implementation deadline. Other Member States (e.g., France, Italy and Spain) are still working on their implementations as well.

We are grateful to the following members of MoFo's European Digital Regulatory Compliance team for their contributions: Jannis Werner, Dominik Arncken, Theresa Oehm, Alexander Eisenfeld and Mercedes Samavi, and trainee solicitors Georgia-Louise Kinsella and Georgia Wright.

Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Morrison & Foerster LLP. All rights reserved