Dark patterns are top of mind for regulators on both sides of the Atlantic. In the United States, federal and state regulators are targeting dark patterns as part of both their privacy and traditional consumer protection remits. Meanwhile, the European Data Protection Board (EDPB) is conducting a consultation on proposed Guidelines (the "Guidelines") for assessing and avoiding dark pattern practices that violate the EU General Data Protection Regulation (GDPR) in the context of social media platforms. In practice, the Guidelines are likely to have broader application to other types of digital platforms as well.

What are Dark Patterns?

The EDPB defines the term dark patterns as including "interfaces and user experiences implemented on social media platforms that cause users to make unintended, unwilling and potentially harmful decisions regarding the processing of their personal data." The Guidelines identify six types of practices, along with detailed examples, that raise concerns regarding the design of social media user interfaces and the information content made available to users. These are:

  1. Overloading – confronting users with a large quantity of requests, information, options, or possibilities in order to prompt them to share more data or unintentionally allow the processing of their personal data in unexpected ways – including practices like continuous prompting, privacy maze and too many options;
  2. Skipping – designing the interface or user experience in such a way that users forget or overlook data protection concerns or options – including practices like deceptive snugness and look over there;
  3. Stirring – affecting users' choices by appealing to their emotions or using visual nudges – including tactics like emotional steering and hidden in plain sight;
  4. Hindering – setting up processes that make it difficult or impossible for users to inform themselves or take action to manage their data – including practices like dead end, longer than necessary and misleading information;
  5. Fickle – making it difficult for users to navigate data protection control tools or to understand the purpose of processing by designing user interfaces that are confusing or unclear – including practices like lacking hierarchy and decontextualizing; and
  6. Left in the dark – implementing interfaces designed to hide privacy control tools or information about data processing from users, or that leave them uncertain about how their data is being processed and/or what rights they have in this regard – including practices like language discontinuity, conflicting information and ambiguous wording.

There are also numerous examples outside the social media context, many of which US regulators now have in focus. An online provider employing dark patterns may make the process of purchasing a subscription online relatively easy, with a short check-out/purchase flow, but establish a complex, multistep process, online or offline, for cancelling the subscription that forces customers to consider different offers designed to prevent them from unsubscribing. Several US states have updated their automatic renewal/negative option laws to address these issues. Another area of concern is the use of website terms and conditions, along with privacy policies, that are not transparent and are designed to limit consumer choice. One example is obtaining "consent" from a user by formatting website terms and privacy policy disclosures in ways that are difficult to read (e.g., using a light-colored font on a light-colored background), coupled with an acceptance button that does not make clear what the consumer is consenting to, such as a button titled "Purchase."

Regulatory Developments in the US

Regulators in the US have long targeted unfair and deceptive practices that are designed to nudge or manipulate consumers in certain ways. As similar practices are being deployed in the digital world, regulators are taking note and action.

In September 2020, Rohit Chopra, then a Federal Trade Commission (FTC) Commissioner and now Director of the Consumer Financial Protection Bureau (CFPB), issued a statement on the use of dark patterns in an enforcement action against an online subscription service for making it difficult to cancel recurring subscription fees. In the statement, Commissioner Chopra noted: "Dark pattern tricks involve an online sleight of hand using visual misdirection, confusing language, hidden alternatives, or fake urgency to steer people toward or away from certain choices." Director Chopra has continued to investigate allegations of digital dark patterns while at the helm of the CFPB.

Federal enforcement agencies, regulators, and State Attorneys General have multiple tools with which to combat dark patterns. For example, the Restore Online Shoppers' Confidence Act requires clear and conspicuous disclosure of material terms before a customer's billing information is obtained. California's updated automatic renewal law now requires businesses to let consumers cancel a subscription purchased online through an online mechanism, without having to go through extra steps that obstruct their ability to terminate the renewal plan. Several other states have updated their automatic renewal/negative option laws within the past year to increase businesses' obligations to provide ample notice to consumers subscribed to these plans.

State privacy laws are also targeting dark patterns. The California Privacy Rights Act (CPRA), which will be fully operative on January 1, 2023, targets dark patterns in the process for opting out of the sale and sharing of personal information, among other areas. Importantly, consent obtained through dark patterns does not constitute "consent" under the CPRA. Dark patterns are also addressed in the Colorado Privacy Act (CPA), which specifically defines "consent" as not including an "agreement obtained through dark patterns." And of course, the FTC Act and the mini-FTC Acts enforced by the states establish broad powers for the relevant agencies to regulate unfair or deceptive acts and practices.

Dark patterns are also an issue being addressed by advertising technology self-regulatory organizations in the United States, like the Network Advertising Initiative (NAI), which has previously covered regulator action on dark patterns and recently published guidance for its members on the topic. Our team attended the NAI Summit last week, and dark patterns featured prominently in the discussions on notice and consent.

Regulatory Developments in Europe

Concerns regarding the use of dark patterns to discourage users from exercising their rights to privacy were raised in a detailed study entitled "Deceived by Design," published by the Norwegian Consumer Council in June 2018. In January 2021, the UK Competition and Markets Authority (responsible for competition and consumer protection) published a paper entitled "Algorithms: How they can reduce competition and harm consumers," which identifies dark patterns as an area requiring collaborative regulatory oversight with the UK Information Commissioner's Office going forward.

The consultation for comments on the EDPB Guidelines closed on May 2, 2022, and the final guidelines are expected to be issued in the next several months. In addition to cataloging the dark pattern practices that have come to the attention of supervisory authorities in relation to each phase of a social media user's account lifecycle, the Guidelines analyze the GDPR provisions that these practices may violate. These include the fairness and transparency principle (Art. 5(1)(a)), the accountability principle (Art. 5(2)), data protection by design and default (Art. 25), the requirement to provide transparent privacy notices to data subjects (Arts. 12(1), 13 and 14), and the various rights accorded to data subjects under the GDPR (Arts. 15-22). The Guidelines also identify best practices for the design of user interfaces that will facilitate the effective implementation of the GDPR.

Implications for Digital Marketing Practices Going Forward

To reduce the risk of regulatory sanctions and the potential for consumer class actions, online platforms and publishers should be mindful of the increasing focus on dark patterns by US and European regulatory authorities. Businesses should evaluate their current practices and ensure that marketing and website interface design teams are aware of the regulatory risks and expected best practices. Information on material terms and conditions should be clear, conspicuous and complete. The language should be direct and should not be used to pressure or manipulate consumers into making predetermined choices. Fair and transparent disclosures should also be presented when obtaining consumer consent. Privacy policies and website terms should accurately describe the online provider's current data practices in a manner that is understandable to the typical consumer. Opt-in and opt-out flows should clearly disclose what consumers are opting in to and out of, require a similar number of steps (i.e., not make it harder to opt out than to sign up), and be easily accessible to consumers.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.