Who has never found themselves clicking a huge green button beneath the privacy settings, convinced it would validate their choices, only to realise that they had just, against all odds, accepted all the default settings? Who has never faced, on each visit to the same website, complex cookie pop-ups even longer than the page being visited? And who has never given up on deleting their social network account at the fifth step of a tedious process? This has probably happened to everyone, so much so that it has inspired a mini-game of refusing cookies as quickly as possible.
On 14 March 2022, the European Data Protection Board ("EDPB") published Guidelines 3/2022 on dark patterns in social media platform interfaces, i.e. non-transparent practices that influence or even force users to make decisions about their privacy or their rights.
In these new guidelines, which are currently under public consultation and may therefore still change, the EDPB classifies and illustrates a range of dark patterns. Although the phenomenon can in principle occur not only on social networks but also on other platforms and websites, the EDPB explicitly addresses these guidelines to social network designers, as well as to their users, to help the latter better spot such practices.
Dark patterns are likely to violate various provisions of the GDPR
This assault on dark patterns comes at the crossroads of most, if not all, of the general principles of the GDPR. Indeed, a social network interface or user experience that encourages the user to make a more invasive decision about their personal data can lead to the following:
- a lack of transparency (e.g. if the explanation text displayed contradicts itself),
- a flawed consent (e.g. if consent is repeatedly requested) or consent that is not as easy to withdraw as it is to give (e.g. if withdrawal requires many more clicks than giving consent did),
- a violation of the data subject's rights (e.g. if a link to exercise a right merely redirects to generic information),
- or a violation of the principle of data protection by design and by default (e.g. if the most invasive options are selected by default).
More generally, dark patterns will, as the EDPB likes to recall, lead to a breach of the fairness principle of the GDPR, which is "an overarching principle which requires that personal data shall not be processed in a way that is detrimental, discriminatory, unexpected or misleading to the data subject". Finally, all these principles are supplemented by the accountability principle, according to which it is up to the controller to demonstrate compliance. Ultimately, these new guidelines are imbued with the stated aim of the GDPR to ensure data protection for data subjects by placing them at the heart of the management and effective control of their personal data.
Furthermore, the EDPB goes so far as to draw a parallel with consumer law, pointing out that the provision of incomplete information may additionally constitute a violation of consumer law (e.g. misleading advertising).
What is the impact for social network designers and other data controllers?
These ambitious guidelines clearly have "GAFAM" and the major online platforms in their sights, in line with recent interventions by the European legislator such as the Data Act or the Digital Services Act package. They seem to go one step further than previous guidelines, in that they do not just apply the principles of the GDPR to typical, one-off processing activities (which could be considered "textbook cases"), but to an entire ecosystem. This includes not only consent to specific activities, but also the way in which all privacy options are presented - including audience settings when the user posts content - and, more generally, the entire day-to-day user experience on the social network.
Moreover, these new guidelines require a greater balancing effort on the part of the controller. For example, the data subject must be duly informed in advance but, at the same time, a privacy policy that is too exhaustive and inundates them with information may qualify as a Privacy Maze or a Look over there practice and may be contrary to the requirements of transparency, in particular as regards the concise and intelligible nature of the information provided. Similarly, consent must be collected in a specific and granular manner (and not "bundled" with other services or purposes), but the controller must be careful not to overwhelm the user with choices, otherwise it may fall into the Too many options practice. The assessment is therefore case-by-case and sometimes delicate.
Dark patterns can occur at all stages of social network use
The guidelines are structured chronologically and follow the "life cycle" of a social network account.
The dark patterns and applicable principles are illustrated through several use cases, from registration on the platform through to deletion of the account, covering along the way the provision of information at the start of use, the communication of a data breach, the configuration of privacy settings and, finally, the exercise of the data subject's rights.
At each of these stages, the EDPB also provides a set of best practices to help social network designers achieve compliance at the development phase.
The EDPB's classification of dark patterns
Let's get to the heart of the matter: in the 64 pages of the guidelines, the EDPB distinguishes about fifteen practices that constitute dark patterns, divided into six main categories developed for the occasion. The EDPB also differentiates between content-based and interface-based dark patterns.
The tables below set out all the dark patterns identified and defined by the EDPB.
This classification is not perfectly watertight and, as privacy professionals are accustomed to, serves rather as a tool to be applied on a case-by-case basis to identify risky practices. A single practice may thus correspond to several dark patterns.
Overloading
"Burying users under a mass of requests, information, options or possibilities in order to deter them from going further and make them keep or accept certain data practice." (EDPB)
This includes:

| Dark pattern | Description |
| --- | --- |
| Continuous Prompting | Repeatedly asking users to provide more data or to agree to new purposes, regardless of the choices they have already communicated. |
| Privacy Maze | Obtaining information or exercising data subjects' rights becomes a "treasure hunt", so that users are likely to give up. |
| Too many options | Providing so many options to choose from that users are unable to make a choice. |
Skipping
"Designing the interface or user experience in such a way that users forget or do not think about all or some of the data protection aspects." (EDPB)
| Dark pattern | Description |
| --- | --- |
| Deceptive Snugness | The most invasive features and options are enabled by default, to take advantage of the default effect. |
| Look over there | Providing irrelevant or unnecessary information to distract users from their initial intent. |
Stirring
"Affecting the choice users would make by appealing to their emotions or using visual nudges." (EDPB)
| Dark pattern | Description |
| --- | --- |
| Emotional Steering | Using reassuring or negative words or images to influence the user's emotional state and prevent them from making a rational decision. |
| Hidden in plain sight | Using a visual style for information or data protection controls that nudges users towards less restrictive options. |
Hindering
"Hindering or blocking users in their process of obtaining information or managing their data by making the action hard or impossible to achieve." (EDPB)
| Dark pattern | Description |
| --- | --- |
| Dead End | A piece of information is impossible to find because a link is either not working or not available at all. |
| Longer than necessary | It takes more steps to disable privacy-invasive options than to enable them. |
| Misleading information | There is a discrepancy between the information given and the actions available. |
Fickle
"The design of the interface is unstable and inconsistent, making it hard for users to figure out where the different controls really are and what the processing is about." (EDPB)
| Dark pattern | Description |
| --- | --- |
| Lacking hierarchy | Information related to data protection is presented several times and in several different ways. |
| Decontextualising | A piece of data protection information or a control is located on a page that is out of context. |
Left in the dark
"The interface is designed in a way to hide information or controls related to data protection or to leave users unsure of how data is processed and what kind of controls they might have over it." (EDPB)
| Dark pattern | Description |
| --- | --- |
| Language discontinuity | Information related to data protection is not provided in the official language(s) of the country where users live, even though the service is. |
| Conflicting information | Pieces of information contradict each other in some way. |
Other aspects that deserve attention
- The EDPB once again confirms the prohibition on denying access to a service where consent is refused, and thus its position on the illegality of cookie walls.
- A point that is likely to be debated during the public consultation is the sending of a code by SMS for security reasons. The guidelines seem reluctant about this method of two-factor authentication, which experts currently consider less secure than other methods. Some will nevertheless point out that it is probably better to have this option activated than to have no second factor at all. Even so, the EDPB sees the very use of this technique, and thus the collection and use of the phone number (which is "not that easily interchangeable"), as a potential violation of the minimisation principle, since alternatives exist. One can speculate that this stems from the finding that some controllers invoke security reasons to collect telephone numbers and then use them for other purposes. In such cases, however, the purpose limitation principle applies, so that use for other purposes would in any case be prohibited.
Incidentally, it is questionable whether this issue is really about dark patterns and whether it belongs in these guidelines.
- Although the EDPB repeatedly refers to, and even encourages, the use of "?" signs or icons providing more information (which is not surprising, as this method of visual information is promoted by the GDPR), it emphasises that the text displayed upon clicking or hovering should indeed contain more detailed information, not superfluous information or prompts.
- More generally, although the GDPR rules invoked in the guidelines apply to data controllers in all sectors, the EDPB has chosen to focus on social media platforms and their users only. Is this a convenient first step in an enforcement strategy - in which case all other sectors and websites will eventually be targeted - or does it mean that social networks have a greater responsibility - and if so, on what basis? It is probably true that social networks pose a relatively high risk to privacy, but some platforms in other sectors pose as much or more risk.
Want to know more?
The guidelines (version 1.0) were adopted in March 2022 and the public consultation will remain open until 2 May 2022.
Originally published 05/04/2022
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.