By Paul Lanois
In March 2022, the European Data Protection Board published its draft “Guidelines 3/2022 on Dark patterns in social media platform interfaces: How to recognise and avoid them” (the “Guidelines”) for public consultation. While these guidelines were drafted specifically with social media platforms in mind, they also provide recommendations and guidance relevant to the design of any website or application, given that the presence of a “dark pattern” may constitute a breach of certain GDPR requirements, such as the conditions for valid consent.
The Guidelines define “dark patterns” as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions in regards of their personal data”. According to the Guidelines, dark patterns aim to influence users’ behaviors and can hinder their ability “to effectively protect their personal data and make conscious choices”, for example by making them unable “to give an informed and freely given consent”.
In order to assess whether a design or interface constitutes a “dark pattern”, the Guidelines indicate that the principle of fair processing laid down in Article 5(1)(a) of the GDPR should serve as a starting point, along with the GDPR’s principles of data protection by design and by default, transparency, data minimization and accountability.
The Guidelines divide dark patterns into six categories:
- Overloading: Users are confronted with too much, for example too much information or too many options or possibilities, in order to prompt them to share more data or to allow personal data processing against the expectations of the data subject.
- Skipping: Design that presents the interface or content in such a way that users forget, or do not think about, the data protection aspects of their choices.
- Stirring: Design intended to make users base their choices on emotions or visual nudges.
- Hindering: Design that prevents or blocks users from becoming adequately informed of the processing activities, or that makes data management harder to achieve.
- Fickle: Inconsistent or unclear design that makes it hard for users to navigate data management settings or to understand the purpose of the data processing.
- Left in the dark: Design that hides relevant information or data protection settings, or that leaves users unsure of how their data is processed and of their accompanying rights.
For example, the Guidelines provide that when users are registering and creating an account, any language that conveys a sense of urgency or sounds like an imperative could have an impact on their “free will” and constitute a “dark pattern”, even where providing the data is not in fact mandatory. As an illustration, the Guidelines give the following example: “The part of the sign-up process where users are asked to upload their picture contains a ‘?’ button. Clicking on it reveals the following message: ‘No need to go to the hairdresser’s first. Just pick a photo that says “this is me”.’” While the Guidelines recognize that the aim of such wording in the sign-up process is simply “to motivate users and to seemingly simplify the process for their sake (i.e. no need for a formal picture to sign up)”, they note that “such practices can impact the final decision made by users who initially decided not to share a picture for their account”.
These guidelines are subject to public consultation, and the final version may therefore differ from the current draft. In any case, they provide good insight into the approach that European data protection authorities are likely to adopt in future investigations.