Design, Dark Patterns and the Digital Services Act

You may not realise that when you enter a supermarket, the products are purposefully arranged in a specific order and placed in specific locations. There is a commercial reason why certain items, such as fruit or pasta, are located in the first aisle or perhaps in the most hidden spot. There is also a reason why chocolate is close to the payment checkout. This is all about influencing the shopper to purchase items by using design to exploit consumer bias. The checkout is an area of the store that everyone must pass through in order to pay, and therefore almost everyone who enters the store, regardless of what they came for, will see the items on display there. If you walk into the grocery store tired after a long day, your body may crave something sweet. Noticing a chocolate bar or a soft drink in a well-placed spot at eye level influences us to purchase it, particularly when our body is craving it.

Moving away from supermarkets, subscription services displayed on web pages work in similar ways. Important information may not be displayed prominently (for example, caveats may be tucked into small text at the bottom of the page or buried in the terms and conditions), which may be why you subscribed to a service without realising that only the first month was free. If so, at some point there will be an unpleasant surprise on your credit card statement: the payment for the second month has already been taken.

With these examples in mind, design can be adapted to the seller's intentions and can influence consumers' decisions. The same applies to the digital world: design will also influence your decisions online. This is why your attention is sometimes easily caught by attractive colours and images, or by the size, placement and presentation of words. This is where the concept of “dark patterns” comes in.

Where does the term come from?

The term “dark patterns” was coined by the user experience (UX) designer Harry Brignull in 2010. Brignull describes dark patterns as “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something”.

What are the types of dark patterns?

On 26 October 2022, the Organisation for Economic Co-operation and Development (OECD) published a report on dark commercial patterns. Dark patterns are most effective when different techniques are combined and when they are deployed on mobile phones. They are common on e-commerce webpages and in the applications of online platforms, in cookie consent notices and even in online games.

For consumers, dark patterns broadly fall into the following categories:

  • Forced action: forcing the disclosure of more personal data;
  • Interface interference: giving visual prominence to options that favour the business;
  • Nagging: repeated requests to switch to a setting that benefits the business;
  • Obstruction: making it more difficult to cancel a service or subscription;
  • Sneaking: adding non-optional charges to a transaction only at its final stage;
  • Social proof: notifying users of other consumers’ purchasing activities;
  • Urgency or a sense of urgency: for example, countdown timers indicating that a sale lasts for a limited time.

Dark patterns may utilise different design-based elements, such as the use of single or multiple screens, pop-up dialogues, variation in colouring, the prominence given to certain options, and the use of emotive or aggressive language, as sketched below. The problem is that dark patterns influence consumers by relying directly on cognitive mechanisms and the exploitation of human biases.
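To make these design mechanics concrete, here is a minimal, hypothetical sketch in TypeScript of a cookie consent banner that combines interface interference, visual prominence and emotive language. The wording, element names and styles are invented for illustration and are not taken from any real service.

```typescript
// Hypothetical sketch: a cookie banner built with interface interference.
// A privacy-friendly choice exists, but colour, size and placement
// steer the user towards "Accept all".

function buildConsentBanner(): HTMLDivElement {
  const banner = document.createElement("div");
  banner.style.cssText =
    "position:fixed;bottom:0;width:100%;background:#fff;padding:16px;";

  // Emotive framing rather than neutral wording.
  const message = document.createElement("p");
  message.textContent =
    "We care about your experience! Allow cookies so we can keep things great.";
  banner.appendChild(message);

  // Visually prominent, eye-catching "accept" option.
  const acceptAll = document.createElement("button");
  acceptAll.textContent = "Accept all";
  acceptAll.style.cssText =
    "background:#0a7d2c;color:#fff;font-size:18px;padding:12px 32px;";
  acceptAll.onclick = () => banner.remove();

  // The refusal path: small, grey, low-contrast text styled as an
  // afterthought, leading to further steps (obstruction) rather than
  // offering an immediate "reject" of equal prominence.
  const manage = document.createElement("a");
  manage.textContent = "manage options";
  manage.href = "#settings";
  manage.style.cssText = "color:#999;font-size:11px;margin-left:24px;";

  banner.append(acceptAll, manage);
  return banner;
}

document.body.appendChild(buildConsentBanner());
```

A neutral design would give both choices equal prominence and neutral wording; the asymmetry above is exactly the kind of visual prominence described by the OECD report and the DSA.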

Regulatory approaches around the world:

In the US, the California Privacy Rights Act (CPRA), passed in 2020, provided the first legal definition of dark patterns: “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation” (Cal. Civ. Code § 1798.140(l)).

In the EU, the Digital Services Act (“DSA”) sets new obligations for online platforms and aims to reduce harms, strengthen the protection of users and establish a new accountability and transparency framework. The DSA entered into force on 16 November 2022. Among the new obligations is, for instance, user-facing transparency around online advertising for online platforms and very large online platforms. The DSA refers to dark patterns on online interfaces as: “practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions. Those practices can be used to persuade the recipients of the service to engage in unwanted behaviours or into undesired decisions which have negative consequences for them.”

The focus here is on exploitative design choices made through the kinds of components described earlier in this article. It is true that when people have less, or impaired, choice, their autonomy is at risk. In this sense, dark patterns affect users’ autonomy, choice and decision-making by changing the structure, design and functionality of the online interface. However, the DSA also makes clear that the rules preventing dark patterns do not mean that providers cannot interact directly with users or offer services. In advertising, for instance, there are legitimate practices in compliance with the law that do not constitute dark patterns. The DSA also states that prohibited dark pattern practices will fall within the scope of the DSA only if they are not already covered by the Unfair Commercial Practices Directive or by the GDPR.

However, there are criticisms of regulatory and enforcement measures, such as possible gaps in the law where certain types of dark patterns are not sufficiently captured by the prohibitions on deceptive commercial practices. Another criticism is that transparency alone may not be enough to avoid the damaging effects. For instance, providing users with information about exploitative design elements (a transparency-based approach) may not be as effective as restricting the practices themselves. Users may be informed, yet dark patterns may still exert strong cognitive influence over their behaviour. These concerns are intensified by the data businesses collect and by the application of machine learning techniques. Depending on the dark pattern used, there are risks of financial loss, psychological effects and privacy harms. The following section focuses on the privacy implications:

Data protection concerns:

Under the UK GDPR, there is a requirement that data protection should be “by design and by default” (Art. 25). The EDPB has provided recommendations on the design of interfaces. It states that social media providers are controllers and are responsible for the design and operation of social media platforms. As dark patterns can violate both consumer protection and data protection regulations, enforcement by data protection and consumer protection regulators can overlap. The EDPB divides dark patterns into content-based patterns, which relate to the content and wording of sentences, and interface-based patterns, which relate to the way content is displayed and how users navigate and interact with it.

Examples of content-based patterns:

Overloading – continuous prompting: users are asked to provide more personal data than is necessary, for instance when a social media provider asks the user at every login to provide a telephone number (a short sketch of this follows these examples). Users will usually become worn down and eventually provide the personal data.

Hindering – misleading information: for instance, telling users that they need to provide a mobile phone number to use the service when the number is not actually necessary; the sign-in process can in fact be completed in other ways, such as by scanning a QR code.

Stirring – emotional steering: influencing the user’s emotional state, making them feel positive and safe, or negative and anxious. This can lead users to share more personal data than is necessary to use the service, contrary to the data minimisation principle [1].

Hindering – longer than necessary: when the user experience requires the completion of many more steps than necessary. This dark pattern wears users down, and they will often give up and provide the requested data.
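As a minimal, hypothetical illustration of the continuous prompting pattern mentioned above, the TypeScript sketch below shows a login flow that re-asks for an optional phone number on every login until the user relents. The function and field names are invented for the example.

```typescript
// Hypothetical sketch of "overloading - continuous prompting":
// the service asks for an optional phone number at every login
// until the user finally gives in.

interface Session {
  userId: string;
  phoneNumber?: string; // not required to use the service
}

function onLogin(session: Session): void {
  // The prompt recurs on every single login while the field is
  // empty, wearing the user down over time.
  if (!session.phoneNumber) {
    const input = window.prompt(
      "Add your phone number to improve your account security?"
    );
    if (input) {
      session.phoneNumber = input; // the user finally relents
    }
    // If dismissed, nothing is recorded: the identical prompt
    // simply reappears at the next login.
  }
}

onLogin({ userId: "example-user" });
```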

Examples of interface-based dark patterns:

Stirring – hidden in plain sight: when the visual style of data protection controls nudges users towards the more invasive options.

Skipping – deceptive snugness: the interface or user experience is designed in a way that makes users forget about, or fail to consider, the data protection aspects. In deceptive snugness, the most invasive data features and options are enabled by default, and individuals tend to keep the pre-selected option (see the sketch after these examples).

Hindering – dead end: this happens when users try to obtain information or manage their data, but the action is impossible or hard to achieve. With a “dead end”, users cannot find the information they were looking for because, for example, no link is available.
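To illustrate the deceptive snugness and preselection mechanics referenced above, here is a minimal, hypothetical TypeScript sketch. The setting names are invented for the example and are not taken from the EDPB guidance or any real platform.

```typescript
// Hypothetical sketch of "deceptive snugness": the most invasive
// settings are pre-selected, so a user who clicks straight through
// keeps them all. Setting names are illustrative only.

interface PrivacySettings {
  personalisedAds: boolean;
  locationTracking: boolean;
  shareWithPartners: boolean;
}

// Dark-pattern defaults: every invasive option is enabled.
const snugDefaults: PrivacySettings = {
  personalisedAds: true,
  locationTracking: true,
  shareWithPartners: true,
};

// Data protection by design and by default (Art. 25 UK GDPR) would
// invert this: invasive processing is off unless the user opts in.
const privacyByDefault: PrivacySettings = {
  personalisedAds: false,
  locationTracking: false,
  shareWithPartners: false,
};

// A user who never opens the settings screen "keeps the pre-selected
// option"; with snug defaults that silently means maximal sharing.
function effectiveSettings(
  defaults: PrivacySettings,
  userChoices: Partial<PrivacySettings>
): PrivacySettings {
  return { ...defaults, ...userChoices };
}

console.log(effectiveSettings(snugDefaults, {}));     // everything on
console.log(effectiveSettings(privacyByDefault, {})); // everything off
```

The code path is identical in both cases; only the choice architecture differs, which is why defaults matter so much in this context.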

In these and other cases, when the consent collected to process data is influenced by the aforementioned practices, it fails to meet the “informed” and “freely given” consent requirements [2]. It is also clear that the exercise of rights and easy access to information are not respected. Furthermore, such practices do not respect the clear and plain language requirement, under which information should be “concise, easily accessible and easy to understand”, with visualisation used where appropriate [3].

Another example is Google’s use of dark patterns such as “preselection” and “hidden information” with the intention of steering consumers towards privacy-intrusive settings. For instance, it was found that dark patterns drove privacy-intrusive design around Location History [4], with location tracking enabled by default, deceptive click-flows, and information hidden from users trying to make an informed choice about tracking.

Conclusion:

Following the DSA’s entry into force, online platforms have until 17 February 2023 to report their numbers of active end users, which will inform their designation by the European Commission as very large online platforms or very large online search engines. This will determine which companies must comply with the full regulation, as smaller companies will have a reduced set of obligations and certain exemptions. Once designated by the Commission, an entity will have four months to comply with the obligations and carry out an annual risk assessment exercise. The DSA will be fully applicable from 17 February 2024.

Regarding measures to address dark patterns, some centre on regulation and enforcement, and others on a consumer-friendly choice architecture [5]. Choice architecture is a clear example of data protection by design principles meeting real-life technical tools. There is also a growing need to apply and consider design standards under an ethical approach, and an incentive for businesses to conduct self-audits and self-assessments of their choice architecture. Our analysis and monitoring of the impact of the DSA, and of the approaches undertaken by businesses, will continue over the next year.

[1] Article 5(1)(c) UK GDPR.

[2] Article 7 UK GDPR and Article 4(11) UK GDPR.

[3] Recital 58 – the principle of transparency – General Data Protection Regulation (GDPR), gdpr-info.eu.

[4] Forbrukerrådet (2018), Every step you take.

[5] OECD (2022), Dark Commercial Patterns, OECD Digital Economy Papers, No. 336, October 2022.
