In an era when fake news is so prevalent, the topic of dark patterns has gained considerable relevance. The term was once purely academic but has recently become common in data protection regulations worldwide.
The term was coined in 2010 by Harry Brignull, a British UX designer, who created a website dedicated to the topic. He also describes the phenomenon as deceptive design, a term equivalent to dark pattern.
Dark patterns are interface designs that attempt to trick, coerce, or pressure users into taking certain actions, such as buying or signing up for something. To this end, such designs may present choices whose supposed benefits are unequal, or rely on false, misleading, or hidden statements, inducing inappropriate choices or behavior contrary to data protection laws.
Dark patterns are far more common than one might think and are often practiced by large companies, whether national or multinational. For example, a website may show a pop-up consent form with a single button saying “Accept all cookies”, offering the user no option to reject cookies or even to choose which ones are acceptable.
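By contrast, a consent banner can avoid this pattern by giving rejection the same weight as acceptance. The following is a minimal sketch in TypeScript; the element ID, class name, and storage key are hypothetical and for illustration only:

```typescript
// Minimal sketch of a consent banner that avoids the "no reject option"
// dark pattern: "Reject all" is rendered with the same style and
// prominence as "Accept all".

type ConsentChoice = "accepted" | "rejected";

function renderConsentBanner(onChoice: (choice: ConsentChoice) => void): void {
  const banner = document.createElement("div");
  banner.id = "consent-banner"; // hypothetical ID

  const message = document.createElement("p");
  message.textContent = "We use cookies for analytics. Choose an option:";
  banner.appendChild(message);

  // Both buttons share one class: symmetric choices, no visual nudging.
  for (const choice of ["accepted", "rejected"] as const) {
    const button = document.createElement("button");
    button.textContent = choice === "accepted" ? "Accept all" : "Reject all";
    button.className = "consent-button"; // same class => same prominence
    button.addEventListener("click", () => {
      onChoice(choice);
      banner.remove(); // one click either way: equal effort
    });
    banner.appendChild(button);
  }

  document.body.appendChild(banner);
}

// Usage: persist the choice so the banner is not shown again.
renderConsentBanner((choice) => {
  localStorage.setItem("cookie-consent", choice); // hypothetical key
});
```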
Another example, often found on newspaper and magazine websites, is paid advertising material in which misleading claims boost sales of a given product by attributing outlandish, untrue characteristics to it.
Harry Brignull’s website catalogues several types of dark patterns, such as “confirmshaming”, “hidden costs”, and “disguised ads”.
Authorities responsible for consumer law and data protection view the issue with great concern and seriousness. In March 2022, the European Data Protection Board (EDPB) published its Guidelines on deceptive design patterns in social media platform interfaces, which explain how to recognize and avoid dark patterns.
These guidelines clarify that the General Data Protection Regulation (GDPR) relies on the principle of fair processing established in Article 5(1)(a), which serves as the starting point for assessing whether a design does indeed constitute a dark pattern. Other principles relevant to this assessment are transparency, data minimization, and accountability, under Article 5(1)(c) and Article 5(2), as well as purpose limitation under Article 5(1)(b). In other cases, the legal assessment may also rest on the conditions for consent under Articles 4(11) and 7, or on other specific obligations, such as the transparency requirements of Article 12.
It is no coincidence that, in January 2022, Facebook and Google were punished with considerable fines in the European Union for using dark patterns in their cookie consent processes, which were considered confusing and difficult to understand. Other examples of defining and regulating dark patterns can be found in US laws, most notably the California Consumer Privacy Act (CCPA), the California Privacy Rights Act (CPRA), and the Colorado Privacy Act (CPA).
While the CCPA does not explicitly mention dark patterns, the concept emerged during its regulatory process. In the Final Statement of Reasons, a document produced to accompany draft regulations, the California Attorney General stated: “It would run counter to the intent of the CCPA if websites introduced choices that were unclear or, worse, employed deceptive dark patterns to undermine a consumer’s intended direction.” The CPRA, in turn, defines a dark pattern as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.” Finally, the CPA defines a dark pattern as a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, a definition almost identical to that of the CPRA. The CPRA takes effect in January 2023, and the CPA in July of the same year.
These laws and their regulations introduce five principles that companies must observe (see the sketch after the list):
- Ease of understanding – should be applied to both the language used and the design itself.
- Symmetry in choice – exercising a “more privacy” option should not take longer or be more difficult than exercising a “less privacy” option.
- No confusing language or coercive interactive elements – double negatives are a clear example of confusing language, e.g. placing “Yes” and “No” options next to the statement “Do not sell or share my personal information”.
- No manipulative language or choice architecture – for example, offering a discount or other financial incentive in exchange for consent is considered manipulation.
- Ease of execution – there should be a fully functional, built-in consent-management tool that facilitates opt-outs without adding friction. Circular links and inoperative or non-functional e-mail addresses are other examples to avoid.
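To make the symmetry and ease-of-execution principles concrete, here is a minimal sketch in TypeScript of a one-step “Do Not Sell or Share” opt-out; the endpoint, payload, element ID, and user ID are hypothetical assumptions, not a real API:

```typescript
// Sketch of a one-step opt-out: one request, no confirmation maze,
// no login wall. The endpoint and payload shape are hypothetical.
async function optOutOfSale(userId: string): Promise<void> {
  const response = await fetch("/api/privacy/opt-out", { // hypothetical endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ userId, optOut: true }),
  });
  if (!response.ok) {
    throw new Error(`Opt-out failed: ${response.status}`);
  }
}

// An affirmative button label avoids the double-negative trap: the
// button says what will happen, and clicking it is the whole flow.
const button = document.querySelector<HTMLButtonElement>("#do-not-sell"); // hypothetical ID
if (button) {
  button.addEventListener("click", async () => {
    await optOutOfSale("user-123"); // hypothetical user ID for illustration
    button.textContent = "You have opted out"; // immediate, visible confirmation
    button.disabled = true;
  });
}
```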