Thousands of UX professionals are fighting for more intuitive, sustainable, and ethical design principles that will benefit society in the future. But what happens when the beneficiary of this effort shifts from the consumer to the business? That's when dark UX - and dark patterns - come into play.
This article is based on deceptive patterns research conducted by a team led by Dr. Harry Brignull. If you want to learn more about the topic, I recommend reading about the lawsuits, regulations, and other articles about dark UX here.
Table of contents:
- Dark user experience: the roots of dark patterns
- Deceptive Patterns by Harry Brignull
- The hidden dangers of dark patterns: the consequences of leaving them be
Dark user experience: the roots of dark patterns
The term "dark patterns" originates from dark UX, the practice of creating deceptive user interfaces to purposefully confuse or trap consumers. Harry Brignull coined it in 2010 to describe user interface design techniques in websites and apps that make users do things they didn't intend to do.
The original definition describes dark UX as:
“A user interface that has been carefully crafted to trick users into doing things…they are not mistakes, they are carefully crafted with a solid understanding of human psychology, and they do not have the user’s interests in mind.”
This sentence summarizes Brignull's academic research, which found that many companies use dark patterns to trick users into signing up for subscriptions, giving away private information, or taking other actions they would not have taken had they been fully informed of the consequences. In other words, these deceptive user interfaces aim to steer users in a direction that benefits the company rather than the user.
To raise awareness of their negative impact on consumer choice and decision-making, Brignull began collecting examples of patterns designed to mislead consumers on his website, DarkPatterns.org. Since then, dark patterns have only become more common, with a growing number of websites and apps employing them to manipulate user behavior.
This trend was confirmed by the recent Bringing Dark Patterns to Light report from the Federal Trade Commission, which notes that dark patterns are still widely used to influence user behavior across industries and contexts, including e-commerce, cookie consent banners, children's apps, and subscription sales. They can be subtle and often go unnoticed, yet they significantly impact the user experience - harming a company's reputation in the long run.
As we can learn from the official project’s website, which was renamed to deceptive.design, the term was recently updated to deceptive patterns to “reflect a commitment to avoiding language that might inadvertently carry negative associations or reinforce harmful stereotypes."
This change reflects the topic's influence on shaping the conversation around ethical design practices, serving as a solid foundation for multiple regulations worldwide - with a strong emphasis on consumer protection laws.
Deceptive Patterns by Harry Brignull
The scope of dark patterns has grown over time. The list published on deceptive.design now recognizes more than a dozen distinct types. Each pattern deceives users differently, but they all share the same goal: to trick people into doing something they would not have done otherwise.
Let's look more closely at each of these patterns and how they might affect your users.
(Price) comparison prevention
This pattern disrupts the natural decision-making process, making it difficult for users to weigh prices against features and personal needs. The resulting confusion is often amplified by exploiting cognitive biases such as social proof, the authority bias, or the default effect (we tend to stick with the first choice we've made - or with the suggested 'default' option).
A similar effect - the "paradox of choice" described by psychologist Barry Schwartz - can result from the user's struggle to find the best deal among too many options. Facing an abundance of choices may paralyze the user, leading to a rushed or abandoned purchase decision (depending on the original need). In either case, the user spends a noticeable amount of time trying to match one of the solutions to their needs, which only adds frustration and deepens the inability to decide.
While the tactic drives sales of a specific product package, users are exploited to pump up specific numbers rather than left to explore the offer freely - which may damage the brand's image and weaken customer loyalty.
Confirmshaming
The confirmshaming pattern, as the name implies, feeds on the user's emotions, weaponizing the most unpleasant ones - such as guilt, shame, or even fear - to persuade a user to take a specific action.
It is typically incorporated into the website or app as a simple pop-up or opt-out message that strongly resonates with the fear of missing out (FOMO) - an emotional response to the belief that failing to perform a specific action will reduce our quality of life. The confirmshaming pattern exploits this by wording the opt-out so that the user feels guilty (or even stupid) for declining the offered benefit - forcing them to confirm their decision through shame.
Peer pressure is a highly effective persuasion method that actively influences our behavior in social interactions. We want to belong, and if the suggested newsletter, great deal, or subscription brings us one step closer to the "others' level," we're more than willing to comply - ultimately benefiting the service provider. That's why we're eager to join a community gathered under a specific name, invest in merch related to a trendy artist, and engage in activities tied to the social status we aspire to.
This kind of emotional manipulation is highly unethical: no interaction with your brand should evoke negative feelings about your product or service - or, worse, about your user's integrity.
Disguised ads
This design pattern deceives users into clicking on ads by making them look like native content or a component of the user interface. The apparent logic behind the confusion is to harvest as many clicks as possible.
Disguised ads are typically used on websites whose primary source of income is derived from ad impressions, i.e., websites offering a large selection of software on a trial license, music, movies, free alternatives to paid programs, or highly engaging subjects appealing to a general audience like celebrity gossip, recipes, or weather. To increase the chance of getting a click, the design of the ads typically mimics the expected content, including buttons, links, or entire sections.
The practice is highly misleading, as the clicks result from lying to the user. As tempting as the scheme may appear to an aspiring website creator, be aware that most of those links set trackers that can follow the tricked user - often for 30 days - with retargeted ads. This is also a bad deal for the businesses paying for such exposure, as their budgets are partially spent on unaware, unwilling web users who misclicked a button. And sadly, more and more ads of this type carry malware - leaving the user not only tricked but sometimes also hacked.
Fake urgency, fake scarcity, and fake social proof
Another example of FOMO in action revolves around the limited availability of a product (usually a collection) or a service package. By displaying fabricated pop-ups about items just sold or users currently browsing the offer, walls of generated reviews, countdown timers, and low-stock warnings, the seller creates artificial demand for the product, aiming to trick users into spending money faster.
The goal here is to make the offer more popular, exclusive, and thus more valuable because our natural tendency as humans is to value things that are of common interest (or appear to be).
This dark pattern is designed to switch off rational thinking under pressure, pushing users to decide without taking the time to compare similar products that might actually suit them better. All they need to do is follow the others. The mechanism is well known from seasonal sales, both in the physical and digital worlds, where bold colors, eye-catching fonts, and a sense of urgency all play their part. And just as with physical sales, rash decisions increase refund and exchange requests in the following days.
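The "countdown" part of the mechanism is trivially easy to fake. The sketch below is a hypothetical illustration (not code from any real site): the deadline is derived from the current time itself, so however often the visitor returns, the deal always "ends" within the next hour and can never actually expire.

```typescript
// A "fake urgency" countdown: the remaining time is computed from the clock
// itself, so it resets forever and the deal never really expires.
// Hypothetical illustration of the trick, not production code.
function fakeSecondsRemaining(nowMs: number, cycleMs: number = 3_600_000): number {
  // Seconds left until the next full cycle boundary - then it starts over.
  return Math.ceil((cycleMs - (nowMs % cycleMs)) / 1000);
}

// Whenever the page is loaded, there is always "less than an hour" left.
console.log(fakeSecondsRemaining(Date.now()));
```

A genuine countdown would instead be anchored to a fixed deadline stored server-side, independent of when the page is viewed.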
Hard to cancel (roach motel)
This dark pattern traps consumers by making it simple to sign up for a service but extremely difficult to cancel. Unlike the simple, one-click subscription process, opting out usually necessitates several time-consuming steps, including physical ones.
A website, for example, may require users to call a customer service number or even mail a signed cancellation printout to a specified address across the globe. The lack of easy cancellation is a deliberate design choice, made so the user thinks twice (or simply gets lost in the layout maze) in the hope that the default effect eventually kicks in: they stick with their first choice and agree to extend the service for a longer period.
Hidden costs and hidden subscription
Both patterns are designed to attract users with a tempting option or deal, only to replace it with a less appealing variant when the user attempts to proceed.
The seller may hide the actual cost of the purchase until checkout, usually pairing it with a hard-to-cancel policy. Alternatively, capitalizing on user commitment, the seller may silently expand the service's scope or prolong the subscription, with no emails or notifications reminding customers that they are now automatically charged (or that new terms have been added to the agreement).
All of this becomes clear only when the user notices the extra charge on their linked account or attempts to cancel.
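The mechanics of a hidden-cost checkout can be sketched in a few lines. Everything below - the item names, the amounts, and the `shownUpfront` flag - is hypothetical, purely to illustrate the gap between the advertised price and what the user is actually charged:

```typescript
// Hypothetical model of a checkout where some fees are hidden until the end.
interface LineItem {
  label: string;
  amount: number;         // in cents, to avoid floating-point drift
  shownUpfront: boolean;  // visible on the product page, or only at checkout?
}

function checkoutTotal(items: LineItem[]): { advertised: number; actual: number } {
  const advertised = items
    .filter((i) => i.shownUpfront)
    .reduce((sum, i) => sum + i.amount, 0);
  const actual = items.reduce((sum, i) => sum + i.amount, 0);
  return { advertised, actual };
}

const order: LineItem[] = [
  { label: "Streaming plan (monthly)", amount: 999, shownUpfront: true },
  { label: "Service fee", amount: 250, shownUpfront: false },
  { label: "Auto-renewal surcharge", amount: 150, shownUpfront: false },
];

const { advertised, actual } = checkoutTotal(order);
console.log(`Advertised: $${(advertised / 100).toFixed(2)}, charged: $${(actual / 100).toFixed(2)}`);
// Advertised: $9.99, charged: $13.99
```

A transparent design would make `advertised` and `actual` identical - every fee shown upfront, before the user commits.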
These patterns are sometimes used to drive attention to the brand's social media, where special promo codes waiving the fee are offered. In that case, the code should be clearly described, with all the context required to understand the updated offer conditions - so that users are not misled into purchasing a one-month discount when they thought they were buying a full-year offer.
Nagging and obstruction
By diverting the user's attention away from their intended goal and toward the seller's, nagging and obstruction keep the user from using the service or product freely. The authors of deceptive.design compare these techniques to "a tax that the provider imposes on users who do not want to comply with the provider's wishes." Users are billed not in money but in the time it takes to complete the extra steps - typically scrolling through a long list of permissions, or clicking away recurring prompts while updating software or profiles.
The end goal is to exhaust the user so they will comply with future requests with no hesitation - or even no reading.
Preselection
Another dark pattern that capitalizes on the previously mentioned default effect is preselection, which pushes people to go with the option already chosen for them, even if other choices are available. We prefer to accept pre-made choices rather than take action to undo them.
This is why, when scrolling through the available variants, consumers frequently do not change the pre-ticked checkbox, particularly if it points to the option in the middle of the spectrum or is marked as the "most popular" or "best offer."
The same rule applies when "logical" add-ons, such as a warranty for the selected item, land in the user's basket by default (they can still be removed manually). Or when a customer, asked to accept the terms and conditions during checkout, ticks the "check all" box out of convenience - thereby consenting to receive promotional materials, be added to marketing lists, or even have their data shared with third parties.
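Preselection is also one of the easiest patterns to audit for. A minimal sketch, using a simplified checkbox model (the labels and the `isConsent` flag are hypothetical, for illustration only): any consent-granting option that arrives pre-checked is a red flag.

```typescript
// Simplified model of a form checkbox as first rendered to the user.
interface Checkbox {
  label: string;
  checked: boolean;   // initial state shown to the user
  isConsent: boolean; // grants consent (marketing, data sharing, ...)
}

// Returns the labels of consent checkboxes that rely on preselection.
function findPreselectedConsent(form: Checkbox[]): string[] {
  return form
    .filter((box) => box.isConsent && box.checked)
    .map((box) => box.label);
}

const checkout: Checkbox[] = [
  { label: "Add 2-year warranty", checked: true, isConsent: false },
  { label: "Subscribe to newsletter", checked: true, isConsent: true },
  { label: "Share my data with partners", checked: false, isConsent: true },
];

// Logs the one pre-checked consent box: "Subscribe to newsletter".
console.log(findPreselectedConsent(checkout));
```

The ethical default is the inverse: consent options start unchecked, and the user opts in deliberately.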
Visual interference and trick wording
Your website and mobile app should present information logically and consistently, ideally with accessibility in mind. Unfortunately, a lot of things might make using your service in the online world challenging.
Small, low-contrast text can influence how (and whether) people perceive the content. The same goes for trick wording: since most users scan rather than read, the pattern exploits this by giving the impression that a piece of content says one thing when, in reality, it says something less advantageous. Users' expectations are also subverted when essential information appears in unexpected ways or places - a confusing layout or navigation, for instance, can dupe them into clicking a button they never intended to.
All of the mentioned factors make it easier to conceal, obfuscate, or disguise information, which makes for a highly misleading UX design and a terrible brand experience.
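Low contrast, at least, is measurable: WCAG 2.1 defines a contrast ratio between text and background colors, and AA conformance requires at least 4.5:1 for normal body text. A small sketch of the check (the sample colors are arbitrary):

```typescript
// WCAG 2.1 relative luminance of an sRGB color (channels 0-255).
function luminance(r: number, g: number, b: number): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colors: 1 (identical) up to 21 (black on white).
function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const l1 = luminance(...fg);
  const l2 = luminance(...bg);
  const [hi, lo] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05);
}

// Light gray text on a white background - the classic "fine print" trick.
const ratio = contrastRatio([170, 170, 170], [255, 255, 255]);
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA" : "fails AA");
// → "2.32 fails AA"
```

Running this kind of check over your palette is a quick, objective way to catch text that is technically present but practically invisible.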
The hidden dangers of dark patterns: the consequences of leaving them be
As you skimmed through the list, you probably recognized several deceptive (or dark) patterns from everyday web use. You may even ask yourself: if they are so common, why should your business bother avoiding them?
The reason is simple: by using them, you're breaking your audience's trust.
And the less trust is left, the more severe the consequences for your brand's reputation. Actively using dark patterns can decrease user engagement and conversion rates, as customers become less likely to use your website or app. Worse, users who feel deceived or manipulated are more likely to leave negative reviews, pass their bad experiences on to others, and even boycott your brand entirely.
You should also remember that these practices could expose you to lawsuits, as more countries worldwide incorporate dark pattern prevention rules into their legislation. And if you need more prominent examples, cases related to dark UX are regularly collected in the site's Hall of Shame tab, which currently contains nearly 450 lawsuits from around the world.
How to avoid all that: protect your brand reputation
As a business owner, investing in good UX design can help protect your brand reputation and increase user engagement and conversion rates. By prioritizing user experience and avoiding dark patterns, you can build trust in your brand, creating its positive image.
A well-designed UX strategy should include in-depth user research to understand your audience's behavior and needs and focus on delivering human-centered solutions that prioritize usability and accessibility over revenue. Testing your website or mobile application with real users allows you to identify areas where users may feel confused, misled, or frustrated. This feedback can help you adjust your product or service development strategy to avoid leaning into dark UX practices.
Your final product - web or app - should provide clear and transparent information, make it easy for users to complete their desired actions, and avoid any tactics that might be deceptive or manipulative. It's also important to be transparent about any fees or commitments involved in using your service or purchasing your product. As a rule of thumb, if there are any, include them in the short copy on the product/service page - not hidden in the terms and conditions!
At Codete, we deliver user-centered design solutions that prioritize usability and accessibility. We conduct user research and use data-driven insights to avoid using dark patterns in your UX design. Each solution is thoroughly tested with real members of your target audience to ensure that the website or app meets their business objectives while providing a positive user experience.
If you need similar services, message us via the contact form or visit our Dribbble profile.