It’s happened to all of us. You’re browsing online and a pop-up appears, blocking the entire screen and urging you to enter your email in exchange for 10% off. You look for the “X” to close the window, but there isn’t one. Instead, the only way you can resume browsing is to cough up your personal information or click a tiny hyperlink that says, “No thanks, I don’t like saving money.”
These are dark patterns. Regulators at the Federal Trade Commission (“FTC”) are watching them, and like Queen Victoria before them, they are not amused.
Coined in 2010, the phrase “dark patterns” refers to a user interface designed to trick or manipulate users into making a particular decision, one that causes harm or is born of confusion. As an agency charged with preventing unfair and deceptive acts and practices, the FTC is understandably concerned about the widespread use of dark patterns.
In that vein, the FTC published a staff report titled “Bringing Dark Patterns to Light” in September 2022. The report describes in granular detail what dark patterns look like in practice, dividing them into eight primary categories and giving concrete examples of each. Whether you’re operating a business with an online presence or underwriting one as a merchant acquirer, it makes sense to understand how regulators define these “dark patterns” and to avoid implementing devices that, rightly or wrongly, may be perceived as contributing to consumer confusion.
The first dark pattern on the FTC’s list is that of endorsements or social proof. Some unsurprising examples include fake celebrity endorsements or glowing customer testimonials that omit key information, such as the fact that the reviewer owns the company or has been paid to post. The FTC has also identified some less intuitive variants, such as false activity messages (“39 other people are viewing this item”) and “parasocial relationship pressure.” The latter refers to the use of familiar characters that children know and trust to push them toward making in-app purchases.
A second dark pattern is scarcity, which includes false messages about low supply and high demand. For example:
- “Only two left in stock — order now,” when stock is plentiful.
- “28 people have added this to their cart,” when only two people have.
The third dark pattern named in the FTC’s report is urgency, a phenomenon marked by false countdown timers or “limited time only” messages.
Fourth is obstruction, a pattern embodied in practices like making cancellation of an order or subscription difficult. Sometimes obstruction manifests in the creation of “immortal accounts” that are difficult or impossible to delete.
Fifth is the transparently named sneaking or information hiding. Examples include automatically adding items to a cart without the user’s permission, hiding material information in fine print or behind hyperlinks, or obscuring costs and fees only to surprise the consumer with them at checkout. Another variant of “sneaking” seizes on a psychological phenomenon with which casinos are well acquainted: requiring consumers to buy things with a virtual currency or tokens, which obscures the real cost in dollars.
Sixth, the FTC’s report calls out interface interference. Variants include creating a “false hierarchy,” for example, by presenting “keep my subscription” in large letters and “cancel my subscription” in small, grayed-out font below. Other variants include disguised ad content or even the old bait and switch, such as when clicking the “X” in the top right corner of a pop-up ad takes the user to the advertised page rather than closing the window.
The seventh dark pattern is coerced action. Variants include nagging (e.g., repeated requests for an email address), forced enrollment, and “grinding.” Grinding is the practice of making the free version of an app so cumbersome that the user is compelled to unlock new features with in-app purchases.
The final dark pattern on the FTC’s list is asymmetric choice. Variants include confusing or trick questions, often featuring double negatives; for instance, “uncheck the box if you are unopposed to not receiving emails.” Another iteration of asymmetric choice is “confirm shaming,” in which customers can opt out of providing personal information or initiating a subscription online only by selecting an option that says, “I don’t like saving” or “No thanks, I’d rather pay full price.”
Dark patterns are ubiquitous across the internet. It bears mention, however, that not all of the messages described above are necessarily “dark” if the information presented is true. If stock really is low, the sale really does end at midnight, or users really are piling items into their carts, then it is not unfair or deceptive to say so. When these announcements are false, however, or are used to confuse or mislead, they may run afoul of the law and garner the FTC’s attention.