Response to Question 1: Defining Dark Patterns
The term “dark pattern” was initially coined by London-based designer Harry Brignull in 2010 to bring attention to deceptive user experience (UX) design techniques.1See e.g., Arunesh Mathur, et al., Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites, 3 Proceedings of the ACM on Human-Computer Interaction No. 81 at 4 (Nov. 2019) [hereinafter “Mathur et al. (2019)”], https://bit.ly/3bHzVBy; and see Arunesh Mathur, Jonathan Mayer, and Mihir Kshirsagar, What Makes a Dark Pattern… Dark?, CHI ‘21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems No. 360 at 2 (May 2021) [hereinafter “Mathur, Mayer, and Kshirsagar (2021)”], https://bit.ly/3fAMsI2. At the same time, Brignull launched Darkpatterns.org in 2010 to name and shame websites engaged in these practices.2Brignull identified 12 variations of what he called dark patterns. A list of those 12 variations, as they appeared when the site launched, is available here: https://bit.ly/2SUUiEG. As initially conceived and executed, this project aimed to use the power of crowds to advocate for ethical and responsible design.
The term has since attracted attention in other domains and now serves as a dominant framing for a range of problematic social media designs. Even in its original form, “dark pattern” referred to a class of actions, not a specific bad act or practice. It has many meanings and, depending on the context, can refer to any number of online techniques.
For example, Mathur et al. (2019) define dark patterns as UX designs that are “coercing, steering, or deceiving.”3Mathur et al. (2019) at 2. This definition stresses the relationship between dark patterns and users, ignoring the role of the UX designer. Similarly, Philip Hausner and Michael Gertz (2021) define dark patterns as “interface designs that nudge users towards behavior that is against their best interests.”4Philip Hausner & Michael Gertz, Dark Patterns in the Interaction with Cookie Banners, ResearchGate (Mar. 2021) [hereinafter “Hausner & Gertz (2021)”], https://bit.ly/3fANlQS.
On the other hand, Luguri and Strahilevitz (2021) claim dark patterns are “user interfaces whose designers knowingly confuse users, make it difficult for users to express their actual preferences, or manipulate users into taking certain actions.”5Jamie Luguri & Lior Strahilevitz, Shining a Light on Dark Patterns, 13 J. of Legal Analysis 43 (revised Mar. 29, 2021) [hereinafter “Luguri and Strahilevitz (2021)”], https://bit.ly/3oxHMXE. Luguri and Strahilevitz’s definition of dark patterns focuses on the intention behind the designs. A design choice that only accidentally harms the user would not be considered a dark pattern. Other definitions hinge on terms such as manipulation6Gregory Day and Abbey Stemler, Are Dark Patterns Anticompetitive?, 72 Ala. L. Rev. 1 (Jan. 1, 2020). and nudging,7Hausner & Gertz (2021). both of which are vague.
Dark patterns were first written into law in 2020 with the passage of the California Privacy Rights Act (CPRA),8California Privacy Rights Act of 2020, codified Cal Civ Code § 1798.140 (Deering 2021) [operative Jan. 1, 2023]. which amended the CCPA.9Id. The CPRA, which takes effect in 2023, defines dark patterns as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decisionmaking, or choice, as further defined by regulation.”10Id. §1798.140 (l). The CPRA definition is narrow in effect, similar to that proposed by Luguri and Strahilevitz, because it requires both a dark UX design and a “substantial effect” on “user autonomy, decisionmaking, or choice.”11Id. By adding the substantial-effect element, the CPRA avoids sweeping in harmless design choices meant to improve the aesthetics of the user experience, focusing instead only on UX designs that cause harm.
Federal bills have taken a different tack. The Deceptive Experiences To Online Users Reduction Act (DETOUR Act), introduced in the House of Representatives of the 116th Congress, would have given the FTC authority to regulate specific classes of dark patterns used to subvert user autonomy and informed consent. Although the bill would have outlawed standard A/B testing used to improve user experience,12Deceptive Experiences To Online Users Reduction Act, H.R.8975, 116th Cong. 2d. (2020) (died in committee)(“(B) to conduct a behavioral or psychological experiment, research, or study of users of an online service, except with the informed consent of each user involved.”). it was narrowly tailored in the sense that it included the substantial effect of dark patterns in its definition of problematic UX design.13Id. (“(A) to design, modify, or manipulate a user interface of an online service with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision making, or choice to obtain consent or user data;”).
The key element driving legislative definitions of dark patterns is the subversion of user autonomy and choice. Similar to the definition proposed by Luguri and Strahilevitz (2021), a UX design is not dark simply because it influences users. A dark pattern must also have the effect of subverting user autonomy. At a minimum, the threshold for defining a dark pattern must be (1) clear nefarious intent to mislead reasonable consumers; and (2) a reasonable likelihood of achieving some harm to a reasonable consumer to the benefit of the bad actor.
Therefore, our tentative definition of a dark pattern, rooted in existing federal law, is any act or practice that:
(1) is designed to create a false impression or conceal material facts with the intention to mislead a consumer;
(2) a reasonable consumer14C.f. FTC, 1983 Policy Statement on Deception at 1 (Oct. 14, 1983) [hereinafter “1983 Deception Statement”], https://bit.ly/3v1uPb0 (“If the representation or practice affects or is directed primarily to a particular group, the Commission examines reasonableness from the perspective of that group.”). is likely to be misled by, or actually is misled by; and
(3) the dark pattern benefits a bad actor or group of bad actors to the detriment of a consumer, meaning it furthered the intended fraud and the bad actor obtained anything of value.15C.f. FTC, Enforcement Policy Statement on Deceptively Formatted Advertisements (Dec. 22, 2015), https://bit.ly/3fCSY15; and c.f. Wolters Kluwer Bouvier Law Dictionary Desk Edition, Fraud (Fraudulent or Defraud) (Lexis 2021).
Response to Questions 3 and 5: Incentives and Prevalence of Dark Patterns
A definition rooted in deception and fraud, a firm legal hook, is far narrower in scope than the expansive definitions described above. Some behaviors called out as dark patterns by others are not included here.16 E.g., nagging is neither deception nor fraud, nor is it a dark pattern under our proposed definition; but see Allison Hung, Keeping Consumers in the Dark: Addressing “Nagging” Concerns and Injury, Draft at 28 (Mar. 12, 2021) (“Any action brought [under the FTC’s unfairness authority] against a company using nagging to coerce consumers will need to allege a specific consumer injury, usually financial.”), https://bit.ly/3yqRzmS. Gray et al. (2018),17Colin M. Gray, Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs, The Dark (Patterns) Side of UX Design, UXP2 [hereinafter “Gray, et al.”] (accessed May 20, 2021), https://bit.ly/3bJjaWx. for example, claimed that toying with user emotions via website copy and cute design should be designated a dark pattern. Clearly, enforcement agencies shouldn’t be involved in policing bad aesthetic choices.18Gray, et al. don’t deny the possibility of crimes of fashion. Rather, a distinction needs to be made between dark patterns, which warrant action, and the broader category of bad UX design. Bad UX design, which also includes nudges, shouldn’t be the province of the FTC and other agencies.
Theory is ambiguous as to how likely companies are to use dark patterns in their processes. Although dark patterns are often analogized to effluent, they differ from pollution in one critical way.19Richard Waters, Critics Raise Alarm Over Big Tech’s Most Powerful Tools, Financial Times (May 16, 2021), https://on.ft.com/3ffrrDH. Pollution is the classic example of an externality: its costs typically fall on parties outside the initial transaction. A plant, for example, might produce steel for use in a building while dumping slag into a nearby river. Farmers downriver bear the cost of the pollution, the externality, not the building contractors or the steel plant, the two initial parties.
Dark patterns, on the other hand, have costs that are internalized to the interaction between the two initial parties, the user and the platform. If a dark pattern has an impact, its costs are borne, or internalized, by the platform itself. Dark patterns affect the interactions between the user and the operator, not some third party. Incentives therefore suggest that designers and website operators aiming for sustained engagement will shy away from the methods some classify as dark patterns. The pressure of these incentives varies, however. For companies interested only in one-off interactions (perhaps the website sells something fashionable that is expected to fade in popularity over time), the incentive to engage in dark patterns does increase.
Professionals in UX stress the importance of good design:
Using dark patterns is a poor strategy for increasing your website’s conversion rate. Yes, you may be able to generate some sales with them but you’ll be hurting the reputation of your business in the long-term. When customers feel like they’ve been tricked or manipulated, they won’t shop with you again. Your customer lifetime value will fall and there’s a good chance those upset customers will tell their friends about their negative experience.20Rudy Klobas, Dark Patterns: The Ultimate Conversion Blocker for Ecommerce Websites, The Good (Apr. 13, 2021), https://bit.ly/3yAJ0pD.
Recent work from CGO’s Experimental Economics Laboratory further highlights the importance of maintaining high quality experiences online. In a voting game coupled with online information sharing, researchers found that user engagement on social media is significantly lower when misinformation is permitted. Users posted less and interacted with fewer people when misinformation was allowed to be shared. Among other results, a key finding of this research is that bad experiences are internalized by the platform.
On the other hand, shopping sites and other online experiences not concerned with long-term engagement face more nuanced incentives. If a site is focused on repeated sales, then the experience will be ruled by an expectation of future interactions, what game theorists call the shadow of the future.21Peter R. Blake, David G. Rand, Dustin Tingley & Felix Warneken, The Shadow of the Future Promotes Cooperation in a Repeated Prisoner’s Dilemma for Children, Nature (Sept. 29, 2015), https://go.nature.com/3fBW0mc. However, as a site becomes dominated by one-off interactions, the likelihood that it will embed dark patterns increases.22Id.
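The shadow-of-the-future logic can be made concrete with a toy repeated-game calculation. The payoff numbers below are purely illustrative assumptions, not drawn from the cited study: a platform weighs the stream of honest repeat business against a one-time gain from a dark pattern followed by lost goodwill.

```python
# Illustrative sketch of the "shadow of the future" in a repeated game.
# All payoffs are hypothetical; this is not a model from the cited research.

def honesty_sustained(honest_payoff: float, trick_payoff: float,
                      punish_payoff: float, delta: float) -> bool:
    """True if honest design is sustainable given continuation probability delta.

    Honesty yields honest_payoff every period. Deviating yields trick_payoff
    once, then punish_payoff forever after (burned users stop returning).
    """
    honest_value = honest_payoff / (1 - delta)
    deviate_value = trick_payoff + delta * punish_payoff / (1 - delta)
    return honest_value >= deviate_value

# A site expecting repeat visits (high delta) stays honest...
print(honesty_sustained(3, 5, 1, delta=0.9))   # True
# ...while a one-off seller (low delta) has little to lose by tricking users.
print(honesty_sustained(3, 5, 1, delta=0.1))   # False
```

The design choice this illustrates is the one in the text: the same dark pattern that is irrational for a platform built on repeat engagement becomes tempting as interactions shift toward one-offs.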
A large-scale study of shopping websites from Mathur et al. helps establish baselines for dark patterns.23See Mathur et al. (2019). Their analysis of 53,000 product pages from 11,000 shopping websites uncovered 1,818 instances of dark patterns on 1,254 websites, “together representing 15 types and 7 broader categories.” The most common dark patterns, under their definition of the term, included low stock messages (632), countdown timers (393), and activity messages like site visits (264). Assuming the sites are being truthful about low stock or visit counts, none of these behaviors warrants enforcement, yet together they constitute the bulk of dark patterns at over 70 percent of total instances.
The findings from Mathur et al. confirm the incentive theory suggested above: worrying design choices are fairly uncommon. Sneaking additional products into a basket (7), hiding costs until the end of a purchase (5), and making it difficult to cancel a product (31) best fit our definition of dark patterns, but all are rare occurrences. Combined, these dark patterns occurred on 43 of 11,000 websites, or 0.4 percent. The likelihood that any single user will experience a dark pattern is probably even lower, given that a small number of sites, like Amazon and Walmart, account for a large portion of all online shopping interactions.
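The shares above follow directly from the counts Mathur et al. (2019) report; a short script makes the arithmetic explicit. The numbers are quoted from the study as cited in the text; the category labels are our shorthand.

```python
# Counts quoted from Mathur et al. (2019); labels are our shorthand.
TOTAL_INSTANCES = 1818   # dark pattern instances found
TOTAL_SITES = 11_000     # shopping websites crawled

# Most common pattern types -- benign if the underlying claims are truthful.
common = {"low stock messages": 632, "countdown timers": 393, "activity messages": 264}

# Pattern types that fit the deception-based definition proposed above.
deceptive = {"sneaking into basket": 7, "hidden costs": 5, "hard to cancel": 31}

common_share = sum(common.values()) / TOTAL_INSTANCES
deceptive_sites = sum(deceptive.values())
site_share = deceptive_sites / TOTAL_SITES

print(f"Benign patterns: {common_share:.1%} of all instances")            # ~70.9%
print(f"Deceptive patterns: {deceptive_sites} sites, or {site_share:.2%}")  # 43 sites, or 0.39%
```

This confirms the figures in the text: the three benign pattern types make up just over 70 percent of instances, while the genuinely deceptive ones appear on roughly 0.4 percent of sites.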
Response to Question 9: Next Steps for the FTC
Dark patterns are concerning because they tie into deception and fraud, two areas the FTC already enforces. As Mathur, Mayer, and Kshirsagar confirmed, “they omit material information or rely on false information [and] can run afoul of well-established consumer protection laws prohibiting such practices.”24Mathur, Mayer, and Kshirsagar (2021) at 17. Gus Hurwitz similarly writes, “most of these patterns involve making representations or engaging in practices that are designed to deceive consumers. Such conduct is covered by Section 5 of the FTC Act’s prohibition against unfair and deceptive acts and practices.”25Justin (Gus) Hurwitz, Designing a Pattern, Darkly, 22 N.C. J. L. & Tech. 57, 95 (2020) (citing 1983 Deception Statement, supra n. 14 at 5).
Instead of defining more acts and practices that strike a researcher as dark or malicious based on definitions coined by others, what is needed is a definition with a firm legal foundation.26Mathur, Mayer, and Kshirsagar (2021). Much of the conduct cited by researchers is already covered by the FTC’s unfair and deceptive practices standards, legal definitions of fraud, and the Computer Fraud and Abuse Act. Dark patterns are an extension of already illegal conduct online. As such, a clear understanding of dark patterns would combine these legal standards into one succinct definition that excludes practices considered neutral or positive to consumers.
We commend the Commission for its efforts to explore the severity and implications of dark patterns. Still, we caution the Commission against interpreting dark patterns expansively and enforcing on that basis. As it stands, most scholarship defines the term too expansively, encompassing too many UX designs that are simply not illegal or even worrisome. Nevertheless, the term is a useful shorthand for online conduct that should be policed, especially since the agency already has the authority to act against conduct that is designed to create a false impression or conceal material facts with the intention to mislead a consumer.
To follow up, the FTC should support further research into this phenomenon. The Commission’s own Working Paper series has seen only a trickle of output in the last decade. As the regulatory body best suited to regulate dark patterns, the FTC should start with the facts, which it needs to collate and understand.
Thank you for the opportunity to submit these comments on the Commission’s efforts to combat dark patterns online. Please contact William Rinehart at [email protected] for additional information on the work our researchers and scholars perform addressing this issue.
William Rinehart, Senior Research Fellow
Caden Rosenbaum, esq., Technology and Innovation Associate
Amanda Ortega, Graduate Research Fellow