The Regulatory and Ethical Landscape of Dark Patterns in E-Commerce | MetaBoard
The use of Dark Patterns has become a disturbingly common practice among e-commerce platforms. In response, the Central Consumer Protection Authority [“CCPA”] formulated the Guidelines for Prevention and Regulation of Dark Patterns, 2023 [“Guidelines”]. This blog explores the concept of Dark Patterns and offers a detailed analysis of these Guidelines.
The phenomenon of “Dark Patterns” within e-commerce applications and websites constitutes a significant challenge to consumer autonomy and ethical trade practices. These patterns are characterised as malicious and deceptive user interface [“UI”] and user experience [“UX”] practices on e-commerce platforms, deliberately designed to manipulate users into making decisions they would not typically make. Such decisions usually work to the detriment of users and to the benefit of the companies that deploy these patterns.
These patterns use deception to undermine, impair, or interfere with a consumer’s decision-making or choices, thereby amounting to misleading advertising, unfair trade practices, or violations of established consumer rights.
WHAT EXACTLY ARE DARK PATTERNS?
Globally, there is a growing consensus on defining and prohibiting these techniques. A foundational explanation was provided by researchers and subsequently highlighted in journalistic reports. For example, Sara Morrison’s well-known analysis for Vox described a dark pattern as a website design intended to subtly manipulate or heavily influence users into making unintended choices.
In her analysis, she also cited application pop-ups, such as Instagram’s, that request permission from users to create a ‘more personalised experience’ by tracking cross-device activity. This design language, while framed as a user benefit, masks a broad data-tracking mechanism [tracking and targeting a user’s activity across their devices] that extends beyond the user’s immediate intent.
An image of such a pop-up from Instagram is attached below:
The gravity of dark pattern use is further highlighted by significant enforcement actions taken against major technology firms. These cases serve as clear precedents demonstrating regulatory efforts to penalise manipulative design practices.
A notable example from 2023 is the U.S. Federal Trade Commission [“FTC”] initiating action against Amazon. The FTC claimed that Amazon used various dark patterns to deceptively sign consumers up for its auto-renewing Amazon Prime subscription. The complaint alleged that Amazon intentionally complicated the process of purchasing without a Prime membership, used misleading language to trick consumers into subscribing during checkout, and, critically, established a deliberately complex and challenging cancellation process. This case specifically addresses the dark patterns of Forced Continuity and Obstructing Opt-Out.
Preceding this, in 2022, the issue of privacy dark patterns was brought to light by the Google lawsuit concerning alleged deceptive location tracking practices. This lawsuit claimed that Google kept tracking user locations on Android devices even after users disabled location services. It further contended that the privacy settings were deliberately made hard to find and configure, demonstrating an example of Interface Interference intended to undermine user consent and privacy preferences.
Under the 2023 Guidelines, ‘dark patterns’ are explicitly defined as “any practices or deceptive design pattern using user interface or user experience interactions on any platform that is designed to mislead or trick users to do something they originally did not intend or want to do, by subverting or impairing the consumer autonomy, decision making or choice, amounting to misleading advertisement or unfair trade practice or violation of consumer rights.”
This definition breaks dark patterns into three parts:
i. any practices or deceptive design patterns [using UI or UX interactions] created to mislead or trick users into performing actions they did not initially intend or desire;
ii. by undermining or impairing the consumer’s autonomy, decision-making capabilities, or informed choice;
iii. thereby constituting misleading advertising, unfair trade practice, or a violation of established consumer rights.
This comprehensive definition describes dark patterns as intentional, deceptive design tactics that manipulate users into performing actions and making decisions they would not otherwise make. It then links such unscrupulous design tactics to existing legal violations by classifying them as misleading advertisements, unfair trade practices, or other breaches of consumer rights. By tying dark patterns to established legal categories, regulators gain a clear and actionable framework for enforcement, shifting the burden onto businesses to ensure that their digital interfaces are based on the principles of transparency and fairness.
THE 13 DARK PATTERNS SPECIFIED UNDER THE GUIDELINES
1. False Urgency
Falsely stating or implying a sense of scarcity or urgency to mislead a user into making an immediate purchase.
Example: A website displays, “Only 2 seats left at this price!” when, in reality, the price and seat availability remain unchanged for a long period.
2. Basket Sneaking
Automatically inserting extra charges, products, or optional services into a consumer’s shopping cart or final order summary without their explicit consent or agreement, which leads to a higher final payment amount.
Example: A customer adds a t-shirt to their cart, and the system automatically adds a ₹50 donation to a charity that the user must manually remove.
3. Confirm Shaming
Using emotionally manipulative language, visuals, or tone intended to induce feelings of guilt, shame, or ridicule in users, so as to coerce or “nudge” them into proceeding with a purchase or agreeing to a request, or to deter them from selecting a preferred opt-out choice.
Example: When declining a discount offer, the opt-out button reads, “No thanks, I prefer paying full price” or “No, I don’t care about saving money.”
4. Forced Action
Forcing a consumer to take an action that requires them to purchase additional, unrelated goods, subscribe to an unrelated service, or share personal information to complete a desired transaction.
Example: A user must sign up for a weekly email newsletter before they can proceed to download a free e-book or view the price of a product.
5. Subscription Trap
Designing a complex, lengthy, or confusing cancellation process for a paid subscription. This often includes automatically imposing recurring charges after a ‘free’ trial period ends without the user’s explicit, clear consent for renewal.
Example: A user can sign up for a service in one click, but to cancel, they must navigate five different menus and then call customer service during limited hours.
6. Interface Interference
Manipulating design elements to emphasise certain information while simultaneously obscuring or reducing the prominence of other crucial, relevant details, so as to misdirect the user away from their preferred or intended action.
Example: The “Accept All Cookies” button is large, bright green, and immediately visible, while the “Reject All” or “Manage Preferences” link is grey, tiny, and hidden below the fold.
7. Bait and Switch
Advertising a particular result based on the customer’s choices, only to deliberately deliver a different, lower-quality, or pricier alternative after they commit to the action.
Example: A user clicks on an ad for a cheap flight to Mumbai, but when they land on the booking page, that cheap flight is unavailable, and only significantly more expensive options are shown.
8. Drip Pricing
Concealing mandatory charges upfront and only revealing them late in the transaction, leading to a higher final price.
Example: A ticket platform displays a ticket price of ₹2,000 until the payment page, where the final price is displayed as ₹2,300 because a mandatory ₹300 ‘Platform Fee’ is suddenly added.
9. Disguised Advertisement
Presenting advertisements as another form of content altogether [organic, non-sponsored material], such as user reviews, regular news stories, or standard articles. This is done to deceive users into believing the content is impartial or editorially independent, prompting them to click or engage.
Example: An article promoting a specific health supplement uses the same font and layout as the site’s independent news stories, lacking any visible “AD”, “Advertisement”, or “Sponsored” label to disclose it is paid content. The user clicks, believing it is an unbiased news report, not a promotion.
10. Nagging
Disrupting or annoying users with repetitive and persistent requests, information, or notifications unrelated to the intended purchase, in order to effectuate a commercial gain.
Example: A user repeatedly closes a prompt to upgrade to a premium account, but the pop-up reappears every 30 seconds or after every page click, regardless of their current activity.
11. Trick Question
Intentionally using ambiguous and/or unclear wording [such as double negatives or mixing up “do not” and “uncheck”] to confuse the user, thereby steering them into taking an unintended action.
Example: A checkbox reads: “I accept the terms and conditions and agree to not opt out of future marketing.” Users focusing on accepting the terms may fail to notice they are also giving away their right to opt out of marketing later.
12. SaaS Billing
Generating recurring payments in a Software as a Service [“SaaS”] model on the pretext of a subscription or purchase, without the user’s explicit consent for such recurring charges.
Example: A consumer purchases a digital photo-editing software license advertised as a one-time cost of ₹5,000. They complete the transaction, but the platform immediately begins processing a separate, undisclosed ₹500 monthly “Cloud Sync Fee” that the user was not clearly prompted to opt into during the initial purchase.
13. Rogue Malwares
Employing malicious software or deceptive pop-ups that mislead users into believing that their device has been infected with a virus or is otherwise insecure, so as to frighten the user into paying for an entirely fake or useless security program.
Example: A pop-up appears with flashing red text and an urgent tone claiming, “Your device is infected! Click here to download our security software now for just ₹1000,” when no actual threat exists.
WHAT THE 2023 GUIDELINES SIGNIFY
These Guidelines represent a significant and binding intervention by the Indian regulatory framework into the ethics of digital commerce, adopting a zero-tolerance approach to digital manipulation. They apply to all platforms that systematically offer goods or services in India, as well as to advertisers and sellers of such goods and services.
The Guidelines not only provide a comprehensive definition of dark patterns but also specify and explain 13 categories of such patterns [as set out above]. The detailed explanations of these prohibited practices eliminate ambiguity and provide clear standards for digital businesses, including advertisers and sellers.
This framework is legally binding, and its strict implementation is secured by linking these prohibitions to the CCPA’s enforcement powers, which include fines, imprisonment, and mandatory cease-and-desist orders. This represents a crucial step in the global move towards increased design transparency and the protection of consumer dignity in the digital economy. As a result, digital interfaces must now actively promote informed decision-making instead of exploiting cognitive biases for commercial gain.
In line with this, the Guidelines reinforce the principle of consumer autonomy by specifically targeting manipulative practices that exploit ordinary consumers, ensuring that digital design must now prioritise transparency and fairness over commercial exploitation.
In furtherance of these Guidelines, the CCPA established a Joint Working Group in June 2025 to urge e-commerce platforms to comply with the Guidelines. Additionally, in July 2025, the CCPA released an advisory encouraging platforms to perform self-audits to detect and eliminate dark patterns. The Advisory specified that these audits be completed, and the necessary steps for eliminating such practices taken, within three months of its issuance. It also recommended that platforms issue self-declarations based on their audits, stating that they do not engage in dark patterns, in order to ensure a fair digital environment and build trust between consumers and e-commerce platforms.
In addition, the Consumer Protection [E-Commerce] Rules, 2020, prohibit all unfair trade practices by marketplaces. They also require marketplaces to inform users of the fees and other charges payable, the method of payment, and the security thereof; concealing these is prohibited, with violations treated as offences under the Consumer Protection Act, 2019. The Digital Personal Data Protection Act, 2023 [“DPDPA”], with its requirement of free, clear, and informed consent, likewise establishes a powerful mechanism to challenge deceptive opt-in and consent designs.