What used to be “just bad UX” is now officially a legal risk.

Dark patterns—those deceptive design choices that nudge people into taking actions they might not freely choose—are now banned across the European Union. The Digital Services Act (DSA), effective since 2024, explicitly prohibits a range of these tactics. And it is not alone. The GDPR, AI Act, Digital Markets Act (DMA), and Consumer Rights Directive are all reinforcing one principle: your design must respect user autonomy.

If you design digital products in or for the EU, here is what that means, and what needs to change.


Common Dark Patterns Now Banned

Let’s take a closer look at some patterns that are no longer allowed, with examples you may recognize from everyday products.

1. Fake Urgency

Example: “Only 2 seats left!” or “Offer expires in 30 seconds”
Any alert that creates false urgency—like countdowns not tied to real inventory—is now considered manipulative. The DSA requires that urgency must reflect actual availability, not serve as psychological pressure.
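In practice, this means scarcity messaging should be derived from live data rather than hard-coded. A minimal sketch of that rule, using a hypothetical stock record and threshold:

```typescript
// Sketch: only surface scarcity messaging when it reflects real data.
// `Stock` and `LOW_STOCK_THRESHOLD` are illustrative assumptions.

type Stock = { available: number };

const LOW_STOCK_THRESHOLD = 5;

// Returns an urgency label only when stock is genuinely low;
// otherwise returns null -- never a fabricated countdown.
function urgencyLabel(stock: Stock): string | null {
  if (stock.available > 0 && stock.available <= LOW_STOCK_THRESHOLD) {
    return `Only ${stock.available} left in stock`;
  }
  return null; // plentiful (or sold out): show no pressure banner
}
```

The key design choice is that the UI layer has no way to display urgency that the data layer cannot substantiate.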

2. Tricky Opt-Outs

Example: Signing up takes one click, but cancelling means navigating multiple steps and emails
Known as the “roach motel,” this pattern makes it easy to enter a service but frustrating to leave. EU law now requires that leaving a service must be as easy as joining, without barriers or buried options.
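One way to make this testable in a UX audit is to compare the step counts of the sign-up and cancellation flows directly. A rough sketch, with hypothetical flow definitions:

```typescript
// Sketch: a simple symmetry check for join vs. leave flows.
// The flow shapes here are illustrative, not a real audit API.

type Flow = { name: string; steps: string[] };

// Flags a "roach motel": cancelling takes more steps than joining.
function isRoachMotel(signup: Flow, cancel: Flow): boolean {
  return cancel.steps.length > signup.steps.length;
}
```

A flow that signs users up in two steps but requires four to cancel would be flagged; this is of course a crude proxy, since buried links and forced emails matter as much as raw step counts.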

3. Pre-Ticked Boxes

Example: A checkbox for “Partner offers” is selected by default
Under GDPR, consent must be clear and freely given. That means no pre-checked boxes. Users must make an active choice.
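A compliant consent model can encode this rule in its defaults: every optional purpose starts as false and only becomes true through an explicit user action. A minimal sketch, with illustrative purpose names:

```typescript
// Sketch of a GDPR-style consent model: optional purposes default to
// false and can only be enabled by an explicit user action.
// The purpose names are illustrative assumptions.

type ConsentPurpose = "necessary" | "analytics" | "partnerOffers";

type ConsentState = Record<ConsentPurpose, boolean>;

// Compliant default: nothing optional is pre-ticked.
// (Strictly necessary processing does not rely on consent.)
function defaultConsent(): ConsentState {
  return { necessary: true, analytics: false, partnerOffers: false };
}

// Called only from an explicit opt-in interaction, never on page load.
function grantConsent(state: ConsentState, purpose: ConsentPurpose): ConsentState {
  const next = { ...state };
  next[purpose] = true;
  return next;
}
```

Keeping the defaults in one place makes it easy to audit: if `defaultConsent()` ever returns true for an optional purpose, the UI is pre-ticking a box.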

4. Buried Key Information

Example: Details on pricing after a trial are hidden in footnotes
Important terms—especially those that affect cost or renewal—must now be visible upfront. Hiding them in small print is no longer acceptable.

5. Emotional CTAs

Example: “Still struggling to land clients? Let’s talk.”
When calls to action are designed to play on fear, doubt, or other vulnerabilities, they may now fall under emotional manipulation. The AI Act and DSA target these subtle forms of pressure.

6. Guilt-Tripping UI

Example: “Are you sure you want to cancel? We’ll be heartbroken.”
Messages that rely on guilt or shame to influence users are now under scrutiny. Designs must support informed, voluntary decisions—not emotional coercion.

7. Disguised Ads

Example: A blog post that looks like editorial content but is actually paid promotion
The DSA now requires all advertising to be clearly labeled. Disguising promotions as content puts you at risk.

8. Confusing Navigation

Example: Account deletion is hidden under “Help” → “Settings” → “Other”
Making essential actions hard to find—especially those related to privacy, payment, or cancellation—is now a compliance issue. Navigation must be intuitive and fair.

Other Laws Reinforcing the Ban on Dark Patterns

Several EU regulations reinforce these principles:

  • GDPR: Prohibits forced or unclear consent mechanisms, including pre-ticked boxes and hard-to-find opt-outs.
  • AI Act: Bans emotional profiling and manipulation, especially when it targets vulnerable groups.
  • DMA: Focuses on dominant platforms and bans self-preferencing, default-locking, and other exploitative tactics.
  • Consumer Rights Directive: Requires clear disclosures on pricing, subscriptions, refunds, and cancellation processes.

What You Should Do Now

If you work in design, product, growth, or legal, now is the time to audit your UX. Even small, legacy elements of your product—like how you ask for an email, or how users cancel a plan—can trigger legal exposure under the new rules.

Need a Compliance Check? 

We help organizations assess and improve their digital experiences in line with EU regulations. From the DSA to the GDPR, our audits highlight risks and suggest practical improvements—so your teams can deliver compliant, respectful, and user-friendly products. Discover our services and contact us with your questions; we'll be pleased to help!