Amazon’s roach motel

How a dark pattern becomes a billion-dollar liability

Image of a big ol’ roach motel manager, representing Amazon

With the holidays around the corner, I’ve been on an online shopping spree, and while browsing Amazon recently I stumbled upon a disturbing trend. Right beside an item’s price, in bright red, sat a “–25%” discount badge. Nothing particularly weird about that… except the “discounted” price was exactly the same as the price I’d seen a few days earlier. Using a price tracker, you can see that the listing was briefly relisted at a higher price purely to enable the 25% discount badge. The UI is screamin’ that you’re getting a bargain and you’d be silly not to grab it now, while nothing has materially changed except your perception.

Back in the 1800s, shopping was more of a cat-and-mouse game: prices were subjective, set at the individual shopkeeper’s whim. If demand picked up for a particular item, the price could go up or down on the spot. This changed when Quaker merchants, who considered haggling dishonest, adopted a “one price” policy. They began posting prices publicly, which let consumers compare across stores. That created a new kind of equilibrium, and the humble price tag was born.

But in the age of the internet, the virtual shopkeepers have far more information to leverage against you. Enter dynamic pricing, the algorithmic reincarnation of old-school haggling, explicitly designed to discover the highest price at which a sale can still be made. As companies share data and respond automatically to competitor listings, it has become easier than ever to keep prices in lockstep, in favour of businesses rather than consumers. These days it’s not uncommon to find digital price tags updating in real time to match market conditions for optimal profit extraction. We’re back to fluid, opportunistic pricing, but now supercharged by algorithms.
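To make the mechanics concrete, here is a deliberately toy sketch of an automated repricer. Everything in it (the undercut-by-a-penny rule, the price floor, the numbers) is invented for illustration; real retailers run far more elaborate margin-optimization systems, but the loop is the same: watch competitor listings, adjust your own price automatically.

```python
# Toy sketch of algorithmic repricing (rules and numbers invented here,
# not any real retailer's system): undercut the cheapest competitor by a
# penny, but never drop below a cost floor.

def reprice(competitor_prices: list[float], floor: float) -> float:
    """Return a new listing price given the current competitor listings."""
    if not competitor_prices:
        return round(floor * 1.5, 2)   # no competition: take a comfy margin
    target = min(competitor_prices) - 0.01
    return round(max(target, floor), 2)

print(reprice([24.99, 22.50, 23.10], floor=18.00))  # 22.49
print(reprice([17.00], floor=18.00))                # 18.0 (floor holds)
```

Run a rule like this every few minutes across millions of listings, with every seller doing the same, and you get exactly the real-time, lockstep price movement described above.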

At the forefront of this new wild west of pricing and manipulation is Amazon. They use every method at their disposal to coax us into spending a few extra pennies, which, when you have 310 million active users globally, is not too shabby. Dark patterns (design choices that push people toward decisions that benefit the business at the user’s expense) are the tactic I want to explore here.

Amazon leans heavily on a pricing tactic known in consumer-protection circles as “fake sales and discounts”: inflating the “original” price so the current one looks like a bargain. In UX this dark pattern doesn’t have an official name, though it clearly falls under interface interference and misdirection, so for the sake of this article I’m gonna coin it Bezos Bargaining: the illusion of a “deal” created entirely through framing. Amazon presents a higher “list price,” your brain treats it as the baseline, and everything that follows feels like savings, even when nothing is actually being discounted.
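The arithmetic behind the trick is trivial, which is part of what makes it so effective. A hypothetical sketch (the prices and helper below are invented for illustration, not Amazon’s actual logic): to manufacture a “–25%” badge on an item that has always cost 59.99, you only need to post a reference price of 59.99 ÷ 0.75 ≈ 79.99.

```python
# Hypothetical illustration: a "discount" badge manufactured purely by
# raising the reference ("list") price. All names and numbers invented.

def badge(list_price: float, sale_price: float) -> str:
    """Render the percentage-off badge a storefront would display."""
    pct = round((1 - sale_price / list_price) * 100)
    return f"-{pct}%"

street_price = 59.99  # what the item has cost all along

# Briefly relist at the reference price needed to show "-25%"...
inflated_list = round(street_price / (1 - 0.25), 2)  # 79.99

print(badge(inflated_list, street_price))  # -25%, yet the buyer pays 59.99 either way
```

Nothing about the transaction changed; only the frame did.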

Image showing the fake discount beside the listing price of the Amazon Fire TVs

Recently, a class-action lawsuit showed that a raft of Amazon products, including its own Fire TV, were advertised with massive markdowns even though the “sale” prices were effectively just the original prices. Price-tracking data even surfaced examples where the “sale” price was higher than what the product had cost weeks earlier.

Another of my favourites is the classic roach motel, which Amazon applied liberally to the Prime subscription flow. Aptly named after the exterminator trap, it’s easy to get in and nearly impossible to leave: joining Prime is seamless, while cancelling is an IQ and endurance test.

Signing up unintentionally is way too easy. At checkout you’re presented with a giant, high-contrast button offering FREE delivery with Prime (often preselected by default). One click and you’re in a 30-day trial, whether you like it or not. Once you’re in, getting out becomes an odyssey. The cancellation flow forces you through a labyrinth: first a page asking you to “confirm,” then another urging you to pause instead, another warning you about the benefits you’ll lose, and finally a guilt-soaked screen offering a discounted rate if you stay. It’s a hall of mirrors designed to disorient. To add insult to injury, Amazon reportedly referred to this cancellation obstacle course internally as Project Iliad, named after the ancient epic about a war that took a decade to escape…

Image of one of the many screens in the Amazon Iliad cancellation flow, showing all the ways they’re trying to make you stay

Recently, Amazon lost a landmark case over exactly this design strategy and was ordered to pay a staggering $2.5 billion settlement, the largest consumer-protection settlement in history, and it hinged on the well-known roach motel dark pattern. What makes the case historic isn’t the hefty dollar amount. It’s the precedent! For the first time at this scale, UI and UX decisions like button placement, flow ordering, default selections and the strategic use of friction were treated as tools of deception. Dark patterns are officially becoming a legal liability!

It’s also important to acknowledge that these dark patterns, especially at large tech companies, aren’t purely the responsibility of the designers. More often than not, the UX designer doesn’t get to cherry-pick the problems they solve, and Amazon is famously data-driven; it’s easy to imagine how analytics could nudge a team towards a pattern like this. We’ve all been in grey zones, which is why one of the most useful things we can do is arm ourselves with strong arguments.

There’s a company called Fairpatterns, born out of the legal-design world and created specifically to build frameworks around deceptive design, and they make a compelling business case against dark patterns. Their framing isn’t just an ethical plea; it’s put in business language: dark patterns are short-term optimizations with long-term costs. A dark pattern might spike conversions for a quarter or two, but it also erodes trust, reduces customer lifetime value and, as the latest Amazon case showed, exposes you to legal risk.

Fairpatterns graphic showing the long-term consequences of using dark patterns

Their framing prompts us to ask stakeholders good questions like:

  • What’s the cost of losing user trust?
  • How does unintentional retention affect lifetime value?
  • What are the risks of doing this to our brand’s equity?
  • And finally, however unlikely: could this flow get us sued?

And if the stick doesn’t work, there are always carrots! It’s not a given that every major corporation MUST use dark patterns. Some major companies actually leverage respect for their users as a sales strategy. Apple’s privacy initiatives show how pro-user defaults can become strategic differentiators. Basecamp has spent years publicly advocating for opinionated, user-respecting design in their books Getting Real, REWORK, and It Doesn’t Have to Be Crazy at Work; they explicitly champion simplicity, honesty, and avoiding manipulative tactics, including making it easy for people to leave! And their product reflects those values. Even Spotify, currently very much on my naughty list for unrelated reasons, allows for fairly painless cancellation, I recently learned.

There are plenty of companies that succeed without dark patterns. They treat user trust as an asset that compounds over time. For them it isn’t necessarily an ethical argument; it’s a pragmatic business strategy! And while not every company is going to be sued for $2.5 billion, the next time a client or manager asks you to implement a dark pattern, maybe send them this New York Times article and see what they think.

Screenshot of the New York Times article about Amazon’s $2.5 billion settlement

PS: Don’t actually do that unless your boss is extremely chill. It might not be well received. 😂


Amazon’s roach motel was originally published in UX Collective on Medium, where people are continuing the conversation by highlighting and responding to this story.
