The Story of FeedFair
When dark patterns can no longer stay hidden, and a court imposes a €100,000-per-day penalty
Fictional scenario, based on realistic situations
The Trigger
How it started
FeedFair had a technically working chronological feed option. The problem: every time users closed the app, visited another section, or switched devices, their choice was automatically reset to the algorithmic "For you" feed. The option existed, but it was illusory in practice.
A civil rights organization filed an injunction. The court ruled: this is a prohibited dark pattern under the Digital Services Act. FeedFair got two weeks to fix it, or pay €100,000 per day. This was the first time a court forced a major platform to make concrete code changes.
"We thought the reset to the algorithmic feed was a design choice. The court called it a prohibited dark pattern."
The Questions
What did they need to find out?
Why is an automatic reset a dark pattern?
The team initially didn't understand why their implementation was problematic. The option existed, right? You could choose chronological, right? The court explained why that wasn't enough.
💡 The insight
A choice that has to be made repeatedly is not a real choice. If the default setting automatically returns, users are psychologically forced to accept the default. This is exactly what DSA Article 25 prohibits: designing interfaces that "substantially distort or impair" users' ability to make free and informed decisions.
📌 Why this matters
The court cited Recital 67 DSA, which specifically mentions "making it more difficult to cancel a service than to sign up" and "making default settings difficult to change." The automatic reset fits the same pattern: it is easier to accept the algorithmic feed than to keep the chronological one.
What exactly does DSA Article 38 require?
The compliance team went back to the letter of the law. Article 38 DSA requires that Very Large Online Platforms (VLOPs) offer "at least one version" of their recommender system that is "not based on profiling."
💡 The insight
The key is in "offer": not just making the feature technically available, but truly offering it as a real option. A chronological feed hidden in menus and systematically reset is not "offered"; it is tolerated. The law asks for respect of user choice, not window dressing.
📌 Why this matters
The court stated that FeedFair's practice "infringes on the freedom of information gathering" of users. This is a fundamental rights frame: the algorithm determines what people see, and if they have no control over it, their autonomy is violated.
How do you implement real user choice?
The development team had to fundamentally redesign the feature. What does "permanent choice" mean technically? And how do you make it "directly accessible"?
💡 The insight
Permanent choice means: stored as a user preference that stays synchronized across all devices and sessions. Directly accessible means: visible on the homepage, not hidden in settings. Best practice: always-visible tabs like "For you" / "Following", where your choice stays where you left it.
📌 Why this matters
Twitter/X implements this with visible tabs. The choice takes one click, stays permanent, and synchronizes across devices. FeedFair had to adopt this UX standard; engagement optimization had to yield to user autonomy.
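The "permanent choice" described above can be sketched in a few lines. This is a minimal illustration, not FeedFair's actual code; the class and constant names are assumptions. The point is architectural: the feed choice is a user-level preference on the server, not device or session state, so there is simply no code path that resets it.

```python
from dataclasses import dataclass, field

ALGORITHMIC = "for_you"        # profiled default feed
CHRONOLOGICAL = "following"    # non-profiled alternative (DSA Article 38)

@dataclass
class FeedPreferenceStore:
    """Server-side store keyed by account, shared by every device session."""
    _prefs: dict = field(default_factory=dict)  # user_id -> chosen feed

    def set_feed(self, user_id: str, feed: str) -> None:
        if feed not in (ALGORITHMIC, CHRONOLOGICAL):
            raise ValueError(f"unknown feed: {feed}")
        self._prefs[user_id] = feed  # an explicit choice is stored once...

    def get_feed(self, user_id: str) -> str:
        # ...and read back everywhere. The default applies only while the
        # user has never chosen; an explicit choice is never silently reset.
        return self._prefs.get(user_id, ALGORITHMIC)
```

Because the record is keyed by account rather than by device or session, the choice a user makes on one device is exactly what every other device reads back.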
Can civil litigation actually change platforms?
The legal department was skeptical. A national court forcing a global platform to make code changes? That couldn't hold up, could it? They appealed β but the penalty remained.
💡 The insight
The DSA creates an ecosystem of enforcement: regulators (European Commission), national authorities (Digital Services Coordinators), and civil litigation. Civil rights organizations can sue platforms. Courts can impose penalties. The question "who enforces the DSA?" turns out to be: everyone.
📌 Why this matters
FeedFair's appeal to "fragmentation of the Digital Single Market" (27 courts with 27 interpretations) was not accepted. The alternative, leaving users completely dependent on regulators, would limit their legal protection. Platforms cannot escape national courts.
The Journey
Step by step to compliance
The summons
A civil rights organization filed an injunction. The claim: FeedFair's automatic reset to algorithmic feed was a prohibited dark pattern.
The verdict
The court ruled sharply: this is a DSA violation. Two weeks to fix it, otherwise a €100,000-per-day penalty, capped at €5 million.
The reality check
For the first time, a platform could not simply promise to do better: it had to actually change code under pressure of a court order.
Interface redesign
Visible tabs on the homepage: "For you" and "Following." One click to switch. Choice stays permanent.
Backend adjustments
User preferences were made persistent: synchronization across iOS, Android and web. No reset at session end.
Broader rollout
Instead of 27 different national implementations, FeedFair chose EU-wide rollout. More efficient and less legal risk.
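The backend adjustment above can be sketched as a minimal persistence layer. This is a hypothetical illustration using an in-memory SQLite table; the table and function names are assumptions, not FeedFair's real schema. The preference is keyed by account, so sessions on iOS, Android, and web all read the same row, and nothing in the session lifecycle ever clears it.

```python
import sqlite3

# Illustrative schema: one row per account, shared by all device sessions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE feed_preference (
        user_id TEXT PRIMARY KEY,
        feed    TEXT NOT NULL CHECK (feed IN ('for_you', 'following'))
    )
""")

def set_feed(user_id: str, feed: str) -> None:
    # UPSERT: an explicit choice replaces the previous one and is
    # never cleared by session-end logic (there is no such code path).
    conn.execute(
        "INSERT INTO feed_preference (user_id, feed) VALUES (?, ?) "
        "ON CONFLICT(user_id) DO UPDATE SET feed = excluded.feed",
        (user_id, feed),
    )
    conn.commit()

def get_feed(user_id: str) -> str:
    row = conn.execute(
        "SELECT feed FROM feed_preference WHERE user_id = ?", (user_id,)
    ).fetchone()
    # Default applies only until the user makes a choice.
    return row[0] if row else "for_you"
```

Ending a session is a no-op as far as the preference is concerned: compliance here is the absence of any reset call, not an extra feature.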
The Obstacles
What went wrong?
❌ Challenge
Automatic reset to algorithmic feed was a prohibited dark pattern
✅ Solution
Permanent storage of user choice, synchronized across all devices and sessions
❌ Challenge
Chronological option was hidden in menus, not "directly accessible"
✅ Solution
Visible tabs on homepage, one click to switch
❌ Challenge
Appeal to "fragmentation of Digital Single Market" was not accepted
✅ Solution
EU-wide implementation instead of per-country approach
We thought we were clever by offering the choice but not really respecting it. The court saw through the facade. Design choices are compliance choices.
The Lessons
What can we learn from this?
Dark patterns are legally prohibited
Manipulative design is not smart UX; it is a DSA violation that carries penalties.
User choice must be real
An option that resets, is hidden, or is hard to find is not an option.
Civil litigation works
Civil rights organizations can sue platforms. Courts can force code changes.
EU-wide is more efficient
Instead of 27 national implementations: roll out compliant design EU-wide directly.
Does your platform use algorithmic content curation?
Check if you offer users a real, permanent choice for non-profiled alternatives.