On October 2, 2025, the Amsterdam District Court issued a landmark ruling that Meta Platforms violates the Digital Services Act by failing to give users an effective choice of a chronological, non-profiled timeline on Facebook and Instagram. Meta has two weeks to remedy this or faces a penalty of €100,000 per day.
Precedent Case: For the first time, a national court has compelled a Big Tech platform to make concrete implementation changes under the DSA through civil litigation. This marks a new era of platform regulation in which user autonomy becomes legally enforceable.
The Ruling: Meta Must Respect User Choice
In summary proceedings brought by digital rights organization Bits of Freedom, the presiding judge of the Amsterdam District Court ruled on October 2, 2025, that Meta Ireland Ltd. violates the Digital Services Act (DSA) in how Facebook and Instagram handle user preferences for timeline display.
The core of the problem is simple yet fundamental: although Meta has been required since February 2024 to offer users the option of a non-profiled timeline (pursuant to Article 38 DSA), the company makes this choice illusory in practice by systematically reverting to the algorithmically curated "recommended" feed.
What the Court Specifically Ruled
The court concludes that Meta's current implementation violates the DSA on multiple points:
Automatic reset of user choice: Whenever a user closes the app, navigates to another part of the application, or switches between desktop and mobile, the chosen chronological timeline is automatically reset to the algorithmic feed. The court qualifies this as a prohibited "dark pattern" under Article 25 DSA.
Limited accessibility: The option for a chronological timeline is hidden in menus and submenus, instead of being directly and easily accessible from the homepage and in sections like Reels. This contradicts the obligation to offer users a real, meaningful choice.
Violation of information freedom: The court rules that Meta's approach "infringes on the freedom of information gathering" of users. By systematically returning to an algorithmically curated feed, Meta limits users' autonomy to determine for themselves how they want to receive information.
Quote from the ruling
The court calls the automatic reset to the algorithmic feed a "prohibited dark pattern" that "harms the autonomy and freedom of choice of users of these platforms" and "contradicts the purpose of the DSA".
The Concrete Order to Meta
Meta Ireland Ltd. must implement the following adjustments within two weeks after service of the judgment for Dutch users of Facebook and Instagram:
- Direct and easily accessible choice for a non-profiled timeline on the homepage and in the Reels section
- Permanent preservation of user choice that does not automatically revert to the algorithmic feed when closing the app, navigating to other sections, or switching between devices
- Transparent presentation of the choice option without manipulative design elements
Non-compliance triggers a penalty of €100,000 per day, capped at €5 million; at that rate, the cap is reached after 50 days of non-compliance.
The Legal Framework: DSA Obligations for Very Large Online Platforms
To understand the significance of this ruling, it's essential to grasp the underlying DSA obligations. Meta is designated as a "Very Large Online Platform" (VLOP) under the DSA, meaning the company must comply with an elevated regime of obligations.
Article 38 DSA: Obligation to Provide Non-Profiled Recommendations
Article 38 of the Digital Services Act requires VLOPs and Very Large Online Search Engines (VLOSEs) to offer users at least one version of their recommender system that is not based on profiling as defined in the GDPR.
| Requirement | DSA Provision | Meta's Violation |
|---|---|---|
| Offer non-profiled option | Art. 38(1) | Option exists technically but is systematically undone |
| Respect user choice | Implicit in Art. 38 | Choice is automatically reset |
| No dark patterns | Art. 25 | Hiding option and automatic reset |
| Transparent interface | Art. 27 + Recital 67 | Option hidden in menus, not directly accessible |
Profiling is defined in the GDPR as "any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person". This includes analyzing or predicting aspects such as behavior, interests, location, and movements.
A chronological timeline, by contrast, simply shows posts from accounts the user follows in reverse-chronological order, without behavioral analysis or predictive algorithms.
Article 25 DSA: Prohibition of Dark Patterns
Article 25 of the DSA prohibits providers of online platforms from designing, organizing, or operating their online interfaces in a way that deceives or manipulates users, or otherwise materially distorts or impairs their ability to make free and informed decisions.
Recital 67 of the DSA provides concrete examples of dark patterns:
- Repeatedly requesting a user to revisit a choice they have already made
- Making it more difficult to cancel a service than to subscribe
- Making default settings difficult to change
- Misleading users by enticing them into certain transactions
Legal definition of dark patterns (Recital 67 DSA): Practices that "materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions".
The Amsterdam court applies this definition to Meta's systematic reset mechanism and concludes this is a classic example of a dark pattern: making it more difficult to maintain a certain choice than to accept the default.
What Meta Must Concretely Change
The court gives Meta very specific orders that must be implemented within two weeks. Let's translate this into concrete product and interface changes.
Current Situation vs. Required Situation
| Aspect | Now (DSA Violation) | Required (After Ruling) |
|---|---|---|
| Choice Accessibility | Hidden in Settings → Feed (multiple clicks deep) | Directly accessible from homepage and Reels section |
| Choice Persistence | Resets when closing app, switching sections, changing devices | Permanently saved regardless of app usage |
| Default Setting | Always algorithmic feed ("For You") | User can set chronological feed as default |
| Reels Section | Only algorithmically curated | Choice option directly accessible there too |
| Transparency | Unclear that choice is temporary | Clearly communicate that choice is permanent |
Practical Implementation Requirements
1. Interface Adjustments
Meta will likely need to add a persistent choice element to the navigation bar or main menu of Facebook and Instagram. Think of a toggle switch or tab selection that remains visible during app use, similar to how Twitter/X implements this with "For You" vs. "Following" tabs.
2. Backend Modifications
The user choice must be stored as a persistent user preference in Meta's backend systems, remaining synchronized across:
- Different devices (iOS, Android, web)
- App sessions (even after force-close)
- Different sections within the app (Feed, Reels, Stories, etc.)
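As a minimal sketch of what such a persistent preference could look like server-side (all names here are hypothetical, not Meta's actual API), the choice can be modeled as an account-level setting that only an explicit user action may change:

```python
from enum import Enum

class FeedMode(Enum):
    ALGORITHMIC = "algorithmic"
    CHRONOLOGICAL = "chronological"

class FeedPreferenceStore:
    """Server-side store keyed by account ID, so the choice survives
    force-closing the app and follows the user across iOS, Android,
    and web clients (a real system would persist this in a database)."""

    def __init__(self) -> None:
        self._prefs: dict[str, FeedMode] = {}

    def set_mode(self, user_id: str, mode: FeedMode) -> None:
        # An explicit user action is the only event that may change the
        # stored preference; session ends and navigation never touch it.
        self._prefs[user_id] = mode

    def get_mode(self, user_id: str) -> FeedMode:
        # Users who never made an explicit choice get the platform default.
        return self._prefs.get(user_id, FeedMode.ALGORITHMIC)
```

The crucial property is what the store does *not* do: there is no code path that resets the preference on app close or section switch, which is exactly the behavior the court ordered.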
3. Chronological Feed in Reels
This is technically challenging, as Reels are inherently designed around algorithmic content discovery. Meta will need to implement a mechanism where Reels from followed accounts are shown chronologically, which may mean less content is available for users who follow few accounts.
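A non-profiled Reels ordering could, in principle, be as simple as filtering to followed accounts and sorting by publication time. The sketch below assumes a hypothetical dict shape for Reels; it is an illustration of the concept, not Meta's implementation:

```python
def chronological_reels(reels, followed_accounts):
    """Return only Reels posted by accounts the user follows, newest
    first. No engagement signals, watch history, or other profiling
    enters the ordering; only authorship and timestamp matter."""
    from_followed = [r for r in reels if r["author"] in followed_accounts]
    return sorted(from_followed, key=lambda r: r["posted_at"], reverse=True)
```

Note the caveat from the text in miniature: a user who follows few accounts gets a correspondingly short feed, because nothing outside the follow graph is eligible.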
4. Geolocation-Specific Implementation
The ruling applies only to Dutch users. Meta will thus need to implement geolocation-based feature flags, or roll out these changes EU-wide (which would be more efficient but has broader business impact).
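A geolocation-gated rollout of this kind is typically a feature flag keyed on the user's country. The sketch below assumes ISO 3166-1 alpha-2 country codes and is not Meta's actual flag system:

```python
# The 27 EU member states, as ISO 3166-1 alpha-2 codes.
EU_MEMBER_STATES = frozenset({
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE",
})

def ruling_changes_enabled(user_country: str, eu_wide_rollout: bool = False) -> bool:
    """Decide whether a user gets the court-ordered feed changes.

    The judgment itself binds Meta only for Dutch users; the
    `eu_wide_rollout` switch models the broader rollout the article
    mentions as the operationally simpler alternative."""
    if eu_wide_rollout:
        return user_country in EU_MEMBER_STATES
    return user_country == "NL"
```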
Meta's Defense and the Jurisdictional Question
Meta has announced it will appeal the ruling, with a fundamental argument that reaches far beyond this specific case.
Meta's Argumentation: Threat to Digital Single Market
In its response, Meta states: "We fundamentally disagree with this decision. According to us, this concerns the Digital Services Act and should be handled by the European Commission, not by individual courts in EU member states. Proceedings like this threaten the digital single market and the harmonized regulatory regime that should underpin it."
This argument touches on an essential tension in DSA enforcement: national enforcement versus harmonized supervision.
The Jurisdictional Question
May a national court in civil proceedings force a VLOP to comply with the DSA, or is this exclusively reserved for the European Commission as supervisor? This question has fundamental implications for DSA enforcement throughout the EU.
Analysis: National Enforcement and DSA Architecture
The DSA has a differentiated enforcement model:
For VLOPs and VLOSEs: The European Commission is the primary supervisor (Article 56 DSA) and has exclusive powers to impose fines and enforce compliance.
National enforcement: Article 51 DSA stipulates that member states appoint Digital Services Coordinators (DSCs) that supervise compliance. In the Netherlands, this is the Authority for Consumers and Markets (ACM).
Civil litigation: The DSA does not explicitly preclude national courts, in civil proceedings, from finding DSA violations and issuing injunctions. This is precisely what the Amsterdam court has now done.
Meta's argument suggests that allowing national civil litigation would lead to fragmentation of the digital single market, because different courts could reach different conclusions about the same practice. This would lead to 27 different interpretations of what a "dark pattern" is, or what "easily accessible" means.
Tension: The DSA aims for harmonized enforcement, but civil litigation is inherently nationally fragmented. How this is resolved can fundamentally influence all DSA enforcement.
On the other hand: if civil litigation were excluded, users and civil rights organizations would be entirely dependent on supervisors to enforce DSA compliance. This would limit their legal protection and could conflict with the right to effective legal protection (Article 47 Charter of Fundamental Rights EU).
Interim Conclusion Despite Appeal
Important to know: an appeal against a summary judgment does not automatically suspend its execution. Unless Meta successfully requests a suspension (which requires a separate procedure), the company must implement the changes while the appeal procedure runs.
This means Dutch users will likely be able to set a permanent chronological feed within two weeks, regardless of the appeal.
The Electoral Context: Why Timing is Crucial
The court explicitly points to the proximity of the parliamentary elections on October 29, 2025, as a factor for the short two-week deadline. This is not a random detail but touches on fundamental questions about algorithmic content curation and democratic information provision.
Algorithmic Curation and Electoral Influence
Modern recommender systems on social media platforms largely determine what political information users see, and in what order and frequency. This has multiple problematic effects on democratic processes:
Filter bubbles and echo chambers: Algorithms optimize for engagement, meaning they show users content that confirms what they already think. This reinforces political polarization and limits exposure to diverse perspectives.
Amplification of emotional content: Studies show that algorithms systematically prefer emotional, controversial, and extreme content because it generates more engagement. This can lead to radicalization and deterioration of public debate.
Opaque curation: Users have no insight into why they see certain political content and not others. This opacity makes it difficult to make informed decisions about information consumption.
External influence: Algorithmic systems can be manipulated by coordinated campaigns, bots, and disinformation networks, with the algorithm further spreading this content without human oversight.
Chronological Feed as Democratic Safeguard
A chronological timeline offers users transparency in information provision: you see what accounts you follow publish, in the order they publish it. This restores user control over information sources.
During elections, this is particularly relevant: voters can consciously decide which political parties, journalists, and commentators they want to follow, and can trust they will actually see that information - not filtered by a black-box algorithm.
The court acknowledges this by explicitly stating that Meta's practice "infringes on the freedom of information gathering". This concept - information freedom - is fundamental to democratic decision-making and is linked by the court to the ability to choose a non-algorithmic feed.
Precedent Effect and Broader Implications
This ruling is precedent-setting for multiple reasons and has implications reaching far beyond Meta and the Netherlands.
First Successful Civil DSA Enforcement
This is the first time a national court, in civil litigation, has compelled a VLOP to make concrete implementation changes under the DSA. Previous DSA enforcement came primarily from:
- The European Commission via formal procedures against VLOPs
- National supervisors (Digital Services Coordinators) via regulatory interventions
- Voluntary commitments by platforms under public pressure
The role of civil litigation, with civil rights organizations as plaintiffs, opens an entirely new enforcement route. This is particularly powerful because:
- Low-threshold access: NGOs and users can initiate summary proceedings relatively quickly and affordably
- Fast injunctions: Summary procedures lead to quick rulings (here: within weeks) with direct penalties
- Public visibility: Lawsuits generate more media attention than administrative supervision procedures
- Jurisprudence building: Court rulings create precedents that other courts can follow
Bits of Freedom as civil society enforcer
This case demonstrates the power of civil society enforcement: civil rights organizations acting on behalf of users to legally challenge platform behavior. This model can be repeated for other DSA violations and other platforms.
Possible Domino Effects in Other Member States
Although the ruling is legally binding only in the Netherlands, it has potential precedent effect in other EU member states:
Similar lawsuits elsewhere: Civil rights organizations in other countries can start similar summary proceedings, referring to the Dutch ruling as precedent. A German, French, or Spanish court may decide to follow the Dutch reasoning.
EU-wide implementation by Meta: Instead of developing 27 different national implementations, Meta may decide to implement these changes EU-wide (or even globally). This is technically and operationally much more efficient.
Pressure on European Commission: The ruling increases pressure on the Commission to intensify its own DSA enforcement. If national courts force platforms to comply, this underscores shortcomings in central enforcement.
Standard-setting for "dark patterns": The court provides a concrete interpretation of what a "dark pattern" is in the context of recommender systems. This creates jurisprudence that other courts and supervisors can use.
Impact on Other Platforms
Although this ruling specifically concerns Meta, the principles are directly relevant for all platforms with algorithmic content curation:
TikTok: Uses a very powerful recommendation algorithm with no easy option for a chronological feed, leaving it vulnerable to similar lawsuits.
YouTube: Offers a "Subscriptions" feed but systematically steers users toward the algorithmic "Home" feed, which may constitute a similar DSA violation.
X (Twitter): Has "For You" vs. "Following" tabs but also regularly resets to the algorithmic feed. Although more accessible than Meta's implementation, it is possibly still not DSA-compliant.
LinkedIn: Offers no real chronological option and is fully algorithmically curated. LinkedIn has itself been designated as a VLOP by the European Commission, so Article 38 applies to it as well.
These platforms will closely follow this ruling and possibly proactively implement changes to prevent similar lawsuits.
Practical Consequences for Organizations and Platforms
This ruling has direct compliance implications for any organization operating platforms or implementing recommender systems.
Checklist for Platforms with Recommender Systems
Compliance Checklist After Meta Ruling
- Offer a non-profiled option: Implement a working chronological or non-algorithmic feed as alternative
- Make choice directly accessible: No deep menus - the option must be visible and prominent
- Respect user choice permanently: No automatic resets when closing app, switching sections, or changing devices
- Implement cross-platform synchronization: Choice must be preserved across web, iOS, Android
- Avoid manipulative design: No dark patterns to push users back to algorithmic feed
- Document implementation: Be prepared to demonstrate to supervisors or courts that you're compliant
- Monitor adoption: Track how many users choose non-algorithmic feeds, so you can demonstrate that the option is genuinely discoverable and usable
Recognizing Dark Patterns as Compliance Risk
The ruling emphasizes that design choices fall under DSA scrutiny. This means product managers, UX designers, and engineers must be aware of compliance implications of interface decisions.
Examples of dark patterns in recommender system context:
- Difficult opt-out: Making it easier to accept the algorithmic feed than to refuse it
- Repeated asking: Repeatedly suggesting that the user return to the algorithmic feed
- Obscured defaults: Making it unclear what the default setting is
- Framed choices: Presenting the algorithmic feed as "recommended" or the "optimal experience"
- Asymmetric friction: Requiring more steps for the non-algorithmic choice than for the algorithmic one
- Emotional manipulation: Suggesting the user is "missing content" if they do not use the algorithm
All these practices can now be challenged as DSA violations, with substantial penalties as a consequence.
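The asymmetric-friction pattern in particular lends itself to a simple quantitative audit: represent each user flow as the ordered list of taps or screens needed to reach a feed mode, then compare path lengths. The flows below are illustrative, modeled on the pre-ruling behavior described in this article:

```python
# Hypothetical interaction flows, expressed as ordered lists of steps.
ALGORITHMIC_FLOW = ["open_app"]  # the default feed costs zero extra steps
CHRONOLOGICAL_FLOW = [
    "open_app", "tap_menu", "open_settings",
    "open_feed_preferences", "select_following",
]  # the deep-menu path the court objected to

def friction_gap(default_flow: list[str], alternative_flow: list[str]) -> int:
    """Extra steps the alternative choice requires beyond the default.
    A large positive gap is the 'asymmetric friction' dark pattern;
    parity (a gap near zero) is what Article 25-compliant design aims for."""
    return len(alternative_flow) - len(default_flow)
```

Product teams could run a check like this in design review to catch friction asymmetries before they become compliance findings.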
User Choice as Fundamental Design Principle
The ruling places user autonomy central. For platforms, this means a fundamental shift in how recommender systems are designed:
From: "What maximizes engagement and watch time?" To: "How do we give users meaningful control over their experience?"
From: "How can we keep users in our algorithmic feed?" To: "How do we make alternatives easily accessible and respect that choice?"
From: "Algorithm-first design" To: "User choice-first design"
This is not only a compliance requirement but can also become a competitive advantage. Platforms that actually give users control and are transparent about their operation can build trust in a time of increasing platform skepticism.
Future Perspective: Where Is This Heading?
The Meta ruling is a snapshot in a broader evolution of platform regulation. Let's explore some scenarios.
Appeal Procedure and Possible Escalation
Meta will file an appeal with a higher Dutch court within a few weeks. Possible outcomes:
Scenario 1: Appeal is rejected, the ruling stands. This strengthens the precedent and encourages similar lawsuits in other member states. Meta could then pursue cassation before the Dutch Supreme Court.
Scenario 2: Appeal succeeds, the ruling is overturned. This would be a setback for civil DSA enforcement, but it could bring the fundamental question before the European Court of Justice via a preliminary reference on the role of national courts in DSA enforcement.
Scenario 3: Preliminary reference to the ECJ. The Dutch appellate court could itself decide to refer the jurisdictional question to the Court of Justice: may national courts compel VLOPs to comply with the DSA? This question has fundamental implications for the entire DSA architecture.
Expectation: Regardless of outcome, this will likely end up at the Court of Justice EU, because the jurisdictional question is fundamental to DSA enforcement and requires clarification at European level.
Digital Fairness Act: The Next Phase of Platform Regulation
Parallel to this lawsuit, the European Commission is developing the Digital Fairness Act, specifically aimed at misleading practices and dark patterns in digital interfaces. This legislation will likely:
- Provide concrete definitions of specific dark pattern types
- Introduce explicit prohibitions for manipulative design practices
- Create enforcement mechanisms with substantial fines
- Link consumer protection to platform governance
The Meta case delivers valuable jurisprudence that can inform the Digital Fairness Act about what "manipulative design" concretely means in the context of recommender systems.
Evolution Toward "User Empowerment by Design"
In the longer term, we can expect a shift from compliance-driven to design-driven user control:
Feed interoperability: Users can possibly combine feeds from multiple platforms via open protocols
Algorithm marketplaces: Users choose their own curation algorithms from third-party providers
Data portability for recommender systems: Users take their preference data between platforms
Transparent algorithm parameters: Users can adjust algorithm parameters (e.g., "more serendipity," "less virality")
These are still distant future scenarios, but the Meta ruling marks an important step toward platform architecture where user control is central, not platform optimization.
Conclusion: A New Era of Platform Regulation
The ruling by the Amsterdam District Court against Meta marks a turning point in the relationship between Big Tech platforms and European regulation. For the first time, a national court, on the initiative of a civil rights organization, forces a global platform giant to fundamental changes in how it offers its services.
The court sends a clear signal: user autonomy is not an optional feature but a fundamental right that is legally enforceable. Dark patterns are not clever design choices but prohibited manipulation that is penalized with fines.
For platforms, this means a paradigm shift. The era of "maximize engagement at all costs" is over. Design choices have compliance consequences, and user choice must be genuine, meaningful, and persistent.
The core message for platform operators
The Meta ruling is not an incident but a preview of a new reality where user autonomy by design is no longer a differentiator but a basic compliance requirement. Platforms that understand and embrace this will not only avoid legal risks but also build trust in an increasingly skeptical digital ecosystem.
The coming months will be crucial. Meta's appeal, the implementation within two weeks, possible similar lawsuits in other countries, and the European Commission's response will determine how robust this new enforcement model is.
But one thing is clear: the DSA is no paper tiger. The combination of supervisors, national courts, and civil society organizations creates a powerful enforcement ecosystem that can actually change platform behavior.
For users, this means hope: the promise of the digital single market - platforms that respect European values - is getting closer. For organizations, this means urgency: DSA compliance is no longer preparatory work but operational reality.
Relevant Sources:
- Bits of Freedom: Court rules Meta must respect user choice
- Digital Services Act: full text
- DSA Article 25: Prohibition of dark patterns
- DSA Article 38: Obligations for recommender systems
Does your organization have questions about DSA compliance, dark patterns, or recommender system governance? Contact us for a no-obligation consultation on how to make your platform DSA-proof.