DMA and GDPR United: Why These New Guidelines Change the Rules for Major Platforms

Practical implications for organizations working with major platforms and their data ecosystems

Historic precedent: On October 9, 2025, the European Commission and EDPB jointly published regulatory guidelines for the first time. This marks a new phase where competition law and privacy protection are deliberately coordinated.

Why these guidelines reach beyond just gatekeepers

On October 9, 2025, the European Commission and the European Data Protection Board (EDPB) published a set of joint guidelines clarifying how the Digital Markets Act (DMA) and the General Data Protection Regulation (GDPR) interact with each other. It is the first time these two regulatory bodies have jointly developed guidelines.

The DMA, in effect since 2023, focuses on promoting fair competition in digital markets. The GDPR protects individual privacy rights. Where these two regulations intersect, considerable legal uncertainty has existed until now. These new guidelines provide concrete guidance for the first time.

While the guidelines primarily target the designated gatekeepers (among them Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft), they have broader implications for any company working with these platforms or offering data-intensive services.

The six core areas being clarified

1. Consent for data combination: the end of implicit bundling

The most concrete part of the guidelines concerns Article 5(2) of the DMA, which prohibits gatekeepers from combining personal data across different services without specific, GDPR-compliant consent.

What does 'specific choice and valid consent' mean?

The guidelines specify that consent must meet all GDPR requirements: freely given, specific, informed, and unambiguous. Concretely, this means: no pre-ticked boxes, no manipulative interface designs (dark patterns), and separate consent per purpose.

For advertising personalization, this means a gatekeeper cannot simply combine data from a search service with data from a video platform without explicit, separate consent for that specific combination.
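
To make this concrete, the sketch below shows one possible way a gatekeeper could record consent per service combination and per purpose. The service names, field names, and the `mayCombine` helper are illustrative assumptions; the guidelines describe requirements (no pre-ticked boxes, separate consent per purpose), not a data model.

```typescript
// Illustrative consent record: one entry per service combination and purpose.
// Service names, purposes, and fields are hypothetical; the guidelines set
// requirements, not a schema.
interface CombinationConsent {
  userId: string;
  sourceService: string; // e.g. "search"
  targetService: string; // e.g. "video"
  purpose: string;       // e.g. "ad-personalisation"
  granted: boolean;      // must start as false: no pre-ticked boxes
  grantedAt?: string;    // ISO 8601 timestamp, only set on an active opt-in
}

// Data may only be combined when an explicit, matching consent exists.
function mayCombine(
  consents: CombinationConsent[],
  sourceService: string,
  targetService: string,
  purpose: string,
): boolean {
  return consents.some(
    (c) =>
      c.granted &&
      c.sourceService === sourceService &&
      c.targetService === targetService &&
      c.purpose === purpose,
  );
}
```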

Practical consequence for gatekeepers: Meta can no longer automatically combine Facebook data with Instagram or WhatsApp data for advertising purposes. Google must request separate consent to link YouTube behavior with search history. Amazon must explicitly ask whether they may combine shopping behavior with Prime Video preferences.

Practical consequence for organizations: Companies advertising through these platforms will face fragmented datasets. The rich, cross-platform profiles that much targeting was based on become less accessible. This requires a recalibration of marketing strategies.

2. Data access for business users: who is responsible?

Article 6(10) of the DMA requires gatekeepers to give business users effective access to data, including personal data of end-users, generated through the use of their products and services on the platform. The guidelines now clarify the GDPR consequences of this access.

Core principle: When a gatekeeper shares data with a business user, both function as separate data controllers. The gatekeeper cannot hide behind the role of processor.

This has three direct implications. For gatekeepers, this means they must clearly inform end-users that data is shared with a third party, they remain responsible for the lawfulness of the original data collection, and they must implement technical mechanisms whereby end-users can provide consent per business user.

For business users, this brings obligations as well. They must have their own GDPR legal basis for processing the received data and cannot rely on the consent the gatekeeper obtained. Additionally, they must themselves comply with transparency obligations toward end-users.

Practical example: An e-commerce company using Google Shopping must now:

  1. Request consent from users itself for receiving data via Google
  2. Document why it needs this data (GDPR legal basis)
  3. Inform users about how it processes the received data
  4. Maintain its own record of processing activities
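
As an illustration of the first two points, the following sketch shows how a business user might document its own legal basis and transparency information per data element received from a gatekeeper. The `ReceivedDataRecord` layout, the platform name, and the example values are hypothetical; the GDPR does not prescribe a specific format for such records.

```typescript
// Illustrative processing-register entry for data received from a gatekeeper.
// The legal bases mirror GDPR Article 6(1); the record layout and the example
// values are assumptions for illustration.
type LegalBasis =
  | "consent"
  | "contract"
  | "legal-obligation"
  | "vital-interests"
  | "public-task"
  | "legitimate-interests";

interface ReceivedDataRecord {
  gatekeeper: string;        // platform the data comes from
  dataElement: string;       // e.g. "order-conversion-events"
  ownLegalBasis: LegalBasis; // the business user's own basis, not the gatekeeper's
  purpose: string;           // why this element is needed
  userInformedVia: string;   // where end-users are told about this processing
  retentionPeriodDays: number;
}

const example: ReceivedDataRecord = {
  gatekeeper: "ExamplePlatform", // hypothetical name
  dataElement: "order-conversion-events",
  ownLegalBasis: "legitimate-interests",
  purpose: "measuring conversions for the company's own webshop",
  userInformedVia: "privacy notice, section on advertising partners",
  retentionPeriodDays: 180,
};
```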

3. Data anonymization: a high technical and legal bar

Article 6(11) of the DMA requires gatekeepers that operate an online search engine to share anonymized ranking, query, click, and view data with competing search engines. The guidelines emphasize that this anonymization must meet GDPR standards.

GDPR standard for anonymization

The guidelines refer to the EDPB definition: data is only anonymous if re-identification is irremediably impossible. This goes beyond simply removing names or IDs; it requires a thorough risk assessment examining the technical means available for re-identification, possible combinations with other datasets, and statistical techniques such as singling out and inference attacks.

Practical challenge: Many datasets companies consider "anonymized" do not meet this strict standard. Research has repeatedly shown that seemingly anonymous data can often be re-identified through clever combinations with other sources.
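
One widely used, minimal way to probe this risk is a k-anonymity check over quasi-identifiers: if any combination of attributes occurs fewer than k times, those records stand out and may be re-identifiable. The guidelines do not mandate this specific metric, so the sketch below (with hypothetical column names) should be read as one possible building block of a broader risk assessment, not a complete test.

```typescript
// Minimal k-anonymity check: every combination of quasi-identifier values
// must occur at least k times, otherwise the records in small groups stand
// out. This is only one signal; it does not by itself prove data is anonymous.
function smallestGroupSize(
  records: Record<string, string>[],
  quasiIdentifiers: string[],
): number {
  const counts = new Map<string, number>();
  for (const record of records) {
    const key = quasiIdentifiers.map((q) => record[q] ?? "").join("|");
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts.size === 0 ? 0 : Math.min(...Array.from(counts.values()));
}

// Hypothetical sample: the third record is unique on these attributes and
// would be flagged with a threshold of k = 5.
const dataset: Record<string, string>[] = [
  { postcode: "1011", ageBand: "30-39", gender: "f" },
  { postcode: "1011", ageBand: "30-39", gender: "f" },
  { postcode: "9999", ageBand: "80-89", gender: "m" },
];

const k = 5;
const tooRisky = smallestGroupSize(dataset, ["postcode", "ageBand", "gender"]) < k;
console.log(tooRisky); // true: at least one group is smaller than k
```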

Compliance requirement: Gatekeepers must not only implement technical anonymization methods but also include contractual clauses prohibiting re-identification attempts, conduct regular audits on anonymization effectiveness, and document why their method meets the "irremediably impossible" standard.

4. Data portability: going beyond what GDPR ever intended

The DMA's portability right (Article 6(9)) goes significantly further than Article 20 GDPR, and the guidelines clarify these differences.

| Aspect | GDPR Article 20 | DMA Article 6(9) |
| --- | --- | --- |
| Applicability condition | Only with consent or contract as legal basis | With any processing basis |
| Type of data | User-provided data | Provided and generated data |
| Frequency | On request | Real-time/continuous access |
| Format | Structured, commonly used | Structured, machine-readable (e.g., JSON) |
| Data about others | Excluded | Included (with restrictions) |
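
The sketch below illustrates what a machine-readable, continuous export payload under Article 6(9) could look like, explicitly separating user-provided from platform-generated data. The DMA requires a structured, machine-readable format but does not fix a schema, so the field names are assumptions.

```typescript
// Illustrative continuous-export payload under DMA Article 6(9). It separates
// data the user provided from data the platform generated about them; the
// field names are assumptions, not a standardized schema.
interface PortabilityExport {
  exportedAt: string;  // ISO 8601 timestamp of this export batch
  subjectUserId: string;
  providedData: {
    posts: { id: string; text: string; createdAt: string }[];
    profileFields: Record<string, string>;
  };
  generatedData: {
    watchHistory: { itemId: string; watchedAt: string }[];
    inferredInterests: string[]; // platform-derived, in scope under the DMA
  };
}
```

The key difference from GDPR Article 20 sits in the `generatedData` branch: observed and inferred data the platform derives about the user is also exportable, and the export must be available continuously rather than only on request.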

Unique complexity: data about others

One of the most innovative aspects is that the DMA requires portability of data about other individuals generated through the user's activity. Think of: interactions with others on social media, collaborative playlists, or shared documents.

The guidelines state this is only permissible if:

  1. The original user provides explicit consent
  2. Data about others is filtered based on granular user tools
  3. There are mechanisms to protect the rights of other data subjects

Practical example: A user who wants to export their Facebook data to a competing platform can export their own posts and likes under the standard portability right. Additionally, they can export interaction data with friends via the DMA extension, with filtering applied. What cannot be exported are the complete profiles of friends, which remain protected under privacy rules.
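
A sketch of that filtering step: before export, interaction records that involve another person are reduced to the exporting user's own contribution, and the other party's identifying details are only included when the user's granular export settings allow it. The filtering rule and field names are illustrative assumptions, not prescribed by the guidelines.

```typescript
// Interaction records involve another person. Before export, keep the
// exporting user's own contribution and drop the other party's identifying
// details unless the user's granular export settings allow them.
interface Interaction {
  type: "comment" | "like" | "message";
  ownContent: string;     // the exporting user's own contribution
  otherPartyId: string;   // identifier of the other person
  otherPartyName: string; // display name of the other person
}

function filterForExport(
  interactions: Interaction[],
  includeOtherParties: boolean, // driven by the user's granular export choices
): Partial<Interaction>[] {
  return interactions.map(({ otherPartyId, otherPartyName, ...own }) =>
    includeOtherParties ? { ...own, otherPartyId } : own,
  );
}
```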

5. Messaging service interoperability: privacy by design in practice

Article 7 of the DMA requires gatekeepers with messaging services (such as WhatsApp, Messenger) to enable interoperability with other services. The guidelines clarify the GDPR safeguards.

Core requirement: Gatekeepers implementing interoperability must conduct a Data Protection Impact Assessment (DPIA) and can only share "strictly necessary" personal data.

What is "strictly necessary"? The guidelines indicate this is limited to message content (if end-to-end encrypted), user identifiers for routing, and essential metadata such as timestamp and message type. What cannot be shared without additional consent are location data, contact lists, usage statistics, and profile information beyond basic identity.

Practical implementation: When implementing interoperability with a third-party service such as Signal, WhatsApp must:

  1. Maintain end-to-end encryption across platforms
  2. Only share minimal metadata for message routing
  3. Let users choose which opt-in features they want (read receipts, typing indicators)
  4. Document why specific data elements are necessary
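
As a sketch of what such a minimal exchange could look like, the envelope below carries only routing identifiers, a timestamp, the message type, and the end-to-end-encrypted payload, with optional features kept behind per-user opt-ins. The field set is an assumption; the guidelines describe categories of data, not a wire format.

```typescript
// Illustrative cross-service message envelope: only routing identifiers, a
// timestamp, the message type, and the end-to-end-encrypted payload. Location
// data, contact lists, and profile details are deliberately absent.
interface InteropEnvelope {
  messageId: string;
  senderRoutingId: string;    // pseudonymous identifier used only for delivery
  recipientRoutingId: string;
  sentAt: string;             // ISO 8601 timestamp
  messageType: "text" | "media" | "reaction";
  ciphertext: Uint8Array;     // end-to-end-encrypted content, opaque to both services
}

// Optional features stay behind per-user opt-ins and default to off.
interface InteropPreferences {
  readReceipts: boolean;
  typingIndicators: boolean;
}
```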

6. Third-party app distribution: security without lock-in

Article 6(4) of the DMA requires gatekeepers to allow alternative app distribution (for example, sideloading on iOS). The guidelines emphasize this must be done securely according to GDPR principles.

Security requirements for third-party apps

Gatekeepers must implement several security measures. Sandboxing ensures that apps cannot access data from other apps without consent. Malware protection through scanning for malicious software is required. Transparent consent requests must ensure users understand what data access apps request. Finally, ePrivacy Directive compliance requires explicit consent for tracking and cookies.
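
The consent gate described above can be pictured as a simple permission check: cross-app data access is denied unless the user has explicitly granted a matching permission. The permission model in this sketch is hypothetical and not taken from any real mobile platform API.

```typescript
// Illustrative sandbox-style check: cross-app data access is denied unless the
// user has explicitly granted a matching permission. The permission model is
// hypothetical, not a real platform API.
interface GrantedPermission {
  requestingApp: string;
  targetApp: string;
  scope: "contacts" | "photos" | "files";
}

function canAccess(
  grants: GrantedPermission[],
  requestingApp: string,
  targetApp: string,
  scope: GrantedPermission["scope"],
): boolean {
  return grants.some(
    (g) =>
      g.requestingApp === requestingApp &&
      g.targetApp === targetApp &&
      g.scope === scope,
  );
}
```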

Practical consequence: Apple's implementation of sideloading in the EU must allow alternative app stores without Apple's review process while maintaining security mechanisms such as sandboxing APIs. Discriminatory restrictions on third-party stores are not permitted. Apple must clearly warn users of risks without blocking alternative stores.

Enforcement: who does what?

A crucial part of the guidelines is the clarification of enforcement responsibilities.

Divided enforcement

The European Commission enforces DMA violations and can impose fines of up to 10% of global annual turnover; for repeated violations, this can rise to 20%. The Commission examines abuse of market power and anti-competitive behavior.

National supervisory authorities (the data protection authorities) enforce GDPR violations and can impose fines of up to €20 million or 4% of global turnover, whichever is higher. They focus on privacy rights and data-processing principles.

Coordination mechanism: The guidelines introduce a cooperation protocol whereby the Commission informs supervisory authorities of DMA procedures with GDPR implications, and supervisory authorities inform the Commission of GDPR cases against gatekeepers. Joint fact-finding can occur on overlapping issues, while final enforcement decisions are coordinated to prevent contradictions.

What this means for non-gatekeepers

While the guidelines target the designated gatekeepers, they have broader implications:

For business users of gatekeeper platforms

  1. Review your GDPR legal basis: you can no longer rely on consent the gatekeeper collected. Evaluate whether you have your own lawful basis.
  2. Update contracts: ensure contracts with gatekeepers clearly establish that you act as a data controller, not a processor.
  3. Implement transparency: inform end-users directly about how you receive and process data from gatekeepers.
  4. Prepare for fragmentation: cross-platform profiles become scarce. Develop alternative targeting strategies.

For future gatekeepers

The guidelines create a precedent likely to apply to future gatekeepers. Organizations growing toward gatekeeper status must proactively:

  1. Design separate consent flows for different services and purposes
  2. Build portability infrastructure enabling real-time export
  3. Implement anonymization processes that are GDPR-proof
  4. Develop interoperability architecture with privacy by design

For market players competing with gatekeepers

The guidelines create opportunities for competitors. Data portability lowers switching costs for users, interoperability enables communication with gatekeeper ecosystems, and separate consent breaks the data advantages of bundling.

Strategic implication: Smaller platforms can now more easily attract users by offering better privacy protection as a differentiator, embracing interoperability to share network effects without lock-in, and building tools that automate data import from gatekeepers.

Practical compliance roadmap

For gatekeepers: 90-day action plan

Immediate priorities (Week 1-4)

Week 1-2: Gap analysis. Inventory all current data combinations between services, identify which combinations now occur without specific consent, and evaluate current consent mechanisms against GDPR requirements (freely given, specific, informed).

Week 3-4: Legal review. Determine for each data-sharing arrangement with business users whether contracts correctly assign controller responsibility. Review anonymization processes against EDPB standards and check whether portability implementations meet DMA requirements (real-time, continuous access in a machine-readable format such as JSON).

Month 2: Technical adjustments. Implement granular consent interfaces without dark patterns and build data segregation between services into the infrastructure. Develop portability APIs that meet the specifications and strengthen anonymization techniques while documenting the methodology.

Month 3: Testing and documentation. Test consent flows with real users via A/B testing without manipulative variants. Conduct DPIAs for interoperability features and document all decisions and trade-offs for future audits. Train legal and product teams on the new requirements.

For business users: quick compliance check

Ask yourself these five questions:

  1. Do we have our own GDPR legal basis for data we receive from gatekeepers, or do we rely on their consent?
  2. Do we directly inform end-users about our data processing practices, or do we only refer to the gatekeeper?
  3. Do our contracts explicitly state we are data controllers, or is this ambiguous?
  4. Do we have mechanisms to manage consent per end-user, or do we process bulk data?
  5. Do we document why we need specific data elements, or do we request "as much as possible"?

If you answer "no" to any of these questions, action is required.

The broader shift: convergence of competition and privacy

These guidelines mark a fundamental shift in how European regulators view digital markets.

Regulatory maturity: For the first time, we see explicit coordination between competition law and privacy protection. This signals that Europe views digital markets as a coherent ecosystem where market fairness and data sovereignty are inextricably linked.

Three strategic implications

1. The end of "privacy versus innovation." The guidelines demonstrate that strict privacy requirements and market innovation need not be contradictory. By establishing transparent rules, regulators create predictability within which innovation can flourish.

2. Data as a competitive instrument. The explicit attention to data portability and access recognizes that control over data is a crucial form of market power. This precedent will likely extend to other sectors (think smart home, automotive, health tech).

3. Global norm-setting. Just as the GDPR became the global standard for privacy legislation, these DMA-GDPR guidelines will likely influence how other jurisdictions think about platform regulation. We already see parallel developments in the UK, Australia, and parts of Asia.

Public consultation: your voice matters

The guidelines are now in public consultation until December 4, 2025. Feedback can be submitted via the Digital Markets Act consultation portal.

Who should provide feedback?

Feedback is particularly relevant from gatekeepers seeking clarity on specific implementation questions, business users facing uncertainty about their obligations, privacy advocates with input on end-user protection, technical experts with insights into practical feasibility, and academics with research data on the effectiveness of the measures.

The final version is expected to be published in early 2026 after assessing the feedback.

Outlook: expected developments

Enforcement cases coming

Now that the guidelines are available, we expect the European Commission to initiate DMA enforcement cases more quickly against gatekeepers that do not comply with the consent requirements. Supervisory authorities will conduct focused GDPR audits of gatekeepers in the areas the guidelines emphasize, and the first fines combining both DMA and GDPR components will follow.

AI Act integration

As announced, the EDPB and the Commission's AI Office are working on similar joint guidelines for the interaction between the AI Act and GDPR. We expect these will be published in Q1 2026 and follow a similar pattern: clarification of overlapping obligations, concrete guidance on consent for AI systems, and enforcement coordination between AI oversight and privacy supervisory authorities.

International ripple effects

Just as the GDPR became a global standard, we expect this DMA-GDPR interplay to be adopted worldwide. The UK regulator will likely publish similar guidance under the Digital Markets, Competition and Consumers Act, Australia's platform regulation (in development) may adopt the same principles, and US states considering platform regulation will look to these guidelines as a precedent.

Five concrete recommendations for organizations

1. Start a DMA-GDPR gap analysis, even if you're not a gatekeeper

The principles from these guidelines (separation of consent purposes, transparency about controller responsibility, strict anonymization) set a new standard for all data-intensive platforms. Begin by inventorying where you currently combine data without granular consent, evaluating whether your anonymization processes pass the "irremediably impossible" test, and checking whether business partners understand they are data controllers.

2. Redesign consent flows with user empowerment as starting point

The guidelines make clear that consent is not a formality but an actual choice. Remove all pre-ticked boxes and test interface designs for manipulation (dark patterns). Implement equally easy "decline" buttons as "accept" buttons and make withdrawal of consent as simple as granting.

3. Document, document, document

The guidelines repeatedly emphasize documentation obligations. Ensure DPIAs for every new data combination or portability feature, detailed explanation why specific data elements are necessary, audit logs of anonymization processes, and records of internal compliance decisions. This documentation is not only legally required but also protects you in future audits or enforcement cases.

4. Prepare your organization for data fragmentation

As cross-platform data combinations become more difficult, you must strengthen first-party data strategies and rediscover contextual targeting (content-based instead of profile-based). Explore privacy-preserving technologies such as federated learning and adjust expectations about available user profiles.

5. View compliance as competitive advantage

Organizations proactively responding to these guidelines can build trust with users tired of opaque data practices, attract talent valuing ethical technology development, minimize enforcement risks and prevent reputational damage, and realize early-mover advantages in a shifting competitive landscape.

Conclusion: toward a new balance between power and privacy

These guidelines mark more than technical compliance requirements. They signal a fundamental revision of how digital platforms function in Europe.

The core message is clear: market power does not justify privacy compromises. On the contrary, the more powerful a platform, the stricter the safeguards users deserve.

For gatekeepers, this means a period of significant adjustment, where legacy systems built on data bundling must be redesigned around principles of user control and granular consent.

For smaller market players, this opens strategic opportunities to compete based on superior privacy practices and user empowerment.

For end-users, it means - in time - more control over their digital identity and easier possibilities to switch between services without lock-in.

The consultation runs until December 4, 2025. Organizations with specific questions or concerns about implementation are encouraged to submit feedback. The final guidelines, appearing in early 2026, will weigh this input.

The convergence of competition law and privacy protection in these guidelines is no coincidence. It is a deliberate choice to reform digital markets toward a model where competition and privacy go hand in hand, not stand opposed.

For organizations willing to embrace this shift, opportunities lie ahead. For those clinging to old models of data extraction and user lock-in, the playing field becomes increasingly challenging.

The question is no longer whether this shift occurs, but how quickly your organization can adapt to thrive in this new reality.


Sources and further reading