On 12 September 2025, the European Data Protection Board ("EDPB") published Guidelines 3/2025 to clarify the relationship between the General Data Protection Regulation ("GDPR") and the Digital Services Act ("DSA") (for an analysis of the DSA, see our article here). The objective of the guidelines "is to clarify how intermediary service providers should interpret and apply the GDPR when processing personal data in the contexts covered by the DSA". The guidelines were subject to public consultation until 31 October 2025 and the final version is awaited.
What is the key takeaway? The DSA adds new duties for online intermediaries, but it does not grant a free pass to process personal data. Every DSA-driven processing operation still needs a valid GDPR basis, must respect data minimisation and purpose limitation, and must remain within the GDPR's guardrails on profiling, special category data and children.
The EDPB Guidelines clarify the following:
- No new legal basis: The DSA does not create a new standalone lawful basis for processing. Controllers must ground each DSA-related processing operation in Article 6 GDPR (and Article 9 for special category data), and ensure that all GDPR principles are met.
- It is essential to choose the right lawful basis, for example:
- Legal obligation (Art 6(1)(c)) can cover processing strictly necessary to comply with express DSA duties (e.g., trader traceability, ad repositories, transparency reporting, notice-and-action systems, VLOP/VLOSE risk assessments and mitigation).
- Legitimate interests (Art 6(1)(f)) may apply to certain safety measures that are not strictly mandated but are aligned with DSA objectives (subject to a documented Legitimate Interest Assessment and robust safeguards).
- Consent (Art 6(1)(a)) remains key for tracking-based profiling, cookie/adtech operations, and certain recommender choices where user consent is required under ePrivacy or GDPR.
- Public interest (Art 6(1)(e)) is generally reserved for authorities, not private platforms, unless a specific law designates the task.
- Special category data stays special: The DSA prohibits targeted advertising based on special category data. If such data is inferred (e.g. from engagement), you need an Article 9 condition, or you should avoid the processing altogether. As the EDPB puts it: "This means that the GDPR requires specific derogations regarding profiling by online platforms using special categories of data for advertising purposes. On the other side, the DSA prohibits the presentation of any advertising based on profiling using such special categories of personal data by providers of online platforms to recipients of the service, regardless of whether this profiling is carried out by providers of online platforms or by others."
- Children first (Article 28): The DSA prohibits presenting ads based on profiling to minors, so controllers must implement proportionate age assurance and disable such targeting for under-18s. Any age-checking must be privacy-preserving and minimised. The EDPB statement on age assurance is helpful in this regard, setting out ten principles to ensure compliant processing of personal data when determining the age or age range of an individual.
- Profiling and automated decisions: Recommender systems and moderation tools must respect GDPR rules on profiling and, where relevant, Article 22 safeguards. In practice, providers using such tools will need to give meaningful information about why content is suggested or prioritised, explain why it is presented in a particular order or with a particular prominence, and set out the main parameters influencing the suggestions. Opt-outs must be offered where required, and decisions with significant effects must be open to meaningful human review. For example, where a recommender model is used to shortlist and rank job candidates and a candidate challenges an automated decision, a human must carry out a meaningful review, because the decision has a significant effect: the candidate could be denied an interview and, potentially, access to a new job.
- Recommender transparency does not equal a consent bypass: Article 3(s) DSA defines a "recommender system" as a partially or fully automated system used by online platforms to present specific content to users of the platform with a certain relative order or prominence. Offering non-profiling options under the DSA does not replace GDPR-grade consent for tracking. Dark patterns are prohibited, and choices must be fair, granular and reversible.
- Notice-and-action (Articles 16 and 17 DSA): Providers of hosting services should implement mechanisms that enable individuals or entities to report illegal content through notifications. These notifications may involve the processing of personal data, particularly when submitted by an individual, and may also trigger processing of third-party data where necessary to identify the illegal content. Processing data about notifiers, users and alleged infringers must be necessary and proportionate. Controllers should be transparent about purposes, retention periods and disclosures (for example, to trusted flaggers), and should also address the conditions under which the notifier's identity may be disclosed, ensuring only "strictly necessary personal data of the notifier is communicated", for example where there is an infringement of IP rights and disclosure is necessary to establish the unlawful nature of the content.
- Complaint handling (Articles 20 and 23): The DSA includes specific provisions on complaint handling: in particular, complaints must be reviewed by staff and not decided solely by automated means, and the related processing must respect the GDPR. Complaint handling should be designed to ensure compliance with all Article 5 GDPR principles, especially data minimisation, accuracy, transparency and storage limitation. Online platform providers should be transparent towards data subjects about any processing carried out in the context of a complaint and provide all information required to meet the transparency requirements under the GDPR. Controllers should clearly set out the purposes of processing, applicable retention periods, and any disclosures (for example, to trusted flaggers).
- Trader traceability (KYC-lite) (Article 30): The DSA requires online marketplaces to collect and verify information from traders selling to consumers, ensuring they can be identified and located, and to ensure traders only offer products and services that are compliant with EU law. The guidance reiterates that only the data required by the DSA to verify business users should be collected, and that strict retention policies tied to DSA needs should be in place (see the illustrative sketch after this list).
- Ad repositories and transparency requirements (Articles 26 and 39): Article 26 of the DSA mandates advertising transparency and bans targeted advertising based on profiling using special category personal data, while Article 39 requires very large online platforms and search engines to maintain ad repositories. Publishing and preserving information in those repositories (including the advertiser's identity and the main targeting parameters) should be limited to what the DSA mandates, applying data minimisation. Unnecessary personal data should be redacted, and risks to individuals should be assessed and mitigated. As to transparency, the EDPB recognises there is an interplay between the DSA and the GDPR, as both regimes impose transparency obligations relevant to the delivery of advertisements on online platforms. However, the DSA adds modality-specific requirements by specifying how the information must be provided and the expected level of information and control, for example, communicating the main parameters used to determine why a particular advertisement is shown to a recipient and, where available, the possibilities to change those parameters.
- Data subject rights remain intact: DSA duties do not waive access, objection, erasure or restriction rights. Any exceptions (such as to preserve evidence) must be lawful, necessary and documented.
- ePrivacy rules remain in force: Cookie or storage access still requires consent where mandated – the DSA does not change that.
- International transfers: Any DSA-related data sharing outside the EEA needs Chapter V tools (such as adequacy decisions or standard contractual clauses) and transfer risk assessments.
- Governance and enforcement will involve parallel oversight: Digital Services Coordinators and the European Commission for the DSA and Data Protection Authorities for the GDPR. Building a joined-up compliance record is essential to satisfy both regimes.
- Deceptive design patterns (Article 25): Online platforms must "design, organise, and operate" their interfaces in a way that enables recipients to make "autonomous and informed" decisions. The DSA provides a clear, general prohibition on deceptive design patterns for providers of online platforms. In parallel, under the GDPR, processing must comply with Article 5(1)(a) (lawfulness, fairness and transparency), and the EDPB underscores that deceptive design practices covered by the GDPR are generally unlawful. In practice, the DSA's prohibition and the GDPR's principles converge: manipulative interface designs that obscure or nudge users into choices (e.g. consent flows that are not truly informed or freely given) will typically breach the DSA and also contravene the GDPR's transparency and fairness requirements.
- Risk assessment and mitigation (Articles 34 and 35): Very large online platforms and search engines must assess, at least annually, the systemic risks stemming from the design and operation of their services and implement proportionate mitigation measures. Where those risks implicate the processing of personal data, assessments should be aligned with GDPR principles and, where appropriate, integrated with or complemented by a GDPR data protection impact assessment (DPIA) to ensure coherent identification of harms and safeguards. Mitigation should be iterative, evidence-based and documented, with clear accountability for measures affecting recommender systems, advertising, and content moderation workflows. In carrying out these duties, providers should co-operate with competent authorities under the DSA framework and observe a duty of sincere co-operation with data protection authorities, facilitating information-sharing and coordinated supervision so that risk findings and mitigations are consistent with both the DSA's systemic risk objectives and the GDPR's requirements.
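Returning to trader traceability (Article 30), and purely by way of illustration, the sketch below (in Python, using field and function names of our own invention) shows one way a marketplace might confine a trader record to the categories of information contemplated by Article 30 DSA and tie retention to the end of the trading relationship. It is an assumption-laden sketch, not an implementation prescribed by the DSA or the EDPB.

```python
# Illustrative sketch only: hypothetical names, not a prescribed implementation.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Only the categories of trader information the marketplace needs for
# Article 30 DSA verification; nothing beyond that is stored.
@dataclass
class TraderRecord:
    name: str
    address: str
    phone: str
    email: str
    id_document_reference: str        # reference to the identification document provided
    payment_account_details: str
    trade_register_number: Optional[str]
    self_certification_eu_law: bool   # trader's commitment to offer only EU-law-compliant products/services
    relationship_ended_on: Optional[date] = None

# Retention tied strictly to the DSA need: the exact period should follow your
# documented retention policy and legal analysis (the figure below is an assumption).
RETENTION_AFTER_RELATIONSHIP = timedelta(days=180)

def is_due_for_deletion(record: TraderRecord, today: date) -> bool:
    """Flag records whose DSA-driven retention period has expired."""
    if record.relationship_ended_on is None:
        return False
    return today > record.relationship_ended_on + RETENTION_AFTER_RELATIONSHIP
```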
So, what are the practical takeaways?
To stay compliant, organisations should map their processing activities to specific DSA obligations. For each duty (such as ad transparency, risk assessments, recommender options, trader traceability, and researcher access), identify the personal data involved, the GDPR basis for processing, and the retention and safeguard measures in place. Avoid relying on "the DSA says so" as a blanket justification; instead, tie the legal obligation strictly to the relevant DSA article and collect only the minimum necessary data.
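As a purely illustrative aid, the following Python sketch shows one way such a mapping could be kept as a simple structured register; the entries, field names and retention descriptions are assumptions of our own, not a template drawn from the guidelines.

```python
# Illustrative compliance register: entries and fields are examples only.
from dataclasses import dataclass

@dataclass
class ProcessingMapping:
    dsa_duty: str             # the specific DSA article relied on
    personal_data: list[str]  # categories of personal data involved
    gdpr_basis: str           # Article 6 (and, where relevant, Article 9) basis
    retention: str            # retention period tied to the DSA need
    safeguards: list[str]     # minimisation, access controls, etc.

register = [
    ProcessingMapping(
        dsa_duty="Article 26 DSA - advertising transparency",
        personal_data=["advertiser identity", "main targeting parameters"],
        gdpr_basis="Article 6(1)(c) GDPR - legal obligation",
        retention="per documented internal policy",
        safeguards=["data minimisation", "no special category inferences"],
    ),
    ProcessingMapping(
        dsa_duty="Article 30 DSA - trader traceability",
        personal_data=["trader name", "address", "payment account details"],
        gdpr_basis="Article 6(1)(c) GDPR - legal obligation",
        retention="limited period after end of the trading relationship",
        safeguards=["collection limited to verification needs", "restricted access"],
    ),
]

for entry in register:
    print(f"{entry.dsa_duty}: basis = {entry.gdpr_basis}; retention = {entry.retention}")
```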
Targeting controls should also be tightened. This means disabling targeting based on special category data, and on any inferences that could reveal special categories, as well as turning off targeted ads for minors and implementing proportionate age assurance. When offering recommender systems with choice, organisations should provide genuinely non-profiling options, ensure consent where tracking is used, and avoid dark patterns.
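The sketch below is one hypothetical way such controls might be expressed in code (Python, with invented category names and function signatures); it assumes a simple parameter-based targeting model and is not drawn from the DSA, the GDPR or the guidelines.

```python
# Illustrative targeting gate: category lists and names are assumptions for this sketch.
SPECIAL_CATEGORY_SIGNALS = {"health", "religion", "political_opinion", "sexual_orientation", "ethnicity"}

def allowed_targeting(user_is_minor: bool, has_tracking_consent: bool, targeting_params: set[str]) -> set[str]:
    """Return the targeting parameters that may be used after applying the guardrails above."""
    # No profiling-based advertising to minors.
    if user_is_minor:
        return set()
    # Tracking-based profiling needs valid consent (ePrivacy/GDPR).
    if not has_tracking_consent:
        return set()
    # Drop special category data and parameters that could reveal special categories by inference.
    return {p for p in targeting_params if p not in SPECIAL_CATEGORY_SIGNALS}

# Example: a minor receives no profiling-based targeting at all.
assert allowed_targeting(True, True, {"sports", "health"}) == set()
# Example: special category signals are stripped for adults with consent.
assert allowed_targeting(False, True, {"sports", "health"}) == {"sports"}
```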
It's also important to update privacy notices, Article 30 GDPR records of processing activities, and vendor contracts, aligning DSA transparency requirements with GDPR information duties to prevent duplication or contradiction. Organisations should carry out DPIAs for recommender changes, systemic risk mitigation, researcher sharing, and large-scale moderation profiling. When preparing for data access by researchers, build secure pipelines and governance structures, focusing on minimisation, access logging, and clear allocation of roles and responsibilities.
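By way of example only, the following Python sketch illustrates a minimised, access-logged export for researchers; the approved fields, function names and logging approach are assumptions for illustration rather than a prescribed design.

```python
# Illustrative researcher-access wrapper: field and function names are our own assumptions.
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
access_log = logging.getLogger("researcher_access")

APPROVED_FIELDS = {"content_id", "moderation_outcome", "timestamp"}  # no direct identifiers

def export_for_researcher(records: list[dict], researcher_id: str, purpose: str) -> list[dict]:
    """Return only approved, minimised fields and log every access."""
    access_log.info(
        "researcher=%s purpose=%s records=%d at=%s",
        researcher_id, purpose, len(records), datetime.now(timezone.utc).isoformat(),
    )
    return [{k: v for k, v in r.items() if k in APPROVED_FIELDS} for r in records]

# Example: user_id is dropped before sharing.
sample = [{"content_id": "c1", "user_id": "u42", "moderation_outcome": "removed", "timestamp": "2025-09-12"}]
print(export_for_researcher(sample, researcher_id="vetted-001", purpose="systemic risk study"))
```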
Finally, train your teams! This training should be across policy, product, ads, trust & safety, and privacy engineering, so everyone shares a unified understanding of what the DSA requires and what the GDPR allows.
Why does this matter?
The DSA raises the bar on transparency, safety and accountability for online platforms. The EDPB's message is clear: implement those duties through a GDPR-first lens.
This means if you can't justify the data, don't collect it; if you can't secure it, don't share it; if you can't explain it, don't deploy it. That approach will keep you compliant with both regimes, and out of the crosshairs of two sets of regulators!
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.