Digital Services Act - GDPR
Given the ubiquity of personal data, it is hardly surprising that the General Data Protection Regulation ("GDPR") extends its grasp across virtually all domains, save for a narrow set of exceptions. The Digital Services Act ("DSA") imposes its own set of obligations on intermediaries which, whilst not exactly clashing with the GDPR, create sufficient overlap to cause confusion: in exercising the obligations under one regulation, the obligations of the other are inadvertently triggered.
This leads us to the inevitable question: how can one comply with both?
In an attempt to answer this question, the European Data Protection Board ("EDPB") published its draft "Guidelines 3/2025 on the interplay between the DSA and the GDPR". These Guidelines seek to ensure a consistent and coherent application of both instruments by offering a degree of clarity on how intermediary service providers should handle personal data when complying with the DSA. They do so by highlighting specific issues and providing practical scenarios.
This article provides a high-level summary of the facets addressed in the Guidelines.
[Article 7] Own-initiative investigations
The DSA permits providers to carry out voluntary, own-initiative measures to detect or remove illegal content without losing liability exemptions. The EDPB stresses that such measures often involve processing personal data, acknowledging also that such processing may occur via tools like machine learning. From a data protection standpoint, two main scenarios emerge:
I. Voluntary detection and removal
Where a provider proactively scans for illegal content, any processing must comply with the GDPR's principles.1
Technically, such processing is not legally mandated. The most appropriate legal basis is thus legitimate interests under Article 6(1)(f) GDPR. However, this requires careful application of the three-step test: the interest pursued must be legitimate, the processing necessary, and the data subjects' rights must not override the interest.2 The EDPB highlights the importance of whether data subjects could reasonably expect such processing.
II. Processing under a legal obligation
If the processing stems from a clear EU or Member State obligation, such as copyright takedowns under the Copyright Directive or erasure requests under Article 17 GDPR, the legal basis shifts to Article 6(1)(c), compliance with a legal obligation. In such cases, the obligation must be clear, precise, and foreseeable in its application, with processing limited to what is strictly necessary.
The EDPB further warns that reliance on automated tools can create significant data protection risks, which are aggravated for very large online platforms and search engines ("VLOPs/VLOSEs"), where even small error rates can translate into tens of thousands of affected users. Where accounts are suspended or content is removed, such activity may qualify as a decision based solely on automated processing under Article 22(1) GDPR. Unless there is a human in the loop, one of the exceptions under Article 22(2) must apply, together with additional safeguards.
[Articles 16, 17, 20, 23] Notice-and-Action & Complaints
Notice-and-Action is one of the main pillars of the DSA. In operating such mechanisms, complaint-handling systems inevitably involve processing the personal data of notifiers, affected users, and sometimes even third parties.
I. Personal data of notifiers
Article 16 DSA envisages that notifiers may submit their name and email address when reporting illegal content. The mechanism should allow, but not require, personal identification, unless it is strictly necessary to establish illegality, such as in IP infringement cases. If the notifier's identity must be disclosed to the affected party, the disclosure must be limited to what is strictly necessary, and the notifier must be informed in advance, in line with Articles 13–14 GDPR.
II. Personal data of affected users
When action is taken against a user's content, providers must issue a statement of reasons under Article 17 DSA, which may include information on whether automated means were used to arrive at such a result.
Importantly, Article 20 DSA requires complaint-handling to be carried out under the supervision of appropriately qualified staff; fully automated decisions are not allowed. This aligns with Article 22 GDPR, which ensures that users are not subject to decisions based solely on automated processing. Along the same lines, given that certain actions3 can significantly affect data subjects, providers must ensure accuracy, transparency and data minimisation, as per the GDPR.
[Article 25] Dark Patterns
The DSA prohibits manipulative interface designs that impair a user's ability to make autonomous and informed decisions. On this, the EDPB provides a two-part test to determine whether a deceptive design pattern falls under the GDPR:
- Does the pattern involve the processing of personal data?
- Is the manipulation linked to how personal data is processed?
If the answer to both is "yes", the GDPR applies. For instance:
- Covered by GDPR: "Only 3 items left! Enter your email address now to reserve one." Here, manipulation is tied to collection of additional personal data.
- Not covered by GDPR (however DSA applies): "Only 3 items left!" aimed at general consumer urgency without triggering any data processing.
The EDPB also flags design patterns causing addictive behaviour, such as infinite scroll, autoplay, and gamification as "potentially manipulative in ways that relate to personal data processing". If these patterns are built on profiling or tracking users, they fall squarely within the GDPR's scope.
[Article 26] Advertising Transparency
The DSA requires platforms to show, in real time and directly from the advertisement itself, the main parameters used to determine why a particular user is targeted. This is a marked shift from the GDPR's ex-ante transparency obligations, which require information to be provided at the point of data collection.4
Profiling for advertising may trigger Article 22 GDPR if it produces legal or significant effects. Such qualification requires safeguards such as human review and the right to contest. Crucially, Article 26(3) DSA imposes an absolute ban on ads based on profiling using special categories of data,5 even if consent or other derogations under the GDPR could otherwise apply.
[Articles 27, 38] Recommender Systems
Algorithms that prioritise or rank content, also known as recommender systems, are central to platform design. Of course, when they rely on personal data, they fall directly under the GDPR.
Behavioural recommendations typically involve profiling. Where outputs shape significant opportunities,6 they may qualify as decisions under Article 22 GDPR, triggering its breadth of safeguards, such as transparency, human review and the right to contest. From the DSA's side, platforms must disclose the main parameters driving such recommendations and ultimately allow end-users to adjust them. For VLOPs/VLOSEs, at least one non-profiling option must be offered, displayed neutrally and without "nudging".
[Article 28] Protection of Minors
When the end-user is a minor, the DSA requires platforms accessible to minors to ensure a high level of privacy and security, and, chiefly, bans personalised advertising based on profiling. These duties can ground processing under Article 6(1)(c) GDPR (legal obligation), but only where processing is necessary and proportionate. The EDPB cautions against intrusive methods.7 Privacy-preserving or minimal approaches (e.g. age ranges, contextual signals) should always be preferred. Ultimately, platforms must not store exact ages or age ranges beyond what is needed to verify compliance.
[Articles 34, 35] Systemic Risk Assessments
VLOPs/VLOSEs must assess and mitigate systemic risks. Where systemic risks involve large-scale profiling, recommender systems, or adtech, a Data Protection Impact Assessment8 (Article 35 GDPR) will almost always be a mandatory exercise. Article 35 DSA measures (e.g. adapting interfaces, testing algorithms) must mirror data protection by design and by default under Article 25 GDPR.
Consultation
The EDPB is welcoming comments on the Guidelines. Feedback can be submitted on its portal by the 31st of October.
Conclusion
The draft Guidelines make one thing painfully clear: the DSA and the GDPR are not parallel tracks but intersecting lanes. Providers cannot treat them in isolation; as analysed above, compliance with the DSA will often trigger GDPR obligations.
Notably, both frameworks carry significant penalties.
For service providers, the challenge is not simply to "tick boxes" under each regulation; given modern software and its potential complexities, the main challenge is to design compliance strategies that acknowledge the overlaps. This is no small feat, as many obligations fall within the crosshairs of both regimes.
To put it bluntly, navigating this regulatory labyrinth requires careful planning and often, expert guidance.
Footnotes
1. E.g. lawfulness, accuracy, transparency, etc.
2. Read more on the three-step test.
3. Such as account suspension.
4. A good example of which is a privacy notice.
5. Article 9 GDPR.
6. Such as job vacancies, housing opportunities, etc.
7. Such as ID document uploads.
8. Article 35 GDPR.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.