In our earlier article, we discussed the European Commission's Digital Omnibus Regulation Proposal, an initiative to streamline and modernise the EU's digital rulebook. This follow-up takes a deeper dive into some of the data, e-privacy and cyber angles.
It is important to note that this Omnibus is subject to the usual EU legislative processes (e.g. "passing" the European Parliament) and so may well change.
Overview
Positioned as a "first step" towards optimising compliance and competitiveness, the proposal includes a set of "technical amendments" to "digital legislation" with a focus on "unlocking opportunities in the use of data, as a fundamental resource in the EU economy". To support this objective, the proposal includes targeted updates to "data protection and privacy rules" contained within Regulation (EU) 2016/679 (General Data Protection Regulation) (GDPR) and Privacy and Electronic Communications Directive 2002/58/EC (e-Privacy Directive).
Rather than rewriting the GDPR's requirements, the proposal focuses on core concepts: clearer definitions (notably redefining the parameters of personal data), clarified information duties and breach notification requirements, updated rules on automated decision making and on processing for AI training and development, and streamlined consent rules designed to reduce "cookie fatigue".
Key takeaways
- Personal data: a narrower, context-based definition
The proposal clarifies the definition of personal data so that information is not considered personal data for an organisation that has no realistic way of identifying the individual. Put simply, if an organisation cannot "reasonably" identify an individual from the data it holds, the GDPR would not apply to that data. This differs from the current interpretation, under which the definition is much stricter and more in line with a civil code approach: even if the organisation cannot identify an individual, the data is still treated as personal data if anyone else could reasonably re-identify the person using available means.
The proposed change moves away from this absolute approach, which often forces organisations to apply the GDPR to data that they cannot actually link to an individual. It also aligns with the recent decision of the European Court of Justice in EDPS v SRB (Case C-413/23 P), where the Court rejected an absolute approach and held that pseudonymised data is not always personal data. The Court also confirmed that "the identifiable nature of the data subject must be assessed at the time of collection of the data and from the point of view of the controller".
The proposal is in fact similar to how the Information Commissioner's Office (the ICO) in the UK approaches the scope of personal data. There will be much commentary about this Omnibus potentially being a result of the "Washington effect" battling, and winning over, the "Brussels effect", but is there in fact a soft "UK effect" in play here? Perhaps; see below in relation to the Omnibus's proposed changes to automated decision making and cookies, amongst other things.
- Pseudonymisation: scope and risk criteria
On a similar note, the proposal aims to clarify the scope of pseudonymous vs truly anonymous data by giving the European Commission and European Data Protection Board the authority to set criteria for:
- Determining when pseudonymised data should still be considered personal data, and
- Measuring the risk of re-identification from pseudonymised data.
The proposal moves toward a contextual, risk-based model and introduces EU-level criteria to ensure consistency. In practice, this means less guesswork for organisations, simpler compliance obligations and a more consistent approach across the EU.
- Targeted exceptions around processing special category data
The most controversial element of a leaked earlier version of the Omnibus has been dropped (namely the limited scope given to "inferred" special category data), but the Omnibus still suggests some changes.
Under Article 9 of the GDPR, processing special category data is generally prohibited unless a specific exemption applies. The proposal introduces two additional exemptions to that list namely:
- Biometric verification under user control. An exemption to the "general prohibition" on processing biometric data would be introduced where safeguards are put in place, where it is "necessary" to verify the "identity of the data subject" and, crucially, where the process stays in "the sole control" of the data subject, e.g. biometric identification to access apps where the biometric data is "securely stored solely" by the data subject or by the controller in "a state-of-the-art encrypted form". If enacted, this exemption would remain conditional on implementing the safeguards mentioned and ensuring full compliance with existing data protection principles.
- Use in AI development. The development of AI systems and models may involve the collection of large amounts of data, including special category data. To avoid hindering AI innovation, the proposal introduces an exception to the general prohibition on processing special category data where such data forms part of, and remains in, the "training, testing or validation data sets" of the AI system or model, subject to the controller implementing "appropriate technical and organisational measures". These measures include (1) actively preventing further use of such data "during the entire lifecycle of the AI system or model", (2) "effectively" removing it when identified, or (3) where removal would require a "disproportionate effort", "effectively protect[ing] such data from being used" in outputs or disclosed to third parties. This exception will not apply where the processing of special category data is "necessary for the purpose of processing" within the AI system or model, i.e. where processing special category data is genuinely necessary for the AI system's purpose. In such cases, controllers must rely on the existing exemptions within Article 9(2)(a)-(j) of the GDPR or the processing cannot take place.
- Training AI Models
The proposal sets out that "legitimate interest" will be explicitly codified as a lawful basis for processing personal data to train AI models, provided that appropriate safeguards are in place.
This means controllers must still conduct a GDPR balancing test and respect individuals' right to object (opt-out). However, this does not override stricter requirements in other EU or national laws, which may still mandate consent for certain types of data or contexts. Special category data remains subject to Article 9 safeguards (with the exceptions set out above), and additional conditions apply when processing for bias detection or correction.
Again this is something that broadly reflects the already extant UK position in this area.
- Tackling "abusive" data subject access requests (SARs)
The proposal seeks to amend Article 12 of the GDPR by clarifying that the right of access under Article 15 of the GDPR must not be "abused" by the data subject to obtain information about their personal data for "purposes other than the protection of their data".
Examples of abuse include pressuring controllers for compensation or threatening a claim for damages, making requests with the intent to cause damage or harm, or seeking benefits in exchange for withdrawing a request. To further support controllers, the proposal also sets out to establish a "lower burden of proof" for showing that a request is excessive than for showing that it is manifestly unfounded. It also adds that "overly broad and undifferentiated requests" should be considered "excessive", giving organisations a clearer ground for refusal.
There are again similarities here with ICO guidance about the scope of manifestly excessive or unfounded SARs. We will have to wait and see how and if this proposal can be squared with Article 8(2) of the Charter of Fundamental Rights of the European Union (namely "....Everyone has the right of access to data which has been collected concerning him or her").
- When you may not need to provide a privacy notice
The proposal lightens the load on businesses when it comes to informing individuals about how their data is processed. Where a controller collects data directly from a data subject, the proposal permits it to skip this requirement if "there are reasonable grounds to assume that the data subject already has the information", unless the data is being shared with others, transferred outside the EU or used for automated decision making, or the processing could pose a high risk to the data subject's rights.
- Requirements for automated decision making (ADM)
The proposal aims to clarify Article 22 of the GDPR in order to provide "greater legal certainty" for decisions made through ADM. It clarifies that, when deciding whether an automated decision is necessary for "entering into, or performance, of a contract", it does not matter whether the decision could be taken otherwise than by solely automated means. The idea is to make it easier to rely on the "necessary for the performance of a contract" basis for ADM.
This change is notable when compared to the UK's approach under the Data (Use and Access) Act 2025 (DUAA) (see our article), which goes even further towards a more innovation-friendly, permission-based regime, subject to safeguards, rather than maintaining the EU's prohibition with exceptions model. Organisations operating across both jurisdictions will still need to manage these differences carefully, particularly as the UK's stance will clearly allow wider AI deployment in many different areas.
- Breach notifications and incident reporting
The proposal introduces a more risk-based approach to breach notifications. Controllers would only need to notify the Data Protection Authority if the breach is likely to pose a high risk to individual rights, reducing unnecessary reporting for low-risk incidents. Importantly, this "higher threshold" for notification "does not affect the obligation of the controller to document the breach" (Article 33(5) of the GDPR). The proposal also gives organisations extra breathing room by extending the notification deadline from 72 to 96 hours.
In addition, the proposal creates a "single entry point" for reporting incidents, a model spanning the GDPR, the e-Privacy Directive, NIS2 Directive, DORA, and the Critical Entities Resilience Directive. In practice, this means a simpler, more streamlined process for compliance across multiple regulatory frameworks.
- Harmonising DPIA practices
Existing obligations require organisations to conduct a data protection impact assessment (DPIA) when the data processing is "likely to result in a high risk to the rights and freedoms of individuals". Currently, each EU member state maintains its own list of activities that require a DPIA, creating complexity for businesses operating across borders. The proposal seeks to harmonise these lists at EU level, thereby "replacing existing national lists" and reducing fragmentation and uncertainty. In addition, the European Data Protection Board will create a "common template and common methodology for conducting" DPIAs, making it easier for organisations to understand when and how to perform them. The result: clearer, more consistent guidance for assessing high-risk data processing.
- Expanding the scope of scientific research
The proposal aims to extend the definition of what constitutes scientific research by "clarifying the conditions". In addition, it proposes "to extend the exceptions from the information obligation for processing", meaning that when data is processed for scientific research purposes, organisations may benefit from relaxed transparency rules.
- Simplifying cookies and device level personal data
The interplay of the GDPR and the e-Privacy Directive has created "legal uncertainty" and "increased compliance costs" particularly for cookies and tracking technologies. Organisations often face fragmented enforcement, with different national regulators responsible for enforcing different parts of the framework, adding yet another layer of complexity.
The proposal aims to simplify this. It suggests that "processing of personal data on and from terminal equipment" (i.e. connected devices such as phones and personal computers) should be governed only by the GDPR, removing overlapping obligations under the e-Privacy Directive. This change would significantly reduce complexity for organisations handling device-based data.
The proposal also clarifies the consent requirements for accessing personal data stored on terminal equipment, bringing these activities squarely within the GDPR's scope. Importantly, the proposal mentions a list of exemptions where access and processing of personal data stored on terminal equipment will be lawful without consent to the extent it is necessary for:
- "carrying out the transmission of an electronic communication over an electronic communications network;
- providing a service explicitly requested by the data subject;
- creating aggregated information about the usage of an online service to measure the audience of such a service, where it is carried out by the controller of that online service solely for its own use;
- maintaining or restoring the security of a service provided by the controller and requested by the data subject or the terminal equipment used for the provision of such service".
And what about cookie banners? The proposal aims to tackle "consent fatigue" by updating the rules to ensure users have provided meaningful consent and introduces a six-month cooling off period, meaning if a user rejects consent an organisation cannot reapproach them for at least six months.
Again, elements of this mirror changes at the UK level via the DUAA, i.e. increasing the exemptions from consent for low-risk analytics cookies (see our article).
So what: why this matters for your organisation
- Greater legal certainty: The proposal clarifies ambiguous areas of the GDPR such as the definition of personal data, pseudonymisation, and automated decision making reducing uncertainty for organisations and regulators.
- Harmonisation across the EU: By introducing common templates, methodologies, and criteria (e.g. for DPIAs and pseudonymisation), the proposal aims to eliminate fragmentation caused by differing national interpretations, making compliance simpler for cross-border businesses.
- Practical relief: Streamlined breach reporting, targeted derogations for technological use cases, and flexibility around privacy notices reduce unnecessary administrative burdens without compromising data protection standards.
- Future proofing innovation: By addressing AI development, the proposal seeks to balance strong safeguards with enabling technological progress, ensuring the law remains relevant in a rapidly evolving digital landscape.
The question cuts both ways: does this go far enough to effect meaningful change to the EU's regulatory burden in this area, or, for some, does it go too far? We will have to see how the next two to three months play out in the EU's various legislative bodies.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.