The Digital Omnibus Regulation Proposal introduces a limited set of amendments to the GDPR affecting the practical operation of existing provisions. In substance, the amendments would: (i) amend the purpose-limitation/further-processing rule in Article 5(1)(b); (ii) clarify the definition of personal data; (iii) introduce targeted refinements to transparency and access mechanics; (iv) insert a new Article 33a on “single entry point” breach notification and amend Article 33 accordingly; (v) refine aspects of Article 22 GDPR in relation to automated decision-making; (vi) place the storage of, or access to, information on end-user terminal equipment on an explicit GDPR footing via a new Article 88a, where such storage or access involves the processing of personal data; (vii) insert a new Article 88b intended to enable standardisation around machine-readable preference/consent signalling; (viii) insert a new Article 88c on processing for the development, testing, training, deployment and operation of machine-learning models; and (ix) insert a new Article 41a on implementing acts for pseudonymisation/identifiability criteria. The amendments are framed as operational and procedural streamlining, and do not purport to reopen the GDPR’s core principles or the structure of lawful bases and rights. However, several proposals would materially change how certain obligations operate in practice (notably personal data breach notification and the GDPR/ePrivacy boundary for terminal-equipment storage and access where personal data is involved).
Proposals most likely to be adopted and rationale
This section addresses, in turn: (1) codification of a relative, entity-specific concept of “personal data” (Article 4 GDPR); (2) targeted AI and AI-adjacent GDPR clarifications (including lawful basis and special category handling in defined circumstances); (3) amendments to personal data breach notification procedures; (4) amendments to automated decision-making provisions; and (5) measures intended to standardise aspects of data protection impact assessments (DPIAs). It also addresses the proposed Article 41a implementing-acts mechanism on pseudonymisation/identifiability.
1. Codification of a relative, entity-specific concept of “personal data” (Article 4 GDPR)
The Proposal amends Article 4(1) GDPR to clarify that information is not automatically “personal data” for every person or entity merely because another entity can identify the data subject. This codifies a relative, entity-specific approach to identifiability, consistent with Recital 26 (in assessing whether information is personal data, account should be taken of all the means reasonably likely to be used) and CJEU case law confirming that “personal data” is assessed by reference to the relevant controller’s realistic ability to identify the individual (including Case C-582/14, Breyer and Case C-413/23 P, EDPS v SRB).
Legislatively, the Proposal does this by adding three sentences to Article 4(1) GDPR clarifying that: (1) information relating to a natural person is not necessarily personal data for every other person or entity merely because another entity can identify that person; (2) information is not personal data for a given entity where that entity cannot identify the individual, taking into account the means reasonably likely to be used by that entity; and (3) such information does not become personal data for that entity merely because a potential subsequent recipient has means reasonably likely to be used to identify the individual.
Under the proposed Article 41a, the Commission is empowered to adopt implementing acts specifying means and criteria for determining whether data resulting from pseudonymisation no longer constitute personal data for certain entities, having regard to the state of the art. The Commission must closely involve the European Data Protection Board (EDPB), which must issue its opinion within an eight-week deadline. Given the constitutional sensitivity of scope questions, this route to implementing acts is likely to be scrutinised closely in trilogue for legal basis, limits, and safeguards (including to avoid de facto re-definition via secondary measures).
This reflects the relative approach developed in CJEU case law (and long signposted in Recital 26), including the decisions in Breyer and EDPS v SRB, and the Court’s more recent identifiability jurisprudence referenced in the proposal materials, and is intended to address the historically more “absolute” interpretation sometimes advocated in supervisory practice. If adopted, it would strengthen the argument that the same dataset (including pseudonymised data) may fall outside the scope of the GDPR for one holder who lacks reasonably likely means to re-identify, while remaining personal data for another holder who can re-identify (or can reasonably likely access the additional information needed).
Practical impact: this redefinition of “personal data” is most significant in multi-party data flows (for example, controller–processor chains, and data-sharing chains under the Data Act), where different actors have different abilities to re-identify an individual from data. The third new sentence appears designed to avoid a “knock-on” effect whereby data becomes personal data for the sender merely because the recipient can identify the individual (a point that has generated uncertainty in light of parts of the CJEU’s reasoning in cases such as Scania and SRB). However, it does not remove the need for careful role allocation and (where relevant) analysis of Article 28 data-processing clauses in multi-party chains. The Proposal does not eliminate practical legal questions that will need to be worked through in contracting and guidance, most notably: (a) what this means for Article 28 GDPR role allocation and processor terms in scenarios where the recipient cannot identify but processes on behalf of a controller who can; and (b) how any entity-relative identifiability clarification is treated in Chapter V GDPR transfer risk assessments where the exporter/importer have different practical means of re-identification.
In practice, the changes are designed to reduce uncertainty in multi‑party data flows (including data‑sharing and product/service ecosystems) where data may be personal data for one actor but not for another. For legal drafting and governance, the immediate implications are (i) more granular role analysis in chain processing and data-sharing ecosystems (including Data Act disclosure chains), and (ii) more explicit allocation, contractually and operationally, of who can identify, who holds the additional information, and who bears which GDPR duties as a result.
This clarification has direct contract and governance implications for data‑sharing arrangements (including under the Data Act). Organisations should now reassess whether recipients are receiving “personal data” for GDPR purposes, and where necessary realign contractual roles, transparency obligations and allocation of compliance responsibilities accordingly.
Note: this change is conceptually distinct from anonymisation. It does not redefine anonymisation, but clarifies when identifiability is not reasonably likely for a given entity (and therefore when the GDPR does not apply to that entity’s handling of the information).
2. AI-linked provisions (closely connected to scope and identifiability)
The Proposal also introduces new GDPR provisions that are expressly relevant to AI development and deployment. In particular, it would clarify (in the GDPR text) that legitimate interests may be relied upon for the development, training, testing and deployment/operation of AI models, subject to the standard necessity and balancing analysis, assurance of individuals’ right to object, and appropriate safeguards (including minimisation and measures addressing residual risks).
In particular, a new Article 88c (Processing of personal data for development, testing, training, deployment and operation of machine learning models) provides that such processing may be carried out on the basis of Article 6(1)(f) where it is necessary for the purposes of legitimate interests pursued by the controller or by a third party, subject to a documented legitimate interests assessment (LIA), the right to object, and appropriate technical and organisational measures, including measures addressing (among other things) output-related risks and re-identification risk. This is also expressly framed as being without prejudice to any EU or Member State laws that require consent in specific contexts.
In parallel, the Proposal would add targeted amendments under Article 9 in relation to special category personal data. First, it would permit (subject to strict conditions) the residual processing of special category data in AI development and operation where the controller does not aim to process special category data but such data are nevertheless processed, requiring state-of-the-art measures to prevent collection and minimise processing, identify and remove such data, and prevent it being used to produce outputs or being disclosed to third parties, with a limited exception where data removal would result in disproportionate effort. Second, it would introduce a derogation facilitating certain on-device biometric uses where the biometric data remains under the sole control of the user/data subject. These adjustments to Article 9 are likely to be politically sensitive, particularly the exception for disproportionate effort, and are therefore more likely to be tightened through clearer statutory criteria, evidential requirements, and safeguards.
3. Amendments to personal data breach notification requirements (Articles 33–34 GDPR)
The Digital Omnibus Regulation Proposal amends Article 33 GDPR and inserts a new Article 33a GDPR (single entry point) to standardise how notifications are made, without removing the obligation to notify supervisory authorities or affected individuals where the statutory thresholds are met.
In particular, the Proposal:
- amends Article 33 to extend the notification deadline to 96 hours;
- raises the notification threshold by requiring notification to the supervisory authority only where the breach is likely to result in a high risk to the rights and freedoms of natural persons (so the Article 33 trigger is aligned to “high risk”);
- provides for the adoption of an EU standardised reporting model including EU-level notification criteria and templates, prepared by the EDPB and adopted by the Commission by means of implementing acts pursuant to Article 291 TFEU;
- does not amend Article 34 GDPR; the duty to notify data subjects where a breach is likely to result in a high risk remains in Article 34 as currently drafted.
The Proposal also introduces a single-entry-point, coordinated reporting model, requiring controllers to notify via the “single entry point” established under Article 23a of the NIS2 Directive once it is in place, and (until then) to notify the competent supervisory authority directly.
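The reporting changes described above can be reduced to a simple decision rule. The sketch below is purely illustrative: the 96-hour window and the “high risk” supervisory trigger reflect the Proposal as summarised here, but the function and field names are hypothetical, not drawn from any statutory text or standard.

```python
from datetime import datetime, timedelta

# Proposed Article 33 deadline: 96 hours from awareness of the breach.
NOTIFICATION_WINDOW = timedelta(hours=96)

def breach_notification_plan(aware_at: datetime, high_risk: bool) -> dict:
    """Map a breach to notification steps under the proposed regime.

    Under the Proposal, supervisory notification (Art. 33) would be
    triggered only by a breach "likely to result in a high risk";
    data-subject notification (Art. 34) already uses that threshold
    and is left unchanged.
    """
    return {
        "notify_supervisory_authority": high_risk,  # proposed Art. 33 trigger
        "notify_data_subjects": high_risk,          # Art. 34 as currently drafted
        "deadline": aware_at + NOTIFICATION_WINDOW if high_risk else None,
    }
```

Note that under the current GDPR both triggers differ (Article 33 uses a plain "risk" threshold and a 72-hour deadline); the sketch models only the proposed regime.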
These amendments are not simply a procedural standardisation exercise. While the Commission frames them as simplification, the combination of an extended notification deadline and a “high risk” trigger for supervisory notification would, if adopted, materially reduce the volume and speed of supervisory visibility relative to the current Article 33(1) risk threshold, and would therefore change the practical supervisory detection and enforcement dynamic.
4. Amendments to automated decision-making provisions (Article 22 GDPR)
The Proposal replaces Article 22(1) and (2) GDPR to clarify the scope of the prohibition on decisions based solely on automated processing and, in particular, the interpretation of the exception for decisions “necessary for entering into, or performance of, a contract” under Article 22(2)(a). It is presented as targeting the interpretation of Article 22(2)(a) (contract necessity) rather than altering the structure of Article 22(2)(b) (Union/Member State law authorisation) or Article 22(2)(c) (explicit consent), and it preserves the safeguard measures in Article 22(3).
In substance, the Proposal would confirm that ‘necessity’ for contractual purposes is not defeated merely because (in theory) a similar decision could be taken by a human, and it adds an express “regardless of whether it would be possible to have a decision taken on a non-automated basis” clarification. It would seek to reduce divergent supervisory interpretations of when automated decision-making may rely on the contractual necessity exception, while keeping the Article 22(1) prohibition, the Article 22(2) exceptions, and the Article 22(3) safeguards intact.
These amendments are likely to be adopted because they clarify the scope of an existing exception and address divergent interpretations applied by supervisory authorities, without removing the prohibition in Article 22(1) or the associated data subject safeguards. They are framed as a clarification rather than an expansion of permissible automated decision-making.
5. Standardisation of DPIA procedures (Article 35 GDPR)
The Proposal amends Article 35(4)–(6) GDPR to standardise the identification of processing operations requiring a DPIA and the manner in which DPIAs are conducted.
In particular, the Proposal:
- requires the EDPB to prepare Union-level lists of processing operations subject to, or exempt from, DPIA requirements;
- provides for a common DPIA template and methodology, to be adopted by the Commission through implementing acts;
- provides for periodic review of those instruments.
Given the current Article 35(4)–(5) GDPR structure (national lists, consistency mechanism), negotiations are likely to focus on how Union-level lists interact with, replace, or constrain Member State lists and existing Article 64 GDPR consistency practice.
These measures are likely to be adopted as they do not affect the underlying obligation to carry out a DPIA, but instead address fragmentation and inconsistency of approach in current supervisory practice.
Proposals more likely to be challenged or rejected, and rationale
This section addresses, in turn: (i) the transfer of terminal-equipment processing rules into the GDPR; (ii) proposed limitations on information and access obligations; (iii) evidential-burden and supervisory-visibility issues arising from controller-assessed limitation concepts; and (iv) legal certainty and enforceability issues arising from the drafting technique adopted by the Proposal.
1. The transfer of terminal equipment personal-data processing rules into the GDPR
The Proposal provides that, insofar as the storage of, or access to, information on end-user terminal equipment involves the processing of personal data, the applicable legal framework is the GDPR rather than sector-specific electronic communications legislation.
In practice, this is reflected through the insertion of new GDPR Article 88a (covering terminal equipment storage/access where personal data is processed) and new GDPR Article 88b (a mechanism intended to support standardised, machine-readable preference/consent signals).
The Proposal is also framed as addressing so-called “cookie consent fatigue” through (i) a requirement for a single-click refusal option where consent is relied upon, (ii) restrictions on repeat consent prompts for the same purpose for a defined period (6 months where the end user has refused consent for that purpose), and (iii) a move toward machine-readable preference signalling via browsers/operating systems over staged implementation timelines.
This aspect of the Proposal is more likely to be challenged or materially narrowed during the legislative process. The transfer affects an area that has long been characterised by a distinct regulatory approach under the ePrivacy Directive, including differentiated consent requirements and the involvement of electronic communications regulators alongside data protection authorities.
Concerns likely to be raised during negotiations include:
- whether relocating these rules into the GDPR alters the substantive standard of protection applicable to end users, even if formally presented as a change of legal basis rather than of substance;
- the allocation of supervisory competence, in particular the respective roles of data protection authorities and communications regulators; and
- the risk that the Proposal is perceived as resolving, indirectly and without full debate, issues that have remained contested in the context of the stalled ePrivacy Regulation.
For these reasons, while complete abandonment of this element is unlikely, the final text will probably define the scope of the transfer from the ePrivacy Directive to the GDPR more narrowly than originally proposed. It may also include express safeguards or clarifications preserving the level of protection applicable to end-user terminal equipment, and/or rely more heavily on recitals to constrain interpretation and enforcement.
2. Amendments to information and access obligations in defined circumstances
The Digital Omnibus Regulation Proposal amends Articles 13, 14 and 15 GDPR to introduce statutory limitations on the information and access obligations in defined circumstances, particularly where compliance would be impossible or would involve a disproportionate effort, subject to defined safeguards. These proposals are likely to attract heightened scrutiny in trilogue because, although presented as “simplification”, they in substance widen controllers’ ability to withhold information or to refuse, or charge for, access by expanding the circumstances in which those obligations may be limited or disapplied, rather than removing them outright. The breadth of the proposed amendments, and the discretion afforded to controllers, are likely to be controversial.
In particular, the Proposal:
- broadens the practical availability of the existing Article 13 exemption by amending Article 13(4) GDPR to provide that the information requirements in Article 13(1)–(3) do not apply where (i) there are “reasonable grounds to expect” that the data subject already has the relevant information regarding processing of their personal data, and (ii) the processing is not likely to result in a high risk to the rights and freedoms of natural persons within the meaning of Article 35 GDPR, subject to defined exclusions (including where the processing involves disclosure to recipients/third countries or engages automated decision-making safeguards). This formulation is likely to be challenged on the basis that it lowers the practical standard for controllers (particularly in large-scale HR / platform contexts) to infer knowledge and thereby reduce transparency, unless coupled to stronger, objectively verifiable criteria and record-keeping duties;
- inserts a new Article 13(5) GDPR providing that information obligations do not apply where personal data are processed for scientific research purposes and disclosing information to the data subject would prove to be impossible or would involve “a disproportionate effort” or could “seriously impair the achievement of the objectives of that processing”, subject to the safeguards required by Article 89(1) GDPR and the implementation of appropriate alternative measures to protect data subjects’ rights and freedoms. This is likely to be narrowed unless the final text makes clearer (i) what constitutes “disproportionate effort”, (ii) what “alternative measures” are sufficient in practice, and (iii) how the controller evidences its assessment for supervisory authority scrutiny; and
- introduces a corresponding limitation in Article 15 GDPR, permitting restriction of the right of access in scientific research contexts on the same conditions and subject to the same safeguards.
The Proposal also expands and codifies the circumstances in which controllers may treat data subject access requests as excessive or abusive (including overly broad and undifferentiated requests and requests pursued for purposes unrelated to data protection, where the controller can evidence the abusive intent). The explanatory materials expressly reference the use of DSARs as a ‘back-door’ for discovery within the context of litigation or employment claims.
Under the Proposal, controllers would be expressly permitted either to refuse such requests or to charge a reasonable fee, while remaining responsible for demonstrating that the applicable threshold is met. This reform operates independently of the research-specific limitations introduced in Articles 13 and 15 and the Article 89(1) safeguard framework, and the two regimes should not be conflated.
However, this abuse concept is legally controversial because it risks reframing access rights as presumptively suspect in precisely the contexts where access is used to test lawfulness and accountability (including large-scale HR, adtech/platform, and algorithmic decision-making scenarios).
Notwithstanding this structuring, the proposed expansion of the “manifestly unfounded or excessive” concept is likely to be one of the most heavily contested aspects of the Omnibus package. Critics argue that it risks diluting a core enforcement right under the GDPR, diverges from existing Court of Justice case law emphasising the fundamental nature of access rights, and may disproportionately affect individuals seeking to understand or challenge the use of their data.
For these reasons, while some clarification of refusal thresholds may ultimately be retained, the current breadth of the proposals affecting Articles 13–15 GDPR is more likely to be narrowed, conditioned, or materially amended before adoption, and may remain vulnerable to post-adoption legal challenge if enacted in their current form.
3. Evidential burden and supervisory visibility under the proposed information, access, and breach notification limitations
Building on the issues addressed in Sections A.3 and B.2 above, a further negotiation pressure point is the evidential burden, legal certainty, and supervisory-visibility implications of limitation concepts assessed by the controller (including “reasonable grounds to expect” knowledge, “disproportionate effort”, and the expanded excessive/abusive request tests), and of controllers’ reliance on those exceptions.
Consistent with the Commission’s simplification approach, a key negotiation issue will be whether the operative text clearly establishes minimum evidential duties (contemporaneous records, reasoned decisions, auditability and retention of the justification) so that limitation decisions remain practically enforceable under Chapter VI GDPR.
In particular:
- With respect to Articles 13–15 GDPR, the Proposal introduces controller-triggered limitation concepts that are inherently evaluative (including “reasonable grounds to expect” prior knowledge and “disproportionate effort”), and which therefore raise legal-certainty questions about (i) what minimum evidence a controller must hold at the point of limitation, (ii) how those assessments are to be tested by supervisory authorities under Chapter VI, and (iii) the extent to which Member State and authority practice will diverge absent tightly defined statutory criteria.
- In particular, where the Proposal relies on relationship- or context-based qualifiers (for example, tests that depend on the nature of the controller–data subject interaction, the scale of processing, or inferences about what a data subject can reasonably be expected to know), this is likely to raise concerns as to consistency of application and evidential burden allocation in supervisory investigations. During negotiations, these provisions may therefore be characterised as enabling an erosion of routine transparency unless coupled with objectively verifiable criteria, express record-keeping duties, and safeguards addressing the asymmetry of information between controller and data subject.
- Similarly, the amendments affecting Articles 33–34 GDPR may be scrutinised not only as a change of threshold (supervisory notification limited to “high risk”) and timing (96 hours), but also for their effect on supervisory visibility and systemic detection.
For these reasons, this set of amendments is more likely to be tightened or further conditioned during the legislative process, through:
- more precise statutory definition of controller-assessed limitation concepts (including minimum evidential thresholds and contemporaneous documentation requirements);
- clearer linkage to existing GDPR risk and accountability mechanisms (including Article 35 where “high risk” is used as a gating concept); and
- additional safeguards preserving supervisory visibility and effective remedies, including auditability of refusal/limitation decisions and enforceable standards for standardised breach reporting.
4. Issues of legal certainty and enforceability arising from the drafting of the Proposal
A further set of concerns likely to arise during the legislative process relates to legal certainty and enforceability of several proposed amendments.
A number of the Proposal’s mechanisms rely on assessments to be made by controllers as the trigger for limiting or modifying the application of core GDPR obligations. These include assessments as to whether a relationship is “clear and circumscribed”, whether there are “reasonable grounds” to expect the data subject already has the relevant information, and whether compliance would be impossible or disproportionately burdensome.
From an enforcement perspective, these tests raise questions as to:
- the allocation of the burden of proof where a controller relies on one of the proposed limitations;
- the evidential standards to be applied by supervisory authorities when exercising their investigative and corrective powers under Chapter VI GDPR;
- the risk of divergent application across Member States, particularly in cross-border processing scenarios subject to the consistency mechanism under Articles 60–63 GDPR. The risk is greatest where a controller applies these judgement-based limitations across multiple Member States: if supervisory authorities take different views on what evidence is required, the one-stop-shop cannot deliver consistent outcomes.
In addition, where amendments introduce discretion without corresponding procedural safeguards, there is an increased likelihood of litigation before national courts and the EU Court of Justice (CJEU), particularly in cases involving refusals of access requests or failures to notify breaches.
As a result, these provisions are likely to be refined during negotiations to:
- clarify the evidential requirements applicable to controllers relying on the proposed limitations;
- tighten the interaction between the new tests and existing GDPR concepts, including proportionality and risk-based assessment;
- reduce the flexibility of interpretation in order to support consistent supervisory enforcement.