Digital Omnibus: What Would it Mean for Competition and Privacy in Advertising?

In November 2025, the European Commission (“Commission”) proposed a “Digital Omnibus” regulation to amend several EU legislations.[1]

Although presented as a simplification exercise to strengthen Europe’s competitiveness, the proposal revisits key provisions of the General Data Protection Regulation (GDPR) and the ePrivacy Directive concerning the use of personal data, including for advertising purposes. The changes could have far-reaching consequences for data-driven marketing and media companies. This is particularly evident in the joint comments on the draft legislation by the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS), issued on 10 February 2026.[2] Both bodies call for amendments that would make it even more difficult to operate ad-funded services in the “Open Web”.

In February 2026, the Council, representing the EU Member States, adopted its initial position on the Commission’s proposal.[3] The European Parliament is expected to vote on the proposal in the second or third quarter of 2026. 

This article provides a general overview of the Digital Omnibus, before outlining and discussing the key proposals affecting advertising and media markets, and the current state of the debate. 

A) Draft Digital Omnibus: an overview

The “Draghi”[4] and “Letta”[5] reports on (weak) EU competitiveness highlighted a need for legal simplification. With its 153-page Digital Omnibus, the Commission aims to simplify the law by reviewing and consolidating existing digital legislation. The objective is to create a more cost-effective and innovation-friendly regulatory environment, without undermining the agreed objectives and high legal standards of existing legislation.

To this end, the Digital Omnibus merges provisions of the “Data Governance Act”, the “Free Flow of Non-Personal Data Regulation” and the “Open Data Directive” into a single, restructured and updated “Data Act”. Moreover, the “Platform-to-Business Regulation” is to be repealed on the (incorrect[6]) assumption that its main provisions have been rendered redundant by the “Digital Services Act”.  

While also addressing the use of personal data for AI training, the Digital Omnibus is accompanied by a second “Digital Omnibus on AI” package[7] aimed specifically at amending the “AI Act”.

At its core, however, the Digital Omnibus proposes changes to the GDPR and the ePrivacy Directive. It acknowledges that the current approach to data protection has led to a “consent fatigue generated by cookie banners.”[8] To address this issue, a re-definition of “personal data” is proposed (see below at B). Certain types of data processing, presumed to be low-impact and low-risk, would no longer require consent (see below at C). Where consent remains necessary, additional requirements are proposed for the storage and use of stored data (below at D). In addition, browsers would be required to enable end users to grant or reject consent in a centralised manner, meaning that a single decision at the browser level would apply across all web services (below at E).

B) Changes to the definition of personal data

The most profound proposal is to re-define what constitutes “personal data”, which triggers the application of the GDPR.

I. Proposed changes 

    The Commission proposes to clarify in Article 4 GDPR that “information shall not be personal for a given entity where that entity cannot identify the natural person to whom the information relates, taking into account the means reasonably likely to be used by that entity”. It also proposes empowering the Commission to adopt implementing acts specifying when pseudonymised data no longer qualifies as personal data.

The Council rejects this change. It agrees that the European Court of Justice’s recent ruling in EDPS v SRB[9] (which held that data is personal only if the data subject is identifiable) calls for clarification of the GDPR. But the Council considers such clarification a task for the EDPB. In the Council’s view, the EDPB should specify when a person is identifiable and which means are reasonably likely to be used for identification. This should include criteria for determining when pseudonymised data may no longer be personal data for certain entities. The core issue would then be one of competence: should clarification come from the Commission through implementing acts, or from the EDPB through guidelines?

Unsurprisingly, the EDPB and EDPS support the Council’s view. They argue that the Commission’s proposal could narrow the GDPR’s scope and weaken the protection of the fundamental right to data protection. Regarding the proposed implementing acts, they warn that the Commission could effectively redefine, at least in part, the GDPR’s scope of application. Such acts might also increase legal complexity, contrary to the aims of the Omnibus proposal. They consider it preferable for the EDPB to issue further guidance (something it already intends to do), particularly on how pseudonymisation affects the classification of data, taking into account relevant CJEU case law.

    II. Assessment 

      The Commission seeks to better balance legitimate business interests in using data to improve services with the need to protect users’ privacy. The EDPB and EDPS appear too quick to reject any narrowing of the concept of “personal data”. If the GDPR is to function as a risk-based framework, a more differentiated approach is justified.

Where identification is not reasonably possible for a specific controller, the risk to individuals is significantly reduced. In such cases, extending the full scope of GDPR obligations uniformly to all entities, regardless of their actual ability to identify individuals, is disproportionate.

      Overprotection for its own sake risks creating unnecessary compliance burdens, weakening competitiveness without delivering meaningful privacy gains. A definition aligned with realistic identification risks could particularly benefit smaller publishers and ad-tech intermediaries that process pseudonymised or anonymised data without access to identity-resolution infrastructure, without this undermining fundamental rights. Although additional guidance from the EDPB may be helpful, it cannot offer the same level of legal certainty as a legislative amendment.

      The Commission’s proposed amendment could incentivise the responsible use of pseudonymisation techniques that effectively prevent (re-)identification. To further reduce the proliferation of consent banners, the concept could even be extended to anonymised data, not only pseudonymised data.

At the same time, the power to adopt implementing acts should be used prudently. While it may enhance clarity, overuse could lead to regulatory instability and added complexity. Ideally, the Commission would cooperate closely with the EDPB.

      C) Changes to consent requirements for data processing 

I. Processing of personal data in the terminal equipment

1. Proposed changes

      The Commission proposes introducing a new Article 88a GDPR on the “processing of personal data in the terminal equipment of natural persons”. This would effectively incorporate the current cookie and tracking rules of the ePrivacy Directive into the GDPR.

      As under the existing ePrivacy framework, Article 88a(1) would make consent the default requirement for storing personal data on a user’s device or accessing data already stored there.

      Article 88a(3), however, provides that consent is not required where such processing is strictly necessary for one of the following purposes:

(a) transmitting an electronic communication over an electronic communication network;
(b) providing a service explicitly requested by the data subject;
(c) creating aggregated information about audience measurement data, where this is carried out by the controller solely for its own use;
(d) maintaining or restoring the security of a service, as requested by the user or the terminal equipment used for the provision of the service.

2. Assessment

a) Inconsistent integration of the ePrivacy Directive

Integrating the ePrivacy Directive’s provisions on access to end-user devices into the GDPR largely unchanged poses multiple problems.

It risks creating new legal ambiguities and administrative burdens. Article 5(3) of the ePrivacy Directive would still apply to access to non-personal data, while the GDPR would govern access to personal data. Data controllers would need to determine the nature of the data to know which framework applies. As the EDPB and EDPS conclude[10], this contradicts the goal of simplification. Moreover, the current proposal would lead to a paradoxical result: access to non-personal data would fall under the stricter regime of the ePrivacy Directive.

Additionally, the proposal disregards certain fundamentals of the GDPR. Accessing an end-user device is not the same as accessing data of a single identifiable natural person, as, for example, multiple people might use the same device. This form of data access therefore poses a lower risk than others. Making its legality invariably dependent on prior consent contradicts the GDPR’s risk-based framework, especially in view of Article 6(1)(f) GDPR, which allows processing based on legitimate interest. Taking into account these fundamental principles and the need for compliance simplification and a reduction of “cookie fatigue”, a possible solution could be to permit low-risk processing without consent under Article 6(1)(f) GDPR. Instead, however, the Commission decided to propose specific exemptions.

      b) Inconsistent consent exemptions 

In principle, clearly defined consent exemptions enhance legal certainty and are welcome. However, the current list raises serious concerns – not primarily for privacy, but for competition. In an effort to reduce the number of disliked consent banners, the Commission appears to have tried to identify data processing activities it considers low risk to privacy. It seems to assume that risk is limited where a data subject has requested a service and where the controller uses the data only internally, without sharing it with third parties. This poses two problems.

Firstly, it ignores other low-risk forms of data access and processing, such as contextual ads, frequency capping, traffic validation or fraud detection, which are essential for an open, ad-financed internet. Not including these purposes seems especially perplexing considering that the Commission proposed an exception to the consent requirement for the processing of personal data in the context of the development and operation of AI.[11] Using data for AI training or output poses a much greater risk to privacy than the limited processing used to provide ad-funded publisher offerings. This is incompatible with the risk-based approach the GDPR should embody.

      Secondly, there is no evidence that large-scale company-internal collection, storage, and processing of personal data is less risky or less intrusive than sharing a portion of such data between companies. Economic reality suggests the opposite. The most privacy-invasive practices have often been carried out by large digital firms that combine and analyse user data across multiple services to build detailed profiles for micro-targeting. A digital gatekeeper that may monitor logged-in users across several devices and services possesses such vast amounts of first-party data that it can create far more detailed and problematic user profiles than any small or medium sized publisher or advertiser ever could, even one with access to third-party data.[12] As long as company-internal micro-targeting remains unaffected, the assumption that merely limiting cross-site data exchanges would meaningfully reduce ‘surveillance capitalism’ appears over-optimistic at best – and, at worst, risks functioning as a competition shield for designated gatekeepers. 

      Contrary to the expectations originally associated with the GDPR[13], the legislation has not reduced the market power of major technology companies such as Google or Meta. Instead, it has strengthened their position.[14] One reason is that the GDPR has increased the competitive value of access to large volumes of data. The larger a platform, the more users it attracts, the more intensively they engage, and the more dependent they become. This makes it easier for the platform to gather extensive first-party data, regardless of consent requirements, and reduces its reliance on third-party data sharing.[15]

      As a result, the GDPR’s consent requirement for data sharing has affected small and medium-sized companies more strongly. These firms typically have access to less first-party data and depend more on external data sources. The rules have therefore reinforced the relative data advantage of large platforms.

The proposed consent exemptions risk further increasing this imbalance, to the benefit of the largest digital companies, most of which are not based in Europe. This would not strengthen EU competitiveness but further weaken it. This concern is particularly evident in the proposed exception for in-house audience measurement.

      II. Consent exception for in-house audience measurement (only)

      1. Proposed changes

The proposed Article 88a(3)(c) GDPR, cited above, provides a consent exemption for audience measurement. However, the exemption is very narrowly defined.

      First, it is limited to “aggregated information.” The term is not defined in the GDPR. It is generally understood to mean compiled individual data points presented in summary form. This suggests that collecting data on overall website views would not require consent. By contrast, collecting data on a specific website visit would still require consent, as such data is individual rather than aggregated. More detailed information about a single user, for example data that enables deduplication or cross-site measurement, would likely fall outside this exception.

      Second, and more importantly, the exemption applies only where audience measurement is carried out by the controller of the measured service and “solely for its own use.” This wording excludes tools that operate across multiple websites or media. As a result, many commonly used audience-measurement solutions would remain subject to consent requirements. At the same time, measurements conducted within a large digital ecosystem controlled by a single operator would not.

      The EDPB and the EDPS support such strict limitations for consent exceptions. In their view, the exception should be confined to anonymous aggregate data. Further processing or combination with other datasets should be prohibited, and the data should be used only internally. They also suggest clarifying that data may be collected either by a provider of an online service solely for its own use, or by a processor acting on behalf of that provider, such as a professional audience-measurement provider engaged by the controller.

      2. Assessment 

      The proposed changes would benefit digital gatekeepers, as designated under the DMA, to the detriment of all other media owners, in particular small and medium-sized publishers in the Open Web. 

      a) EMFA strengthens independent audience measurement 

        “Audience measurement” is defined as the collection, interpretation or other processing of data about the number and characteristics of users of media services for the purposes of decisions regarding advertising allocation, pricing, purchases or sales.[16]  

        As highlighted in the Commission’s impact assessment for the European Media Freedom Act (EMFA), “audience measurement is of key importance for the media and advertising ecosystem, being the core tool for understanding the market dynamics, calculating advertising prices, allocating advertising revenue, and planning the content production in accordance with the preferences of the audiences.”[17]

        The relevance of audience measurement stems from the fact that advertisers compare sellers of ad space by their ability to reach the intended audience at the right time with the right marketing message. The primary basis for such comparison is the number of unique users within a defined target audience that a channel reaches and the frequency of such contacts. 

Sellers of ad space have an incentive to inflate the performance of their ad channel. To counter this, in Europe, all affected market players from the sell side (i.e. media owners), the buy side (i.e. advertisers) and their agencies have traditionally come together to self-regulate their industry. They agreed upon industry standards for audience measurement and organised self-regulatory bodies, Joint Industry Committees (JICs), to carry out independent measurement.

This approach has worked well for traditional media for decades. However, independent cross-site measurement of online advertising has faced one “main issue”[18]: the reluctance of international players, in particular Google and Meta, to have their ad-financed platforms measured according to agreed industry standards by independent organisations such as JICs. As explained in Recital 69 of the EMFA, “certain new players […] do not abide by the industry standards or best practices agreed through industry self-regulatory mechanisms and provide their proprietary measurement services without making available information on their methodologies.”

Regrettably, more often than not such proprietary measurement services are abused to inflate the performance of the platform and degrade that of rival media services.[19] As such platforms measure their reach and usage internally and do not share all raw data, advertisers quickly find themselves at the mercy of the platforms when it comes to the timing, methodology, depth and accuracy of reporting.

To counter this harmful development and foster the ability of advertisers to make informed choices, Article 24 EMFA was enacted. It requires that providers of audience measurement systems ensure that their systems and the methodology used “comply with the principles of transparency, impartiality, inclusiveness, proportionality, non-discrimination, comparability and verifiability.” The overall objective is to promote neutral cross-media measurement by JICs, instead of non-transparent proprietary measurements within “walled-off” data silos, in particular the ecosystems operated by the designated DMA gatekeepers Google, Meta, Amazon and Apple.

        The Digital Omnibus would do the very opposite: weaken independent cross-media measurement, and strengthen biased cross-company measurement within the ecosystems operated by digital gatekeepers.

        b) Draft Digital Omnibus weakens independent audience measurement  

According to the proposed Article 88a(3)(c), the creation of “aggregated information” for audience measurement would be exempt from consent only “where this is carried out by the controller solely for its own use”, i.e. for internal purposes. All other audience measurement – in particular across websites, devices, or media, as typically conducted by JICs – would continue to require consent.

        This creates a structural asymmetry distorting competition. Proprietary, opaque and unverified measurement systems would benefit from a consent exception, even where data is collected and analysed across multiple devices and services within a single ecosystem, such as that of a digital gatekeeper. By contrast, a JIC applying transparent, impartial and verifiable standards agreed across the industry would require consent even if measuring audiences across only two small publishers.

        The economic implications are significant. The ability to provide reliable audience measurement is a central competitive factor in advertising markets. Advertisers allocate budgets to platforms that enable effective targeting and performance measurement. Under the Digital Omnibus, large “walled-garden” platforms could offer seamless (though biased) cross-service measurement across their ecosystems without consent, “solely for their own use”, including the over-attribution of their ad performance. At the same time, small and medium-sized publishers would struggle to demonstrate their net reach and advertising effectiveness, as they could not even rely on JICs to provide de-duplicated cross-site measurement without obtaining consent.

If SME publishers are confined to isolated, purely internal aggregated data while large platforms retain integrated ecosystem-wide insights, advertising budgets will predictably shift toward those platforms. The likely result is further market concentration in favour of Google, Meta and Amazon, contrary to the Commission’s stated objective of strengthening competitiveness. This becomes even more likely when considering the realities of audience measurement: SMEs and start-ups often externalise measurement tasks to specialist providers. If the current proposal becomes law, they would, in effect, be prevented from providing audience measurement to their advertising customers, as internalising the processes would often prove too expensive.

        By limiting the consent exception to measurement for the controller’s own use, the proposal strengthens proprietary systems within “walled-gardens” while weakening independent, cross-site and cross-media measurement on which the Open Web and the traditional media owners depend. It facilitates (self-serving) intra-platform measurement but restricts data sharing necessary for neutral, industry-wide standards. If walled gardens can measure audiences more granularly and effectively than Open Web publishers, they will exploit this advantage commercially – in effect “grading their own homework”.[20] The proposal would thus entrench proprietary systems, weaken JICs, and undermine the objective of Article 24 EMFA, while eroding trust and diminishing the ability to assess media pluralism independently.[21]

A more proportionate solution would allow limited, de-duplicated cross-site measurement under strict safeguards. Rather than a binary approach of permission or prohibition, the framework should enable privacy-friendly access to data where appropriate risk mitigation measures are in place. Moreover, a risk-based approach would focus on the nature of the data processing and not the number of actors involved.[22] Without such calibration, European publishers lacking integrated ecosystems will face a structural disadvantage.

        One proposed alternative is to empower the Commission to initiate and supervise a standardisation process defining binding technical and organisational standards for audience measurement.[23] Developed in close cooperation between competition and data protection authorities, such standards could reduce compliance complexity and mitigate the risks of gatekeeper-controlled metrics. Impartial and objective standards, as envisaged by Article 24 EMFA, would help restore a level playing field.

        However, in Member States where industries have established JICs, these bodies are well placed to develop and implement standards consistent with Article 24 EMFA. Rather than creating new standards centrally, a more suitable approach would be to require digital gatekeepers such as Google and Meta to subject their services to the audience measurement standards agreed within JIC frameworks, and to ensure that they share the relevant audience data in a verifiable manner with them.[24]

III. Exception to the consent requirement for contextual ads

        During the call for evidence on the Digital Omnibus, several stakeholders proposed introducing an exception to the consent requirement for contextual advertising.[25] Although the Commission did not adopt this proposal, similar calls were echoed by the EDPB and the EDPS. They suggest including an additional exemption in Article 88a(3) GDPR to encourage less intrusive online advertising formats that do not retain data or link it to an individual’s past or future activity.[26]

On balance, the authorities’ intention to introduce an additional exception for advertising on publishers’ websites is welcome. However, limiting exemptions exclusively to contextual advertising risks distorting media incentives and overlooks the inherent economic limitations of contextual advertising compared to behavioural advertising.

        If only contextual advertising benefits from regulatory facilitation, publishers may be incentivised to prioritise “brand-safe” or commercially attractive content, while underinvesting in news coverage, investigative journalism, niche reporting, or foreign affairs correspondence – types of content that are less suitable for contextual campaigns.

        Ultimately, the issue is one of proportionality. Behavioural advertising may present higher risks than contextual advertising, but those risks vary depending on factors such as the categories of data used, retention periods, data combination practices, and scale. A differentiated, risk-based regime that permits privacy-enhanced behavioural advertising under strict safeguards would better reconcile fundamental rights protection with media sustainability than a categorical preference for contextual advertising.

        D) Consent management 

        1. Proposed changes

        The Commission’s proposal maintains the existing requirements for a valid user consent, reiterating that refusing consent must be as easy as granting it. 

        In addition, the Digital Omnibus proposal introduces a new Article 88a (4) GDPR, setting out further requirements where data is already stored based on consent: 

a) the data subject shall be able to refuse requests for consent in an easy and intelligible manner with a single-click button or equivalent means;
b) if the data subject gives consent, the controller shall not make a new request for consent for the same purpose for the period during which the controller can lawfully rely on the consent of the data subject;
c) if the data subject declines a request for consent, the controller shall not make a new request for consent for the same purpose for a period of at least six months.

In practice, point a) obliges controllers to provide a “reject all” button at the first level of the consent management system. How the prohibition pursuant to point b), against making a new request while the controller may still rely on previously granted consent, would work in practice is unclear. A company relying on existing consent may change or add features of its offerings, or add further business partners, which would require new consent requests (which the proposal bans). The proposal in point c) could mean that, following a refusal, a publisher would have to wait at least six months before requesting consent again, even if it has significantly modified or expanded its services in the meantime.

        The EDPB and EDPS support Article 88a (4) but recommend clarifying the duration during which a controller may lawfully rely on consent (under point b). They also note that a refusal of consent would need to be recorded, which requires accessing and storing information on the user’s terminal equipment. To avoid legal uncertainty, they therefore urge the Commission to introduce an explicit exception to the consent requirement for the limited purpose of recording a refusal. Such recording should rely only on generic information, such as a flag or code, rather than a unique identifier, in order to safeguard user privacy.

2. Assessment

        a) The consent ban paradox 

The proposed Article 88a(4) appears conceptually flawed. Reducing “cookie fatigue” is an understandable objective. However, the purpose of privacy and data protection laws is not to spare data subjects from being prompted to make an informed decision. Yet this is all that the proposed changes would achieve. They prohibit controllers from requesting consent and oblige them instead to allow a “reject all” approach. This serves nobody’s purpose, apart from those few not dependent on consent to target audiences.

It would be contradictory for the legislator to aim at a high level of data privacy protection through a consent requirement while at the same time making it more difficult to request consent. Either consent as a protection mechanism is improved, or the Commission should consider alternatives to consent requirements, such as relying on legitimate interest under Article 6(1)(f) GDPR in combination with transparency requirements and opt-out possibilities. One cannot demand informed consent while prohibiting controllers from requesting it.

b) Single-click buttons (point a)

The obligation to introduce a “single-click button” to make refusing consent easy (point a) serves no purpose.

A single-click refusal button decreases user autonomy rather than promoting it. Enabling convenient decisions does not necessarily mean enabling genuine user choice. It might be more convenient to stick to default settings, make a single decision for all services[27] or reject every data usage. However, behavioural economics and antitrust enforcement have taught us that such “simple” user decisions are based on (often artificially created and exploited) biases and are, in fact, the opposite of an informed decision-making process and genuine choice. The funding of digital services is too complex for a “single-click”, “all or nothing” approach; the law should not assume otherwise.

A “reject all” button, without any means to differentiate, can quickly give the impression that, despite refusing any data processing, the user may still receive the full set of services. Yet this is commercially unfeasible. In economic reality, decisions regarding the use of a service in exchange for data processing entail complex trade-offs on the users’ part. They have to weigh up, on a provider-by-provider basis, whom to entrust with their data in exchange for free content or content at a reduced subscription price. Incentivising users to cut these complex decisions short with a “single click”, in effect, reduces genuine user choice, rather than protecting privacy or consumer choice.

Just take the example of “pay-or-consent” models used today by many publishers, especially press publishers, to fund their offerings. Under these models, users can either consent to data processing for interest-based advertising (“consent”) or pay a fee to access the content (“pay”). A mandatory “reject all” button, however, would not allow such a binary “either consent or pay” choice, effectively amounting to a ban on this important business model. If users can reject all processing without having to choose between consent and payment, the model no longer works.

Advertising budgets for traditional media are dwindling. Online advertising needs to compensate for this decline, and the European legal framework needs to enable that shift. The Digital Omnibus would do the opposite: make it even more difficult to fund content. Business models vital for preserving a secure financing method for media providers, and by extension media plurality, must not be prohibited.

        The sole beneficiaries would be actors who already have significant market power and omnipresence. Operators of core platform services, on which end users depend, have sufficient leverage to extract any data they need from their users. 

c) Prohibition of consent requests (points b) and c))

For similar reasons, a prohibition on making a new request for consent for the same purpose (point b), or on making any request for a rigid six months following a rejection (point c), is not compelling.

        User preferences can change quickly, especially if the data processing purposes and partners, common reasons for denying consent, are modified. A lengthy lock-in period could unnecessarily restrict publishers’ ability to re-engage with users, particularly in rapidly evolving service environments. A shorter, proportionate timeframe would better reflect a risk-based approach while preserving flexibility and still preventing abusive re-prompting.

        The EDPB/EDPS suggestion to define more clearly the period during which a controller may rely on consent is welcome, as it would reduce legal uncertainty and administrative burdens. Likewise, explicitly permitting limited data access and storage for the sole purpose of recording a refusal of consent would avoid further ambiguity. This recommendation should therefore be adopted.

        E. Centralised consent management at browser level 

        I. Proposed changes

        The Commission proposes adding a new Art. 88b GDPR, which states in its paragraph (1) that “controllers shall ensure that their online interfaces allow data subjects to: 

        • give consent through automated and machine-readable means, provided that the conditions for consent laid down in this Regulation are fulfilled;
        • decline a request for consent and exercise the right to object [to data processing for direct marketing purposes] pursuant to Article 21(2) through automated and machine-readable means.”

        The Commission shall request a European standardisation organisation to draft standards for the interpretation of machine-readable indications of data subjects’ choices. In addition, providers of web browsers that are not SMEs shall provide the technical means to allow data subjects to exercise the above rights, within a period of 48 months following the date of entry into force of the Digital Omnibus (Article 88b(6) and (7)).

        Crucially, all data “controllers shall respect the choices made by data subjects” in the above manner, i.e. through automated and machine-readable means. 

        In practice, this means that (non-SME) browsers will invite users to set their preferences once, the browser will then communicate those preferences to every site they visit, and the controllers of those websites would be legally obliged to respect them. As a result, any rejection of consent to data processing for advertising purposes, and any objection to data processing for direct marketing purposes pursuant to Article 21(2) GDPR, expressed through the interface of a browser will automatically be binding on any website that the user visits via this browser. In contrast, a consent that the user grants via the browser can only be relied upon by a visited website if it fulfils the strict requirements for a valid consent under the GDPR.
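        This asymmetry (a browser-level rejection binds every controller, whereas a browser-level consent rarely satisfies the GDPR’s validity requirements) can be sketched in a few lines of Python. The sketch is purely illustrative: it borrows the “Sec-GPC” request header from the existing Global Privacy Control proposal as a stand-in for whatever machine-readable standard a European standardisation organisation would eventually define under Article 88b; the function name and decision logic are our assumptions, not part of the proposal.

```python
def ad_processing_permitted(headers: dict, has_valid_gdpr_consent: bool) -> bool:
    """Illustrative controller-side check of a browser-transmitted signal.

    Hypothetical sketch only: uses the GPC-style "Sec-GPC" header as a
    placeholder for the future machine-readable standard; the real
    Art. 88b mechanism may differ.
    """
    # A universal machine-readable objection would bind the controller,
    # regardless of any previously stored site-level consent.
    if headers.get("Sec-GPC") == "1":
        return False
    # A browser-level "accept" alone would likely not constitute specific,
    # informed consent under Art. 4(11) GDPR, so the controller still
    # needs its own valid consent record for this concrete purpose.
    return has_valid_gdpr_consent


# Usage: the objection overrides consent; consent alone is still required.
print(ad_processing_permitted({"Sec-GPC": "1"}, True))   # False
print(ad_processing_permitted({}, True))                 # True
print(ad_processing_permitted({}, False))                # False
```

        The sketch makes the one-way nature of the mechanism visible: the signal can only switch processing off, never on, which is precisely why publishers fear it would depress consent rates without reducing cookie banners.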

        To alleviate the negative effects that such a change can be expected to have on the revenue of ad-funded web publishers, the Commission proposes that the above provisions “shall not apply to controllers that are media service providers when providing a media service” (Article 88b(3) GDPR new).

        The EDPB and EDPS strongly support this proposal but recommend deleting the media exception, arguing that any exception would diminish the proposal’s benefits for controller compliance, for making user choices effective in practice and for combating cookie fatigue.

        II. Assessment 

        In theory, centralised consent systems, such as browser- or OS-level signals, can simplify user interaction. Instead of having to express preferences on a site-by-site basis, users could set consent or objections once for all services. 

        a) Universal browser-level decisions are neither specific nor informed

        A universal browser-level toggle would apply before users interact with a website or receive information about who collects their data and why. Consent would thus become a binary technical setting, detached from context. This contradicts Article 4(11) GDPR, which requires consent to be “specific”, “informed” and “unambiguous”. Even if granted at the browser level, such consent would likely be invalid for individual websites, which would still need to request proper consent under the GDPR. Consequently, cookie banners would not be reduced, unless users universally reject all processing. In that case, privacy may be preserved, but effective data use would be blocked, reducing user choice and the utility of online services. 

        Detaching consent from the user-service relationship also prevents publishers from explaining the value of user data for the service. Investments in privacy-enhancing practices or transparent communication would not influence user decisions. Publishers with positive relationships with users could not benefit from that trust. High switching costs mean users are unlikely to adjust their universal consent settings for different websites. As a result, even one or two untrusted sites could cause widespread consent denial, harming the ability of trustworthy publishers to monetise content, a critical revenue source, potentially leading to market exits and reduced media pluralism.[28] Considering that users are more likely to consent to trusted sites, preventing these publishers from receiving consent is disproportionate to the privacy risk.[29]

        b) Universal browser-level user control further strengthens gatekeepers 

        Privacy controls at the browser or OS level also create significant competition risks. The largest browser, Chrome, is operated by Google. Nearly all other browsers, except those operated by SMEs, are commercially linked to Google through revenue-share agreements for the pre-installation of Google Search (e.g. Safari, Mozilla Firefox, Opera).[30] Under these agreements, a browser earns more if advertising spend on Google Search increases, creating a strong joint commercial interest in making non-search-based advertising as unattractive as possible in terms of targeting and measurement capabilities. 

        Google and its partner browsers would therefore have both the incentive and the ability to use centralised consent management tools as a means to restrict independent actors’ access to data, reinforcing Google’s dominance in advertising markets. The fox should not be tasked with guarding the henhouse. If enacted in its current form, the proposal would require strict and comprehensive competition safeguards to prevent Google and its browser partners from deploying technical tools that further shield search-based advertising from competition by data-based alternative advertising models.

        Article 88b GDPR should therefore be removed altogether. If this finds no majority, at the very least the exception for media service providers must remain and be widened. Without it, independent publishers that rely on advertising revenue would be disproportionately affected. Unlike subscription platforms or integrated ecosystems, many publishers cannot effectively monetise non-consenting users. A uniform technical mechanism that prevents a meaningful consent dialogue would likely shift further advertising budgets toward dominant platforms operating outside browser-level constraints.

        Overall, the proposal reflects a one-size-fits-all approach that does not differentiate between online publishers. This risks further entrenching market power in the hands of a few dominant, non-EU companies. If enacted, retaining the media exception is indispensable to prevent unintended harm to competition. 

        F. Conclusion on the interplay of privacy and competition policy 

        The Digital Omnibus presents both an opportunity and a risk for ad-funded media. Simplification and clarification are legitimate objectives. However, reforms affecting data-driven advertising must avoid a one-size-fits-all approach that ignores differences between controllers and thereby further weakens competition. Rigid prohibitions without a proportional assessment, as well as measures that unintentionally strengthen dominant platform ecosystems at the expense of European publishers, should be removed.

        Most problematically, the advertising-related provisions of the Digital Omnibus appear to assume that first-party data processing, even when carried out by the largest companies, is inherently low-impact and low-risk. At the same time, any third-party data sharing, even by the smallest players and for mere measurement purposes in the public interest, is treated as high-impact and high-risk. Under this logic, gatekeeper-internal data processing, such as for proprietary audience measurement, would no longer require consent. Yet the same type of processing across small websites, even if carried out by an independent joint industry committee (JIC) based on industry standards, would still require consent. 

        This one-sided facilitation of first-party processing overlooks where the greatest privacy risks actually lie. Sharing data across independent operators is not inherently more risky to privacy than amassing data within a digital gatekeeper. In fact, the most significant risks stem from micro-profiling by large digital platforms, not from cooperation among small publishers trying to reduce their data disadvantages vis-à-vis gatekeepers. Moreover, this policy ignores its negative effects on competition. Competitiveness is not strengthened if designated gatekeepers benefit disproportionately more from consent exceptions for advertising services than their small and medium-sized competitors. In this respect, it is difficult to avoid the impression that the European institutions have fallen victim to Big Tech lobbying.

        Data protection and competitiveness are not opposing goals. A carefully calibrated, risk-based framework can protect fundamental rights while preserving the economic foundations of independent media. If the Digital Omnibus fails to strike this balance, it may achieve legal simplification at the cost of competition and market diversity. The result would be weaker European competitiveness and reduced media plurality. Stakeholders should voice their views on the proposal to help prevent these negative effects before it is too late. Europe does not need another piece of legislation that disproportionately benefits the largest digital companies at the expense of smaller rivals. Instead, the legal framework needs to be improved to allow for more competition in online advertising, for it to fund pluralistic media. Against this background, it is to be hoped that the Digital Omnibus will not be driven solely by detached privacy or simplification objectives, but that competition lawyers will also have a hand on the wheel.


        * Thomas Höppner is a partner at Geradin Partners. Yannick Werle is a research assistant at Geradin Partners.

        [1] Proposal for a Regulation amending Regulations (EU) 2016/679, (EU) 2018/1724, (EU) 2018/1725, (EU) 2023/2854 and Directives 2002/58/EC, (EU) 2022/2555 and (EU) 2022/2557 as regards the simplification of the digital legislative framework, and repealing Regulations (EU) 2018/1807, (EU) 2019/1150, (EU) 2022/868, and Directive (EU) 2019/1024 (Digital Omnibus), COM/2025/837.

        [2] EDPB-EDPS JOINT OPINION 2/2026.

        [3] Council of the European Union, 2025/0360 (COD), Brussels, 20 February 2026, https://www.euractiv.com/content/uploads/sites/2/2026/02/EURACTIV_OMNIBUS-1.pdf

        [4] Mario Draghi, The future of European competitiveness, 2024. 

        [5] Enrico Letta, Much more than a Market, 2024.

        [6] While the P2B Regulation focuses on fairness and transparency in P2B business relationships, the DSA focuses on systemic risk, illegal content, user protection, and platform accountability. Several core provisions of the P2B Regulation, including Articles 4, 5, 9 and 11-12, have no equivalent in the DSA. In particular, there is no counterpart to the crucial provisions on the restriction, suspension and termination of online intermediation services. If the P2B Regulation is repealed, this will leave business users unprotected. 

        [7] Proposal for a Regulation amending Regulation (EU) 2024/1689 and Regulation (EU) 2018/1139 as regards the simplification of the implementation of harmonised rules on artificial intelligence (Digital Omnibus on AI), COM/2025/836.

        [8] Digital Omnibus Proposal, Explanatory Memorandum, (n 1), para. 13. 

        [9] CJEU, judgment of 4 September 2025, Case C-413/23 P, ECLI:EU:C:2025:645.

        [10] EDPB-EDPS JOINT OPINION 2/2026, para. 96.

        [11] Proposed Article 88c.

        [12] See AGCM, decision of 16 December 2025, Apple ATT, para. 415. 

        [13] See for instance Johnny Ryan, Why GDPR is Kryptonite to Google & Facebook on Anti-Trust, 10 October 2018, Brave Blog Posts expressing hopes the GDPR would neutralise Google’s data advantage; similarly, Ryan, Failure to enforce the GDPR enables Google’s monopoly, 18 February 2020, Brave Blog Posts.  

        [14] Geradin/Katsifis/Karanikioti, GDPR Myopia: how a well-intended regulation ended up favouring large online platforms – the case of ad tech, European Competition Journal, 2020, pp. 47-92; Gal/Aviv, The Competitive Effects of the GDPR, J of Comp L & Economics, 2020, pp. 349-391; Johnson/Shriver/Goldberg, Privacy & Market Concentration: Intended and Unintended Consequences of the GDPR, Management Science, 2023; Schmitt/Miller/Skiera, The Impact of Privacy Laws on Online User Behavior, Cornell University Working Paper, 2021.

        [15] See Höppner/Westerhoff, The Role of Data for Competition in Digital Advertising, CPI Antitrust Chronicle, 2023 2(2).  

        [16] Article 2(16) EMFA. 

        [17] European Commission, Impact Assessment Report, Accompanying the Proposal for a European Media Freedom Act, COM(2022) 457, 16.9.2022, Part 1/3, p. 21. 

        [18] AGCOM, Sector Inquiry on Media Audience Measurement Systems (2017), at 17. 

        [19] See in detail Höppner/Westerhoff, Weaponized Opacity: Self-Preferencing in Digital Audience Measurement, CPI Antitrust Chronicle, 2025.

        [20] See Höppner/Westerhoff, Weaponized Opacity: Self-Preferencing in Digital Audience Measurement, CPI Antitrust Chronicle, 2025.

        [21] See AGF Videoforschung, Pressemitteilung vom 25.02.2026 “EU-Digitalgesetzgebung“.

        [22] The EDPB and EDPS also agree that the collection of data by a third party on behalf of the site provider should be lawful under this exception, see EDPB-EDPS JOINT OPINION 2/2026, para. 102.

        [23] Check My Ads Institute Comments on the Digital Omnibus Proposal, p. 22.

        [24] See Höppner/Westerhoff, Weaponized Opacity: Self-Preferencing in Digital Audience Measurement, CPI Antitrust Chronicle, 2025. 

        [25] Advertising Information Group, Response to the European Commission’s Call for Evidence, p. 5-6.

        [26] This proposition is supported by the “Check My Ads Institute”, ibid., p. 22.

        [27] See below at E) on the risks of centralised-consent management at browser level.

        [28] iab Europe also expects lower consent rates, iab Europe’s Position on the Draft Digital Omnibus and Digital Acquis, p. 7.

        [29] Kantar Media/iab Europe, Optimisation over reform, April 2025, p. 28-29.

        [30] See General Court, judgment of 14 September 2022, Case T-604/18, Google Android; European Commission, Case AT.4009, 18 July 2018, Google Android.  
