
Introduction
The Digital Services Act: A Paradigm Shift in Platform Regulation
The European Union’s Regulation (EU) 2022/2065, the Digital Services Act (DSA), represents a landmark shift in the governance of the digital sphere. Fully applicable since 17 February 2024, the DSA establishes a harmonised legal framework intended to create a safer, more predictable, and trusted online environment across the single market.
The DSA introduces a co-regulatory model that imposes a tiered set of due diligence obligations on providers of intermediary services, including hosting services, online marketplaces, and social media platforms. These obligations range from basic transparency requirements to complex risk assessment and mitigation duties for the largest market players, the Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).[1]
The DSA’s core objectives are to combat the spread of illegal content, protect fundamental rights such as freedom of expression, enhance the safety of minors online, and increase transparency in areas like content moderation and online advertising.
The Pivotal Role of Digital Services Coordinators
A key innovation of the DSA is its two-tiered enforcement system. Under this system, VLOPs and VLOSEs are regulated mainly by the European Commission (EC), which has exclusive competence to enforce Section 5 of Chapter III of the DSA (essentially, the obligations relating to systemic risks). The national Digital Services Coordinators (DSCs) are the primary point of contact for users and the enforcers of the DSA’s obligations on smaller platforms (as well as of the remaining obligations on VLOPs and VLOSEs).
Both levels are coordinated at the European level through the European Board for Digital Services (EBDS), an independent advisory group composed of representatives of the DSCs and chaired by the EC. This structure is designed to ensure consistent application of the rules across the single market, facilitate the exchange of expertise, and support the EC in its direct supervision of VLOPs and VLOSEs. The effectiveness of the entire framework therefore rests heavily on the capacity and operational readiness of these national coordinators.
Overview of the First Annual Activity Reports
Article 55 of the DSA requires each DSC to publish an annual report on its activities. This analysis provides a consolidated review of the available 2024 reports, published for the most part this summer,[2] offering a first picture of the DSA’s translation from legal text to practical reality.
It covers reports from 24 of the 27 EU Member States.[3] These reports offer the first official, data-driven glimpse into the functioning of the DSA’s enforcement machinery on the ground.
Key Preliminary Findings
Based on the reports, the following key takeaways can be drawn about the current state of DSA enforcement:
- First, at the end of 2024, implementation was uneven across the EU, with some Member States having established legal and institutional frameworks, while others lagged significantly behind.
- Second, the user complaint mechanism under Article 53 emerged as the most active component of the new framework, with a substantial portion of complaints directed at VLOPs.
- Third, Ireland emerges as the central hub for handling cross-border complaints, creating a structural “funnel” with potentially significant implications for regulatory capacity.
- Finally, other key enforcement mechanisms, such as the certification of out-of-court dispute settlement bodies and the provision of data access to vetted researchers, remain nascent or entirely non-operational, highlighting that the DSA’s complete toolkit is not yet in use.
The National Enforcement Landscape: A Patchwork of Progress
The full application of the DSA on 17 February 2024 was intended to be a harmonised starting point for a new era of digital regulation across the Union.[4] However, the first annual reports from the DSCs reveal that this was less a uniform launch and more a staggered start. The result is a patchwork of progress, where some DSCs were fully operational from day one, while others remained functionally paralysed for much, if not all, of the reporting period.
Chart 1 provides an overview of the status of DSA implementation in the 24 Member States in question at the end of 2024. It also shows the share of Member States that have designated one authority as the DSC and the sole competent authority under the DSA versus those that have divided the enforcement responsibility among several authorities in addition to the DSC.[5]
Chart 1: Status of DSC Designation and National Implementing Legislation in 2024


The review of annual reports also reveals considerable differences in the implementation of the national regulatory frameworks. Member States such as Austria, Denmark, Finland, and Italy had their frameworks in place by the February deadline, allowing their DSCs to commence their duties immediately. Germany followed shortly after with its comprehensive Digital Services Act (Digitale-Dienste-Gesetz, DDG) in May 2024.[6] In contrast, other Member States under review faced significant delays, which had consequences for the application of the DSA within their borders.
The reports from Bulgaria, the Czech Republic, the Netherlands, Luxembourg, and Spain underline some of the difficulties faced by DSCs due to the lack of a regulatory framework. The Bulgarian Communications Regulation Commission (Комисия за регулиране на съобщенията, CRC) explicitly states that it was “prevented from operating and performing the coordinator’s functions under the Digital Services Act in full due to the lack of a law that would impose specific obligations”.[7] Similarly, the Czech Telecommunications Office (Český telekomunikační úřad, CTU) notes that although it was designated as the DSC, “the full authorisation requires adoption of a national law […] As a result, CTU does not currently have the competence to fulfil all the duties of the Digital Services Coordinator”.[8] The Dutch Authority for Consumers and Markets (Autoriteit Consument en Markt, ACM) operated under a temporary scheme allowing only for complaint handling,[9] while the Spanish National Markets and Competition Commission (Comisión Nacional de los Mercados y la Competencia, CNMC) reports that it has “not yet been fully empowered, either in terms of regulations or resources, to take on these new functions”.[10]
This lack of empowerment meant these bodies could not certify trusted flaggers or out-of-court dispute settlement bodies, conduct formal investigations, impose sanctions, or even formally receive judicial orders under Articles 9 and 10. Thus, for much of 2024, the rights and obligations established by the DSA for providers and users in these jurisdictions were rendered temporarily unenforceable at the national level.
These delays led the EC, in April and July 2024, to initiate infringement proceedings against several Member States, including Belgium, Croatia, Luxembourg, the Netherlands, and Sweden, for failing to designate or empower their DSCs by the deadline.[11] Infringement proceedings are used by the EC to force Member States to comply with EU law. By June 2025, the EC had decided to refer Czechia, Spain, Cyprus, Poland, and Portugal to the Court of Justice for these persistent failures.[12]
The Voice of the User: Complaints under Article 53
In its first year, the most tangible and widely used feature of the DSA’s enforcement framework has been the user complaint mechanism established by Article 53. This provision grants recipients of intermediary services the right to lodge a complaint with the DSC in their Member State, alleging an infringement of the Regulation. The data from the first annual reports indicates a significant uptake of this right and suggests that Ireland will become a key Member State in DSA enforcement.
Chart 2 aggregates the complaint statistics from the 24 reporting DSCs, illustrating the volume and flow of user grievances across this sample of the Union.
Chart 2: Overview of Complaints Received and Processed by DSCs in 2024

The data reveals a disparity in complaint volumes. Germany’s DSC received the highest number of complaints (824), followed by Ireland (322) and the Netherlands (256). This may simply reflect population size, but could also point to higher public awareness, more user-friendly complaint portals, or a combination of factors. At the other end of the spectrum, Bulgaria’s report of zero complaints is likely a direct consequence of its failure to properly implement the DSA, and Croatia’s low number likely reflects its similar lack of full empowerment in 2024.[13] The Romanian and French DSCs did not report the number of complaints they received.[14]
Cross-border complaints: the ‘Irish Funnel’
The DSA operates on the “country-of-origin” principle, meaning that enforcement authority ultimately lies with the DSC in the Member State where the service provider has its main establishment. As many of the world’s largest technology companies, including nearly all designated VLOPs, have established their European headquarters in Ireland, this principle has created a powerful “funnel” effect, directing a large volume of complaints from across the EU to a single national regulator: Ireland’s Media Commission (Coimisiún na Meán, CnM).
The reports from other Member States illustrate this phenomenon, as shown in Chart 3.
Chart 3: Complaints Forwarded to Ireland by Selected Member States in 2024

This data shows that even in the first year of the DSA, a significant number of complaints against the world’s most significant online platforms are funnelled to the CnM from other DSCs. This raises potential questions about the long-term ability of the CnM to handle flows of complaints against the VLOPs (we have witnessed similar issues in relation to the GDPR). However, unlike under the GDPR, the competence to investigate complaints is shared between the CnM and the EC, the latter having sole competence to enforce Section 5 of Chapter III, which arguably concerns the most significant risks to European society (systemic risks).
Common Grievances and Enforcement Outcomes
The reports that provide substantive detail on the nature of complaints reveal a consistent set of user frustrations. The Austrian, Finnish, Greek, Italian, Swedish, and Portuguese reports identify three recurring themes:
- Inadequate Statements of Reasons (Art. 17): Users frequently complained that platforms suspended their accounts or removed their content without providing the clear and specific reasoning required by the DSA.
- Ineffective Internal Complaint-Handling Systems (Art. 20): Many complaints concerned the perceived shortcomings of platforms’ own internal appeal mechanisms, citing difficulties in lodging complaints or receiving a timely and reasoned decision.
- Lack of Contact Points (Art. 12): Users reported difficulties in communicating directly with service providers due to the absence of easily accessible and user-friendly points of contact.
The reports from Belgium, Austria, Finland, Italy, Portugal, and Sweden all note that complaints predominantly target VLOPs, such as Facebook, Instagram, and TikTok.
Despite the volume of complaints, enforcement outcomes in 2024 were modest.
Only a few DSCs reported engaging in any investigative activity during 2024. The Italian Communications Authority (Autorità per le Garanzie nelle Comunicazioni, AGCOM) took investigative steps in one case but did not find grounds to proceed with a formal investigation. Germany’s Bundesnetzagentur reported initiating four administrative proceedings against service providers, two of which were still ongoing at the time the annual report was published.
The actions taken by DSCs were generally limited to requesting information from providers and, in some cases, achieving voluntary compliance. For instance, the Hellenic Telecommunications and Post Commission (Εθνική Επιτροπή Τηλεπικοινωνιών και Ταχυδρομείων, EETT) sent letters to providers established in Greece, two of which subsequently brought themselves into compliance with the DSA’s provisions.[15] This reflects the nascent stage of enforcement, in which DSCs are still establishing their procedures and focusing on initial compliance rather than punitive measures.
Complaints from public authorities, on the other hand, led to some enforcement action by DSCs. The Belgian Institute for Postal Services and Telecommunications (Belgisch Instituut voor postdiensten en telecommunicatie, BIPT) received a request from the Estonian DSC under Article 9 DSA to take action against Telegram for broadcasting prohibited Russian channels in violation of EU sanctions law. The BIPT contacted Telegram, which subsequently removed the channels in question. This highlights the importance of complaints from other authorities in ensuring compliance.
Overall, no infringement decision was adopted by any DSC, and no financial penalty resulted from a user complaint during the reporting period.
The Broader Enforcement Toolkit: An Early Assessment
Beyond the user complaint mechanism, the DSA provides DSCs and other stakeholders with a range of tools to foster a safer and more transparent online environment. These include the notification of judicial and administrative orders, out-of-court dispute settlement bodies, trusted flaggers, and data access for vetted researchers. The first annual reports show that the adoption of these mechanisms has been slow and uneven, with some remaining entirely dormant in 2024.
Judicial and Administrative Orders (Arts. 9 & 10): A Significant Transparency Gap
Articles 9 and 10 of the DSA aim to harmonise and bring transparency to the process by which national authorities and courts order intermediary services to remove illegal content or provide user information. These articles require the issuing authorities to transmit such orders to their national DSC, which then shares them with all other DSCs via a central system. The goal is to create a pan-European overview of such enforcement actions.
However, the data from the 2024 reports suggests this system is not yet functioning as intended. The figures reported are extremely sparse and inconsistent. The vast majority of DSCs, including those in Austria, Bulgaria, Croatia, the Czech Republic, Greece, Hungary, Ireland, Italy, Luxembourg, the Netherlands, Portugal, Slovenia, and Sweden, reported receiving zero or nearly zero orders. In contrast, Latvia reported receiving 288 orders under Article 9, Lithuania received 51, Germany received 53, Denmark and Romania both received four, Spain received two, and Finland received one.
This vast discrepancy is unlikely to reflect the true level of judicial and administrative activity across the EU, as it is highly improbable that courts and authorities in most Member States issued almost no relevant orders over the course of a year, while those in Latvia issued hundreds.
The reports themselves point to a more plausible explanation: a failure in the reporting pipeline. The Austrian DSC notes that its “national procedures for forwarding orders […] were still in preparation”.[16] The German report is even more explicit, stating that many orders, particularly in criminal contexts, are not transmitted due to national procedural law exceptions, and that the reported statistics “do not reflect all orders issued in Germany”.[17]
This indicates a significant procedural and, in some cases, legal disconnect between the authorities issuing orders and the DSCs tasked with collecting and disseminating them. Consequently, the numbers reported under Articles 9 and 10 are not reliable indicators of enforcement activity, but rather evidence of a systemic reporting failure.
Out-of-Court Dispute Settlement (Art. 21): A Slow Start
Article 21 of the DSA empowers users to challenge platform moderation decisions before an independent, certified out-of-court dispute settlement (ODS) body, offering a faster and more cost-effective alternative to judicial proceedings.[18] However, the 2024 reports reveal that this redress mechanism was far from widely available by the end of 2024. As shown in Table 1, only six of the 24 reporting Member States had certified an ODS body by the end of 2024. The German DSC received five applications in total (but certified only one), while most other countries, including Belgium, Bulgaria, Croatia, the Czech Republic, Denmark, Finland, Greece, Latvia, Lithuania, Luxembourg, the Netherlands, Portugal, Romania, Slovakia, Slovenia, and Spain, reported no applications and no certifications. This slow start suggests that building a network of independent and expert ODS bodies will be a key challenge for 2025 and beyond.
On this note, the Irish ODS body, Appeals Centre Europe (ACE), has recently published its first transparency report, which shows that its activity is picking up.[19] The ACE reports having received nearly 10,000 disputes concerning decisions by Facebook, Instagram, Threads, TikTok, and YouTube, although only around 3,000 of these fell within the scope of its certification. Of those 3,000 disputes, 76% concerned content that users wished to have removed or restored, while 23% concerned account suspensions.[20] However, the ACE also notes limited public awareness of ODS bodies and calls for better cooperation with platforms, including better informing users that such bodies exist.
Table 1: Certified Out-of-Court Dispute Settlement Bodies in 2024
| Member State | Name of Body | Date Certified |
| --- | --- | --- |
| Austria | RTR’s Media Division | 24 October 2024 |
| Germany | User Rights GmbH | 12 August 2024 |
| Ireland | Appeals Centre Europe (ACE) | 26 September 2024 |
| Italy | ADR Center S.r.l. | 18 December 2024 |
| Malta | ADROIT | 10 July 2024 |
| Hungary | Online Platform Vitarendező Tanács | 29 August 2024 |
Trusted Flaggers (Art. 22): An Emerging Ecosystem
In contrast to the slow uptake of ODS bodies, the trusted flagger mechanism under Article 22 has seen more dynamic development. This provision formalises a system where expert entities—both public and private—can be certified by DSCs to have their notices of illegal content prioritised by online platforms. The 2024 reports show several Member States actively engaging with this co-regulatory tool, certifying a total of 19 organisations with diverse specialisations in 2024.
Table 2: Certified Trusted Flaggers in 2024
| Member State | Name of Flagger | Area of Expertise |
| --- | --- | --- |
| Austria | Schutzverband gegen unlauteren Wettbewerb | Unfair competition, industrial property rights |
| | Rat auf Draht gemeinnützige GmbH | Child protection, mental health of young adults |
| | Österreichisches Institut für angewandte Telekommunikation (ÖIAT) | Personality rights, copyright, internet fraud |
| | LSG – Wahrnehmung von Leistungsschutzrechten GmbH | Copyright and related rights |
| | Kammer für Arbeiter und Angestellte für Wien | Consumer protection, data protection, privacy |
| Denmark | The Danish Rights Alliance (RettighedsAlliancen) | Intellectual property violations, online piracy |
| Finland | Tekijänoikeuden tiedotus- ja valvontakeskus ry (TTVK) | Copyright |
| | Pelastakaa Lapset ry (Save the Children Finland) | Child protection |
| | Somis Enterprises Oy (Someturva) | Not specified |
| France | e-Enfance (3018) | Illegal speech, child protection |
| Germany | Meldestelle REspect! (Jugendstiftung Baden-Württemberg) | Hate speech |
| Greece | FactReview | Not specified |
| | Foundation for Research and Technology-Hellas (FORTH) | Not specified (operates saferinternet4kids.gr) |
| | Greece Fact Check | Not specified |
| Hungary | Internet Hotline | Child protection, mental health, online harassment |
| Lithuania | CropLife Lithuania | Illegal products |
| Romania | Organizația Salvați Copiii (Save the Children Romania) | Child protection |
| | Institutul Național pentru Studierea Holocaustului din România “Elie Wiesel” (the “Elie Wiesel” National Institute for the Study of the Holocaust in Romania) | Illegal speech |
| Sweden | ECPAT Sweden | Child protection |
Austria, Finland, and Greece led the pack in leveraging civil society and industry expertise. The areas covered – from child safety and consumer protection to intellectual property rights and hate speech – indicate the breadth of issues being addressed through this mechanism.
By the time this analysis is published, the number of appointed trusted flaggers has significantly increased, with over 40 trusted flaggers appointed across the EU.[21]
Vetted Researchers (Art. 40): A Stalled Transparency Tool
One of the most ambitious transparency tools in the DSA is Article 40, which is designed to grant vetted researchers access to data from VLOPs and VLOSEs upon request by a DSC, allowing the researchers to study systemic risks.[22] DSCs are responsible for designating vetted researchers.[23] By the end of 2024, the EC had yet to adopt a delegated act setting out the technical conditions for data access by vetted researchers, and therefore, no researcher was granted this status in 2024. One of the DSA’s cornerstone accountability mechanisms thus remained entirely non-operational throughout the first year of its application.
However, on 2 July 2025, the EC published the delegated act on data access under the DSA.[24] At the same time, the EC also launched the DSA data access portal, where researchers interested in accessing data under the new mechanism can find information and exchange with VLOPs, VLOSEs and DSCs on their data access applications.[25] Consequently, 2025 is expected to mark the adoption of this tool across the EU.
Comparison with EU Enforcement: The EC’s active engagement with VLOPs/VLOSEs
While the national DSCs were focused on establishing their authority and handling the first wave of user complaints, the EC was busy exercising its exclusive competence to supervise the additional, more stringent obligations imposed on VLOPs and VLOSEs. In contrast to national enforcement, the EC opened several investigations covering broad topics. No decision was, however, adopted in 2024.
Summary of Formal Proceedings Initiated by the EC in 2024
The EC moved decisively in 2024, opening formal proceedings against several of the world’s largest digital platforms. These investigations represent the first major tests of the DSA’s most powerful provisions and signal the EC’s intent to rigorously enforce the new rulebook.
Table 3: Formal DSA Proceedings Opened by the EC in 2024
| VLOP/VLOSE | Date Proceedings Opened | Subject Matter |
| --- | --- | --- |
| TikTok | 19 February 2024 | Protection of minors, advertising transparency, data access for researchers, as well as risk management of addictive design and harmful content[26] |
| AliExpress | 14 March 2024 | Risk management of illegal content (e.g., counterfeit medicines, non-compliant products), trader traceability, advertising transparency, recommender systems[27] |
| TikTok (TikTok Lite) | 22 April 2024 | Failure to conduct and provide a risk assessment for the new “Task and Reward Program” before its launch[28] |
| Meta (Facebook & Instagram) | 30 April 2024 | Deceptive advertising, disinformation, transparency of political content, lack of effective real-time election monitoring tools, inadequate notice-and-action mechanism[29] |
| Meta (Facebook & Instagram) | 16 May 2024 | Protection of minors, specifically concerning addictive design (“rabbit-hole” effects), age-assurance systems, and privacy settings[30] |
| Temu | 31 October 2024 | Compliance with obligations to limit the sale of non-compliant/illegal products, addictive design, recommender systems, and data access for researchers[31] |
| TikTok | 17 December 2024 | Failure to properly assess and mitigate systemic risks linked to election integrity, notably in the context of the 2024 Romanian presidential elections[32] |
Thematic Focus of EC Investigations
The EC’s enforcement actions have focused on the systemic risks posed by VLOPs, such as the dissemination of illegal content and election interference. The proceedings initiated in 2024 reveal a clear set of regulatory priorities:
- Integrity of Electoral Processes: The investigation into Meta, similar to a 2023 investigation into X, directly addressed the platform’s capacity to mitigate risks of disinformation and manipulation during elections, a key concern in a year featuring several national elections and the European Parliament elections. Another prominent example is the EC’s investigation into TikTok’s role in Russian interference in the 2024 Romanian presidential elections, which was opened in December 2024.
- Protection of Minors: Multiple proceedings, particularly against TikTok and Meta, scrutinised platform designs and safety measures related to younger users. This included concerns about addictive algorithms, the effectiveness of age verification, and exposure to inappropriate content.
- Addictive Design and ‘Dark Patterns’: The EC has also scrutinised how platform interfaces can manipulate user behaviour. The investigation into TikTok Lite’s reward system and the preliminary findings against X regarding potentially deceptive interfaces are prime examples of this focus.
- Transparency and Accountability: A consistent thread across nearly all investigations is the demand for greater transparency, whether in relation to advertising repositories (X, AliExpress), recommender systems (AliExpress, Temu), or the fundamental obligation to provide researchers with access to data (TikTok, Temu).
Looking Ahead
As the DSA moves into its second year, and based on the findings of this analysis, here is what we can expect for 2025:
- Closing the Implementation Gap: 2025 should hopefully see all Member States fully enacting the national legislation required to empower their DSCs. The enforcement vacuums that existed in 2024 undermine the integrity of the single market and must be closed.
- Resourcing the Bottleneck: The structural concentration of complaints in Ireland is a predictable and durable feature of the DSA’s design. Ensuring that CnM is adequately resourced to handle this pan-European workload is not just a national issue but a matter of systemic importance for the entire Union.
- Activating Stalled Mechanisms: The full potential of the DSA cannot be realised while key tools remain on the shelf. The EC’s adoption of the delegated act for vetted researcher data access will help unlock one of the regulation’s most important transparency and accountability functions. Close attention should also be paid to the rising number of Trusted Flaggers and observing how they fulfil their function.
- Improving Reporting Pipelines: The European Board for Digital Services and national authorities must work to resolve the procedural hurdles that are preventing the comprehensive reporting of judicial and administrative orders. A functioning system under Articles 9 and 10 is essential for understanding the full scope of DSA-related enforcement across the EU.
[1] For an up-to-date list of the intermediary services designated as VLOPs/VLOSEs, see: https://digital-strategy.ec.europa.eu/en/policies/list-designated-vlops-and-vloses.
[2] 17 of the reports are listed on the European Commission’s website: https://digital-strategy.ec.europa.eu/en/policies/dsa-dscs#1720699867912-1. Additionally, see the reports from France: https://www.arcom.fr/en/find-out-more/studies-and-data/studies-reviews-and-reports/arcom-2024-annual-report; Germany: https://www.dsc.bund.de/DSC/DE/Aktuelles/Downloads/DSC_Bericht2024.pdf?__blob=publicationFile&v=6; Hungary: https://english.nmhh.hu/article/254348/Annual_activity_report_of_the_Hungarian_digital_service_coordinator; Malta: https://www.mca.org.mt/articles/dsc-annual-report-2024; Romania: https://www.ancom.ro/en/uploads/links_files/DSA_annual_report_2024_en.pdf; Slovenia: https://www.akos-rs.si/en/digital-services/explore/news/news/annual-report-on-activities-under-the-digital-services-act-for-2024; and Sweden: https://pts.se/internet-och-telefoni/dsa-forordningen—regler-om-digitala-tjanster-for-en-sakrare-onlinemiljo/verksamhetsrapport-2024-annual-activity-report–pts-och-behoriga-myndigheters-verksamhet-inom-ramen-for-forordningen-om-digitala-tjanster-dsa-pts-er-202515/.
[3] The Cypriot, Estonian, and Polish authorities do not appear to have issued annual reports for 2024.
[4] The DSA began applying to all platforms on 17 February 2024. However, the first designated VLOPs/VLOSEs were required to comply with the regulation already in August 2023.
[5] Under Article 49, Member States can designate multiple authorities to enforce the DSA (known as ‘competent authorities’). Member States must, however, designate one of these competent authorities as a DSC. The DSC is responsible for ensuring DSA enforcement coordination at the national level, while the other competent authorities may be assigned certain tasks or sectors.
[6] https://www.gesetze-im-internet.de/ddg/BJNR0950B0024.html.
[7] DSC annual activity report – Bulgaria, p. 2, https://crc.bg/files/%D0%92%D0%9E%D0%9F/annual_activity_report_DSC-BG_art-55-DSA.pdf.
[8] DSC annual activity report – Czech Republic, p. 4, https://ctu.gov.cz/sites/default/files/obsah/stranky/522428/soubory/annual_activity_reports_dscs_en.pdf.
[9] DSC annual activity report – Netherlands, https://www.acm.nl/system/files/documents/dsa-jaarverslag-2024.pdf.
[10] DSC annual activity report – Spain, p. 2, https://www.cnmc.es/sites/default/files/editor_contenidos/CNMC/Memorias/Memoria_CNMC_Ingles_2024.pdf.
[11] The Commission calls on 6 member states to comply with the EU Digital Services Act, https://digital-strategy.ec.europa.eu/en/news/commission-calls-6-member-states-comply-eu-digital-services-act.
[12] Commission decides to refer Czechia, Spain, Cyprus, Poland and Portugal to the Court of Justice of the European Union due to lack of effective implementation of the Digital Services Act: https://digital-strategy.ec.europa.eu/en/news/commission-decides-refer-czechia-spain-cyprus-poland-and-portugal-court-justice-european-union-due.
[13] DSC annual activity report – Bulgaria, https://crc.bg/files/%D0%92%D0%9E%D0%9F/annual_activity_report_DSC-BG_art-55-DSA.pdf; DSC annual activity report – Croatia, https://www.hakom.hr/UserDocsImages/2025/dokumenti/DSC%20CROATIA%20Annual%20Activity%20Report%20-%20Article%2055%20DSA.pdf?vel=956225.
[14] DSC annual activity report – Romania, ancom.ro/en/uploads/links_files/DSA_annual_report_2024_en.pdf. DSC annual activity report – France, p. 71-73, https://www.arcom.fr/en/find-out-more/studies-and-data/studies-reviews-and-reports/arcom-2024-annual-report. Note that the French DSC report consists of a few pages in the relevant authority’s general annual report and is, overall, very light in detail.
[15] DSC annual activity report – Greece, https://www.eett.gr/wp-content/uploads/2025/06/%CE%95%CF%84%CE%AE%CF%83%CE%B9%CE%B1-%CE%88%CE%BA%CE%B8%CE%B5%CF%83%CE%B7-DSA-2024.pdf.
[16] DSC annual activity report – Austria, p. 10, https://www.rtr.at/medien/aktuelles/publikationen/Publikationen/Publikationen_2025/DSC_Report_2024-E.pdf.
[17] DSC annual activity report – Germany, p. 9, https://www.dsc.bund.de/DSC/DE/Aktuelles/Downloads/DSC_Bericht2024.pdf?__blob=publicationFile&v=4#page=1.00&gsr=0.
[18] Out-of-court dispute settlement bodies under the Digital Services Act (DSA), https://digital-strategy.ec.europa.eu/en/policies/dsa-out-court-dispute-settlement.
[19] Appeals Centre Europe – Transparency reports: https://www.appealscentre.eu/transparency-reports/.
[20] Ibid, p. 13.
[21] Trusted flaggers under the Digital Services Act (DSA) – EC, https://digital-strategy.ec.europa.eu/en/policies/trusted-flaggers-under-dsa.
[22] Article 40(4) DSA.
[23] Article 40(8) DSA.
[24] Delegated act on data access under the Digital Services Act (DSA) – EC, Shaping Europe’s digital future.
[25] DSA Data Access Portal – EC, https://data-access.dsa.ec.europa.eu/home
[26] Commission opens formal proceedings against TikTok under the Digital Services Act – EC, https://ec.europa.eu/commission/presscorner/detail/en/ip_24_926.
[27] Commission opens formal proceedings against AliExpress under the Digital Services Act – EC, https://digital-strategy.ec.europa.eu/en/news/commission-opens-formal-proceedings-against-aliexpress-under-digital-services-act.
[28] Commission opens proceedings against TikTok under the DSA regarding the launch of TikTok Lite in France and Spain – EC, https://ec.europa.eu/commission/presscorner/detail/en/ip_24_2227.
[29] Commission opens formal proceedings against Facebook and Instagram under the Digital Services Act – EC, https://ec.europa.eu/commission/presscorner/detail/en/ip_24_2373.
[30] Commission opens formal proceedings against Meta under the Digital Services Act related to the protection of minors on Facebook and Instagram – EC, https://ec.europa.eu/commission/presscorner/detail/en/ip_24_2664.
[31] Commission opens formal proceedings against Temu under the Digital Services Act – EC, https://digital-strategy.ec.europa.eu/en/news/commission-opens-formal-proceedings-against-temu-under-digital-services-act.
[32] Commission opens formal proceedings against TikTok on election risks under the Digital Services Act – EC, https://ec.europa.eu/commission/presscorner/detail/en/ip_24_6487