EDPB identifies challenges hindering the full implementation of the right to erasure - but does it really?
- Feb 21
- 14 min read
Background
The EDPB's Coordinated Enforcement Framework (CEF) 2025 set out to test how organisations across the EEA handle the right to erasure under Article 17 GDPR. It launched in March 2025, ran through national actions across the year, and culminated in an EDPB report adopted on 10 February 2026. The Board describes it as a coordinated attempt to compare real-world request handling, identify best practices, and spot recurring compliance failures. This was the fourth CEF action, following previous exercises on the use of cloud services by public bodies, data protection officers, and the right of access.
Methodology and its limits
Thirty-two authorities across the EEA participated. Of 7,943 controllers contacted, only 764 responded. The Annex reproduces the national reports of 25 authorities. Germany is represented by a joint report from seven Länder-level and federal authorities. Austria provided figures only and does not appear in the Annex.
The sample is not representative. Each authority applied its own selection criteria – targeting specific sectors, sizes, or risk profiles. Some focused exclusively on the public sector (the EDPS on EU institutions, Austria on public administration). Others targeted a single industry (Denmark: online casinos; Estonia and Greece: retail loyalty programmes). About half took a cross-sectoral approach. Four of the five authorities that contacted the most controllers achieved response rates of 0.6% to 14%. Authorities conducting formal investigations had 100% response rates; mass-survey authorities had under 10%. Averaging findings across these structurally incomparable populations produces figures that mean very little.
The questionnaire was not applied uniformly. Individual authorities omitted questions, changed wording, or added follow-up questions. The survey was mandatory in enforcement contexts and voluntary elsewhere. Translations across twenty languages introduced further variation. Some authorities reviewed supporting documentation; others relied solely on self-report.
Of the 764 responses, 431 came from the public sector and 325 from the private sector. Within the private sector, large enterprises (over 250 employees) represented 54% of respondents, medium enterprises 22%, small enterprises 16%, and micro-enterprises 8%. About 21% of controllers processed data of more than one million individuals. Roughly 42% processed personal data of children, and a similar proportion processed data of vulnerable subjects.
What the data shows
Most controllers had received zero erasure requests in the years surveyed. About 70% received fewer than ten per year. A small cohort received more than 500. Rejection rates were polarised: controllers either rejected almost no requests, or rejected more than 50%. Authorities conducting formal investigations found that organisations systematically refusing requests were generally able to invoke valid Article 17(3) exceptions. Customers and potential customers submitted the most requests. Applicants for public services, citizens dealing with public entities, and employees were least likely to exercise this right.
Almost two thirds of participating authorities assessed the overall compliance of surveyed controllers as "average." No authority reported compliance as "very high" or "very low." Larger organisations tended to do better: they receive more requests, have dedicated teams, and run more structured internal processes. Private-sector controllers were generally better placed than public-sector equivalents. Small and micro-entities, many of which had received no requests during the observation period, showed the lowest procedural maturity.
The seven recurring issues
Issue 1 – No documented internal procedure
Seventeen authorities identified this as a primary concern. The GDPR does not mandate a specific format, but the accountability principle under Articles 5(2) and 24 GDPR requires controllers to demonstrate compliance – which in practice requires documented processes. Without them, requests are handled inconsistently and subjectively. This risk is acute for the right to erasure: the assessment of whether Article 17(1)(a) applies (data no longer necessary for its purpose) is genuinely fact-intensive, and controllers need a shared framework for that assessment.
A specific confusion appeared repeatedly: some controllers treated account closure as equivalent to erasure. Offering a user the ability to delete their account while retaining the underlying personal data in internal systems does not satisfy Article 17. Account closure is a service-level operation. Erasure is the destruction of the data itself. Many controllers also excluded backup data from the scope of erasure requests by default, without legal justification (see Issue 6).
Best practices observed: automated software generating deletion records; KPI monitoring reported to senior management; dedicated cross-departmental teams for request handling.
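To make the distinction concrete – and to illustrate what an automated deletion record might capture – here is a minimal Python sketch. All names (accounts, crm_records, close_account, erase_user) are hypothetical and not taken from the report; it simply contrasts a service-level account closure with destruction of the underlying data plus an accountability record.

```python
from datetime import datetime, timezone

# Hypothetical in-memory "systems" for illustration only.
accounts: dict[str, dict] = {}       # login / service state
crm_records: dict[str, dict] = {}    # underlying personal data
deletion_log: list[dict] = []        # records demonstrating erasure (Art. 5(2), 24)

def close_account(user_id: str) -> None:
    """Service-level operation: the user can no longer log in, but the
    personal data is still held internally. This alone does NOT satisfy
    an Article 17 request."""
    accounts[user_id]["status"] = "closed"

def erase_user(user_id: str, request_ref: str) -> None:
    """Erasure: the personal data itself is destroyed across systems, and a
    deletion record is kept for accountability purposes."""
    accounts.pop(user_id, None)
    crm_records.pop(user_id, None)
    deletion_log.append({
        "request_ref": request_ref,
        "subject_ref": user_id,   # in practice, a hashed or indirect reference
        "erased_at": datetime.now(timezone.utc).isoformat(),
        "systems": ["accounts", "crm_records"],
    })
```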
Issue 2 – Inadequate staff training
Multiple authorities found that training on Article 17 GDPR was absent, too infrequent, or limited to generic annual sessions. Staff lacked knowledge of internal procedures, causing delays in recognising and routing requests, missed deadlines, errors in applying exceptions, and incomplete erasure. Inadequate training correlates directly with higher complaint volumes: data subjects who receive no response or an inadequate response are more likely to contact an authority. Best practices included post-session assessments, simulated erasure requests to test staff, and e-learning supplements.
Issue 3 – Insufficient information to data subjects
Thirteen authorities flagged deficiencies in what controllers tell data subjects about the right to erasure. There is a feedback loop: less information leads to fewer requests, which leads to less investment in transparency. Specific failures included: not mentioning the right to erasure in privacy notices (required by Articles 13(2)(b) and 14(2)(c) GDPR); not explaining which conditions must be met; not specifying how to submit a request; not acknowledging receipt or notifying data subjects of extensions; not providing reasoned refusals; and not informing data subjects of their right to complain to an authority under Article 12(4) GDPR. The forthcoming 2026 CEF on transparency (Articles 12, 13 and 14 GDPR) will examine these obligations in detail.
Issue 4 – Misuse of exceptions
More than a dozen authorities identified problems with the application of Article 17(3) exceptions. Two distinct failure modes: (i) blanket application – treating exceptions as automatically applicable to whole categories of data without an individual assessment; and (ii) failure to apply interim safeguards when erasure is contested or delayed, such as restriction of processing under Article 18 GDPR.
Three specific patterns emerged. On Article 17(3)(a) (freedom of expression), online media and publishers refused to remove personal data by citing freedom of expression without any documented balancing assessment. On Article 17(3)(b) (legal retention obligation), some controllers were unaware of the specific legal obligations applicable to their sector, leading to data being retained beyond what any law actually requires. On Article 17(3)(e) (legal claims), erasure was sometimes refused on grounds of legitimate interest – the wrong exception – without proper balancing. Some controllers could not even provide accurate rejection rate figures.
Issue 5 – Retention periods undefined or wrongly applied
A number of authorities – particularly those dealing with smaller organisations – found that controllers cannot accurately determine appropriate retention periods. The challenge is real: national and EU-level legal obligations impose retention requirements expressed in different formats and triggered by different events (contract, tax, archiving, limitation periods). The report observed controllers applying the longest period applicable to any one processing activity across all activities – a manifestly incorrect approach. Without knowing how long data must be kept, a controller cannot determine when it must be deleted, nor can it assess whether a legal retention obligation under Article 17(3)(b) actually blocks erasure. This issue also affects transparency: Articles 13(2)(a) and 14(2)(a) GDPR require disclosure of retention periods, but the information provided is often absent or vague. Best practices included retention schedules cross-referencing data categories, purposes, and legal obligations, and data deletion matrices linking data type to legal basis and retention period.
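As a rough illustration of such a deletion matrix, the sketch below maps each data category to its own purpose, retention period, trigger event, and whether retention is statutorily required. The categories, periods, and statutory flags are invented placeholders, not figures from the report or from any particular national law.

```python
# Hypothetical deletion matrix: each data category keeps its own retention rule
# instead of applying the longest period found anywhere to all processing.
DELETION_MATRIX = {
    "invoice_data": {
        "purpose": "accounting",
        "retention_years": 10,
        "trigger": "end of the fiscal year of the invoice",
        "statutory": True,    # retention required by law -> Art. 17(3)(b) may apply
    },
    "marketing_profile": {
        "purpose": "direct marketing",
        "retention_years": 2,
        "trigger": "last interaction with the data subject",
        "statutory": False,   # no legal obligation -> erasure request must be honoured
    },
    "job_application": {
        "purpose": "recruitment",
        "retention_years": 1,
        "trigger": "closure of the recruitment procedure",
        "statutory": False,
    },
}

def erasure_blocked_by_retention(category: str) -> bool:
    """Art. 17(3)(b) only blocks erasure where a legal obligation actually
    requires retention of this specific category of data."""
    return DELETION_MATRIX[category]["statutory"]

# e.g. erasure_blocked_by_retention("marketing_profile") -> False:
# no statute requires keeping it, so the request must be honoured.
```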
Issue 6 – Deletion from backups
Half of participating authorities raised concerns about backup deletion. Backups create genuine structural tension with erasure: their value as recovery tools depends on not being modified, while Article 17 GDPR requires erasure without undue delay regardless of where the data is held. The report found a wide spectrum of practices: some controllers had no procedures at all and simply relied on automatic overwrite cycles; others had no erasure-specific procedures but applied general retention-based backup deletion. One best practice stood out: a tool that, upon reaching a predetermined retention end date, extracted all personal data relating to a data subject from all systems, moved it to an access-restricted environment, and permanently deleted it one month later. Some controllers replaced data with strings of random characters, achieving functional erasure within the backup structure (a rough sketch of that approach follows below). The report calls for EDPB guidance on what "without undue delay" means in the backup context.
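The random-character practice amounts to overwriting personal data fields inside backup records so that the backup keeps its structure while the personal data does not. Here is a minimal sketch of that idea, assuming JSON-like backup records and invented field names:

```python
import secrets

# Fields treated as personal data in this hypothetical backup format.
PERSONAL_FIELDS = {"name", "email", "phone", "address"}

def scrub_record(record: dict) -> dict:
    """Overwrite personal data fields with random strings so the backup keeps
    its structure (and stays restorable) while the personal data is
    functionally erased."""
    return {
        key: secrets.token_hex(8) if key in PERSONAL_FIELDS else value
        for key, value in record.items()
    }

def scrub_backup(records: list[dict], subject_id: str) -> list[dict]:
    """Apply the scrub only to records belonging to the requesting data subject.
    Note: whether subject_id itself must also be treated depends on whether it
    remains re-identifiable outside the backup."""
    return [
        scrub_record(r) if r.get("subject_id") == subject_id else r
        for r in records
    ]
```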
Issue 7 – Anonymisation used in place of deletion
Recital 26 GDPR states that information that cannot be linked to an identifiable person falls outside the Regulation. True anonymisation therefore removes data from GDPR's scope entirely, and using it to respond to an erasure request is legitimate. The report acknowledges this. The problem it identifies is different: many controllers claim to anonymise data when they are only pseudonymising it or applying partial masking – techniques that do not prevent re-identification and therefore do not satisfy Article 17. Multiple authorities found varying and often inadequate levels of technical implementation. Some large controllers used weaker masking than some mid-sized ones. The EDPB is developing new anonymisation guidelines following the CJEU's September 2025 ruling in Case C-413/23P (EDPS v. SRB).
The report cites ISO/IEC 27001 and ISO 9001 as best practices for anonymisation. The significance and accuracy of that citation is discussed in the critical analysis below.
Enforcement and guidance landscape
Several authorities report an upward trend in erasure complaints since GDPR came into force. In high-complaint jurisdictions, erasure represents around 15–19% of total complaints received (Netherlands, Luxembourg, Denmark). The Spanish authority estimates around 5% by mid-2025.
Enforcement examples include: the Finnish authority fined a parking enforcement company €75,000 for failure to delete data once no longer necessary (reduced on appeal to €70,000, upheld by the Supreme Administrative Court); the Dutch authority fined a recruitment company €6,000 for systematically ignoring removal requests. Multiple authorities issued compliance orders mandating deletion – in cases involving a vehicle insurer that retained data from a quotation-only contact, a video of a private altercation, and a forum post made by a user who was underage at the time. Nine authorities launched or continued formal investigations as part of the CEF, including Austria, Denmark, several German authorities, Portugal, France, Slovenia, Lithuania, and Cyprus.
The guidance landscape is extensive. Almost all participating authorities have published general or sector-specific guidance on erasure – from fact sheets and Q&As to web portals, podcasts, chatbots, board games for children, and interactive online tools. Highlights include the German DSK's standardised data protection model (Module 60 on Deletion and Destruction), the Greek authority's Wizard tool and byDefault SME project, the Spanish authority's 24/7 chatbot, and "Olivia" – a collaborative AI teaching tool built by the Croatian and Italian authorities with several universities.
What follows
The EDPB signals intent to develop further actionable guidance, with particular focus on anonymisation, exceptions, retention periods, and backup deletion. It will draw on the national-level guidance catalogued in the CEF. Seventeen authorities, along with several German authorities, plan additional national awareness activity – online guidance, webinars, conferences, and sector-specific materials (including mobile applications). Some authorities are actively considering formal investigations against specific controllers identified through the CEF. A small number plan to act against controllers that failed to respond to the questionnaire at all.
What's wrong with this Report
A. The anonymisation–deletion confusion
This is the report's most significant substantive error, and it runs through Issue 7 and into Issues 4 and 6. The report describes anonymisation as a "substitute for permanent deletion," frames it as a compliance risk, and calls for guidance to help controllers ensure data "can no longer be linked to an identifiable individual." This framing is partially correct – but it rests on a category confusion the report never resolves.
Recital 26 GDPR, which the report itself quotes, states that information that does not relate to an identifiable person falls outside the Regulation. That is not a narrow exception – it is a definitional boundary. Once data is genuinely and irreversibly anonymised to the Recital 26 standard, it ceases to be personal data, the GDPR ceases to apply, and the Article 17 obligation is discharged. True anonymisation is not a "substitute" for deletion in any pejorative sense – it achieves the same legal result.
The real problem – which the report identifies only at a technical level and never clearly separates conceptually – is that what many controllers call "anonymisation" is actually pseudonymisation (replacing identifiers with codes while retaining a key) or partial masking (removing some fields while leaving the record re-identifiable). Neither satisfies Article 17. This is a problem of implementation, not of principle. The report never makes this distinction explicit, and that silence matters: controllers who have genuinely anonymised data in compliance with Recital 26 receive the implicit message that their approach is a compliance risk, while controllers performing inadequate pseudonymisation are told they have the same problem. The conflation undermines clear compliance signalling.
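A toy example of the distinction, using invented data: pseudonymisation keeps a key that allows re-identification, so the data remains personal data and the erasure request is not satisfied; destroying the key and the directly identifying fields is a necessary step towards the Recital 26 threshold, although whether the residual attributes still allow re-identification remains the decisive question.

```python
import uuid

customer = {"name": "Jane Doe", "email": "jane@example.com", "purchases": 12}

# Pseudonymisation: identifiers replaced by a code, but the key table is kept.
# The controller (or anyone holding the key) can re-identify Jane, so this is
# still personal data and does not satisfy an Article 17 request.
pseudonym = str(uuid.uuid4())
key_table = {pseudonym: {"name": customer["name"], "email": customer["email"]}}
pseudonymised = {"id": pseudonym, "purchases": customer["purchases"]}

# Moving towards anonymisation: the key table is destroyed and directly
# identifying fields are dropped. Whether the result truly meets the
# Recital 26 threshold still depends on the re-identification risk posed by
# the remaining attributes and context -- that assessment is the hard part.
key_table.clear()
anonymised = {"purchases": customer["purchases"]}
```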
The report compounds the confusion in Issue 4, where it recommends anonymisation as a useful technical safeguard when a controller needs more time to comply with an erasure request – missing that genuine anonymisation is not merely a "technical safeguard" but the actual end of the processing.
The Austrian DSB decided in December 2018 that Article 17 GDPR does not require physical destruction of data: because Article 4(2) distinguishes "erasure" from "destruction", and because Recital 26 takes anonymous information entirely outside the GDPR's scope, the authority concluded that thorough anonymisation satisfies an erasure request – the personal reference is gone, and so is the processing. (And before objecting that this decision came too soon after the GDPR became applicable, recall that the definition of personal data – and therefore of anonymised data – was the same under Directive 95/46/EC.)
Moreover, the EDPB itself stated that "Anonymization of personal data is an alternative to deletion, provided that all the relevant contextual elements are taken into account and the likelihood and severity of the risk, including the risk of re-identification, are regularly assessed." (Guidelines 4/2019 on Article 25 Data Protection by Design and by Default).
Worse still, under Issue 7 the report lists as a best practice: "Implement technical standards such as ISO/IEC 27001 and ISO 9001 on management systems, as they help organisations improve their processes and accountability", in the context of addressing anonymisation difficulties. Neither standard is an anonymisation standard, and their citation here is misleading.
ISO/IEC 27001 is the international standard for information security management systems. It covers the governance of information security broadly. It contains no requirements regarding anonymisation techniques, de-identification methods, re-identification risk assessment, or what constitutes "anonymous" data. It is a governance framework, not a technical data-handling specification. ISO 9001 is a quality management standard covering operational processes and continual improvement. It has no connection to anonymisation, data deletion, or privacy engineering. Its citation in this context is a category error.
The standard the report could have cited (but does not) is ISO/IEC 20889:2018 (privacy-enhancing data de-identification terminology and classification of techniques). The WP29 Opinion 05/2014 on Anonymisation Techniques, though pre-GDPR, remains the most detailed regulatory framework for evaluating whether a technique satisfies the anonymisation threshold. Neither is referenced. A controller implementing ISO 27001 and ISO 9001 in full will not thereby know what anonymisation technique is appropriate, what the re-identification risk threshold is, or how to demonstrate compliance with Recital 26.
B. Methodological weaknesses
Non-representative sampling
The 764 responding controllers are not a random or statistically representative sample of EEA controllers. The aggregate dataset is a patchwork of purposive samples with different underlying populations, aggregated as though they were one study. Authorities with mandatory participation (formal investigations) had 100% response rates; mass-survey authorities had rates as low as 0.6%. Averaging findings across these structurally incomparable populations produces figures that are difficult to interpret and should not be treated as reliable indicators of EEA-wide compliance.
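A toy calculation (all figures invented) shows why pooling these populations misleads: the pooled figure is dominated by whichever sample happens to be larger, regardless of how unrepresentative it is of the population behind it.

```python
# Invented figures, for illustration only.
# Formal investigations: 20 controllers, mandatory 100% response, 8 assessed compliant.
# Mass survey: 5,000 controllers contacted, 30 respond (0.6%), 24 of the
# self-selected respondents look compliant.
investigated = {"responses": 20, "compliant": 8}    # 40% of a targeted sample
surveyed     = {"responses": 30, "compliant": 24}   # 80% of a self-selected 0.6%

pooled_rate = (investigated["compliant"] + surveyed["compliant"]) / (
    investigated["responses"] + surveyed["responses"]
)
print(f"Pooled 'compliance rate': {pooled_rate:.0%}")  # 64% -- of what population?
```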
No standardised compliance rubric
The central finding – that roughly two thirds of authorities assessed compliance as "average" – is entirely qualitative. Each authority applied its own judgment. There is no defined threshold, no scoring matrix, no shared rubric. One authority's "high" may be another's "average." The aggregate figure has no meaningful statistical content. It is the average of subjective impressions formed by different evaluators applying different standards to different samples using different methodological approaches.
Self-selection bias
Where participation was voluntary, controllers with more mature data governance may have been more likely to respond. The report itself acknowledges that fear of regulatory scrutiny was a factor in low response rates – which implies the voluntary sample may over-represent the relatively well-governed end of the spectrum. If so, the compliance picture is likely worse than the data suggests, not better.
C. The internal contradiction in the enforcement findings
Authorities conducting formal investigations "noted that the rejection of requests by responding controllers was overall justified." This is the most probative finding in the report, and it sits in direct tension with the general narrative of systematic non-compliance – particularly on Issue 4 (misuse of exceptions). If the most rigorous form of investigation found refusals to be largely justified, the inference of widespread misuse is significantly weakened. The report does not acknowledge this tension. Formal investigation findings are noted parenthetically and do not meaningfully influence the overall compliance assessment. They should.
D. The proportionality gap
A recurring finding – across Issues 1, 2, and 5 – is that small and micro-enterprises lack documented internal procedures, regular training, and formal retention schedules. This is presented as a compliance problem. A more proportionate reading is that it is an entirely predictable feature of organisational scale. The report itself acknowledges that many of these controllers had received zero erasure requests over the entire observation period. It is not self-evident that a micro-enterprise with three employees, processing basic contact data, and having received no erasure requests in years, is failing Article 17 simply because it lacks a formal internal procedure.
The GDPR's accountability principle requires controllers to be able to demonstrate compliance – not bureaucratic formalism disproportionate to the processing activities and risks involved. Article 24(1) expressly ties the required measures to the nature, scope, context and purposes of processing and to the risks it poses, and Recital 13 singles out the specific situation of micro, small and medium-sized enterprises. The report's recommendations are largely uniform regardless of size and context. The same procedural infrastructure is implicitly prescribed for a large e-commerce platform and a two-person professional services firm. That is inconsistent with the GDPR's own design.
E. Circular recommendations
For virtually every issue identified, the primary recommended action for the EDPB and national authorities is some variation of "issue further guidance." Issue 1: further templates and guidance including flowcharts. Issue 4: further targeted guidance and clarification. Issue 5: practical guidance on retention periods. Issue 6: more guidance on erasure in backups. Issue 7: continuing to issue practical actionable guidance. The recommendation is the same regardless of the problem.
There is no analysis of whether the extensive existing guidance – which the report itself catalogues in Section 5 as substantial, multi-format, and widely available – is actually being used. There is no engagement with the possibility that the problem is guidance uptake rather than guidance volume. An organisation that lacks awareness of Article 17 does not become compliant because the EDPB publishes another set of guidelines it is equally unlikely to consult. The causal pathway from "publish more guidance" to "improved compliance" is assumed rather than demonstrated, and the recommendation pattern happens to serve the institutional interests of the bodies making it.
F. The backup issue: genuine legal uncertainty, not just poor procedure
The report treats backup deletion as a compliance failure caused primarily by controllers' inadequate procedures. What it does not engage with is the genuine legal uncertainty about the scope of the Article 17 obligation with respect to backup copies. The obligation to erase "without undue delay" is clear in its general application. What is not clear – and what the report effectively acknowledges by calling for guidance on what "without undue delay" means in the backup context – is precisely how that obligation applies to backups that may be technically impossible to modify without compromising their structural integrity.
Some national authorities have accepted that, where a controller maintains procedural controls to prevent a restored backup from reintroducing erased data into live systems, the obligation may be considered satisfied until the backup is naturally overwritten. That nuanced position is nowhere reflected in the report, which treats backup retention as a procedural failure rather than a genuine area of legal ambiguity. The recommended best practice – "follow established standards to erase and destroy data in a secure and structured manner" – is unhelpful: no relevant published standard for backup-specific erasure in the GDPR context is identified or cited.
G. The Annex reveals more than the main report acknowledges
The Annex runs to well over 300 pages of national reports and contains significant inconsistencies that the main report does not adequately disclose. The Bulgarian authority allowed anonymous responses. Its national report assesses compliance as "High" – but since controllers were not identified, that assessment rests on a self-selected, unattributable, unverifiable dataset. The main report aggregates this into the overall figures without flagging it.
The Cypriot authority did not contact new controllers in the CEF context. It contributed findings drawn from pre-existing investigations rather than from the common questionnaire. Its inclusion in aggregate figures is therefore not comparable to authorities that ran new surveys. Several authorities answered "N/A" to large portions of the questionnaire or provided responses that apply only to their specific procedural framework. The level of incompatibility between national reports is significant enough to cast doubt on the reliability of aggregated cross-jurisdictional figures in the main report – but this is not disclosed.
Instead of conclusion
Taken together, these deficiencies are not peripheral. A report that cannot define "average compliance," misclassifies a lawful privacy-enhancing technique as a risk, cites inapplicable standards as best practice, and contains an unacknowledged contradiction between its investigative and questionnaire findings does not provide a reliable empirical account of Article 17 compliance in Europe. It is best understood as a coordinated regulatory signal – a statement of enforcement intent rather than a measured assessment of where the compliance baseline actually sits. Practitioners should engage with it on those terms: as a prompt to review internal erasure procedures, not as authoritative evidence of systemic sector-wide failure.



