Pseudonymized data after EDPS v SRB
On 4 September 2025, the Court of Justice of the European Union (CJEU) delivered its judgment in Case C-413/23 P, European Data Protection Supervisor (EDPS) v Single Resolution Board (SRB), setting aside the General Court’s 2023 judgment.
The CJEU held that (i) comments submitted by affected shareholders and creditors in the SRB’s “right to be heard” process 'related to' individuals because they reflected the authors’ personal opinions, and (ii) transparency obligations to data subjects regarding the identification of data recipients are to be assessed from the controller’s viewpoint at the time of collection. At the same time, the CJEU confirmed a contextual, risk-based approach: pseudonymization can, in certain circumstances, make data non-personal for a recipient who cannot reidentify individuals, taking into account 'all the means reasonably likely to be used'.
What happened?
Following the 2017 resolution of Banco Popular, the SRB collected and then pseudonymized participants’ comments before sharing them with Deloitte for analysis. Only the SRB held the key linking the alphanumeric codes to the participants’ identities. Several participants complained to the EDPS that the SRB’s privacy statement had not identified Deloitte as a recipient of personal data. The EDPS found that the SRB had not complied with Regulation (EU) 2018/1725, which sets the legal framework for the protection of personal data processed by European Union institutions, bodies, offices and agencies (EUI GDPR). The General Court annulled that decision in 2023, holding that it is necessary to consider the data recipient's perspective (and the means it had to identify the data subject) in order to assess whether pseudonymized data constitutes personal data (see our summary: When does pseudonymized data stop being personal data?). On appeal, the CJEU has now set aside the General Court’s ruling and referred the case back to the General Court for a decision on the SRB's second plea, namely that the EDPS had infringed the right to good administration.
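By way of illustration only, the sketch below shows the general pseudonymization pattern at issue: free-text comments are shared under random alphanumeric codes, while the table linking codes to identities stays with the disclosing party. This is a minimal, hypothetical example (the function and field names are our own assumptions), not a description of the SRB's actual process.

```python
import secrets

def pseudonymize(comments_by_author: dict[str, str]) -> tuple[dict[str, str], dict[str, str]]:
    """Replace author names with random alphanumeric codes.

    Returns (coded_comments, key_table). Only the disclosing party keeps
    key_table; a recipient receives coded_comments alone and, within this
    dataset, has no means of linking a code back to an author.
    """
    coded_comments: dict[str, str] = {}
    key_table: dict[str, str] = {}
    for author, comment in comments_by_author.items():
        code = secrets.token_hex(8)  # random code, not derived from the author's name
        coded_comments[code] = comment
        key_table[code] = author
    return coded_comments, key_table

if __name__ == "__main__":
    comments = {
        "Shareholder A": "I disagree with the provisional valuation.",
        "Creditor B": "The resolution process moved too quickly.",
    }
    shared, key = pseudonymize(comments)
    print(shared)  # what an external analyst would receive
    print(key)     # retained solely by the disclosing party
```

Whether such coded data remain personal data for the recipient is, as discussed below, a contextual question about the recipient's means of reidentification.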
Although the case concerned the EUI GDPR, the definition of personal data and the relevant transparency obligations are identical to those in the General Data Protection Regulation (EU) 2016/679 (GDPR). Consequently, the CJEU’s findings will also be relevant to the interpretation of the GDPR.
What the CJEU decided
The CJEU’s judgment focuses on controller-side transparency while retaining the recipient-specific analysis of identifiability.
- Transparency is assessed at collection from the controller’s side. The duty to inform data subjects about personal data recipients arises at collection and is assessed from the controller’s perspective.
The CJEU’s reasoning is that the obligation to provide such information forms part of the legal relationship between the data subject and the controller, and generally has to be fulfilled at the time the personal data are collected (in this case, it was also relevant to whether valid consent to processing had been obtained). The CJEU therefore considered that whether the controller has met its transparency obligations cannot depend on later assessments of whether data subjects are identifiable by particular recipients of subsequent data transfers.
- Where comments reflect the author’s views, they are by their nature information that 'relates to' that individual, and there is no need to examine the purpose or effect of that information. Where it is common ground that information expresses a person’s opinion or view, a data protection authority need not perform a detailed analysis of the information’s purpose and effect to show that it 'relates to' the author.
This is relevant because 'personal data' is ‘any information relating to an identified or identifiable natural person’, and previous case law has set out that information relates to an identified or identifiable natural person where, by reason of its content, purpose or effect, it is linked to an identifiable person. The CJEU highlighted that the use of the word 'or' shows that these are alternative criteria. In this case, the nature of the content (being an expression of a person’s thinking, and therefore necessarily closely linked to that person) made it unnecessary for the EDPS to also analyse the purpose and effect of the data.
- Pseudonymized data are not necessarily personal data for every recipient. In line with previous case law, the CJEU held that pseudonymized data need not be personal data “in all cases and for every person”. Where effective measures prevent the data subject from being identified from the pseudonymized data and the recipient is not in a position to reidentify data subjects through “means reasonably likely to be used”, the data subject is not identifiable for that recipient. The CJEU stated that the concept of 'personal data' is not unlimited: it requires that the data subject be identified or identifiable. Here, the CJEU agreed with the Advocate General’s Opinion that data protection obligations that require a data subject to be identified cannot be imposed on an entity that is not in a position to carry out that identification.
- Risk evolves. Reiterating the “means reasonably likely” test, the CJEU noted that identification is not reasonably likely where the risk is “insignificant” (e.g., because it is legally barred or would require “disproportionate effort”), but that data may become personal data when made available to parties who do have such means.
How this fits with previous case law and guidance
Nowak (2017) – exam scripts and examiners’ comments can be personal data because they communicate opinions about, and of, identifiable persons. The Court drew on this logic in EDPS v SRB when confirming that subjective information, such as opinions and assessments, can constitute personal data provided it 'relates' to the person in question by reason of its content, purpose or effect. The CJEU found in EDPS v SRB that it is not necessary also to analyse 'purpose or effect' to determine whether data 'relates to' an individual where the content consists of personal opinions or views, because these, as an expression of a person’s thinking, are necessarily closely linked to that person.
Breyer (2016) – dynamic IP addresses may be personal data where a controller has legal means to obtain additional information from another party; identifiability depends on the “means reasonably likely” to be used in context. In Breyer, the CJEU found that if the available means would require a disproportionate effort in terms of time, cost and manpower, or if identification of the data subject is prohibited by law, the risk of identification is insignificant and the data should not be considered 'personal data'.
IAB Europe (2024) – a TC String (a combination of letters and characters which encodes and records user preferences when users visit a website or app) can be personal data when it can reasonably be associated with an identifier; the Court rejected a formalistic view that all identifying information must be in one party’s hands.
OC v Commission (2024) – identification means are not reasonably likely where the risk is insignificant because identification is prohibited by law or requires disproportionate effort. The EDPS v SRB judgment cites this when framing reidentification risk.
Guidance – The EDPB’s Guidelines 01/2025 on Pseudonymization (adopted for consultation in January 2025) clarify the role of pseudonymization under the GDPR. They confirm that pseudonymized data remain personal data if they can be linked back to an individual by the controller or another party with access to additional information. At the same time, pseudonymization can reduce risks, support reliance on legitimate interests (Article 6(1)(f)), help demonstrate compatibility of further processing (Article 6(4)) and strengthen compliance with the principles in Article 5, data protection by design (Article 25), and security obligations (Article 32).
Why this still does not settle many 'pseudonymization vs anonymization' questions
This decision provides clarification on how to approach certain controller obligations in connection with disclosures of pseudonymized data, but it only modestly assists with assessing whether data subjects are 'identifiable' by third-party data recipients. Organisations should note that:
- The key test remains context-specific. Whether a natural person is identifiable by “means reasonably likely” to be used by the controller or other person (e.g., data recipient) is the fulcrum of whether data is personal data, but the CJEU offers no specific or quantitative thresholds. To ascertain whether means are reasonably likely to be used to identify a natural person, account should be taken of "all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments". Therefore, organisations must carefully assess such factors and revisit them as capabilities evolve.
- Recipient-specific status is hard to manage at scale. Organisations may wish to review their privacy notices in relation to disclosures of pseudonymized data, but this analysis can be challenging. A dataset may be non-personal for one recipient and personal for another. Complex vendor chains, data clean rooms and model providers complicate the assessment of who has “means” today and tomorrow. The judgment did not touch on any potential impact of controls against reidentification in third-party agreements.
- Transparency remains demanding. In many cases, controllers will have obligations to identify categories of data recipients (and, in some cases, specific recipients) at the collection stage, even if the data will be pseudonymized before sharing. Enumerating realistic recipient categories without over- or under-inclusion is challenging, particularly in extended supply chains and dynamic AI ecosystems.
The judgment is one small step toward a clearer interpretation of EU data protection rules, rejecting broad-brush dogma such as “pseudonymized data are always personal data”. However, it does not deliver an anonymization test that organisations can apply.
Bottom line
The CJEU continues to interpret “personal data” broadly (Nowak, Breyer, IAB Europe), but EDPS v SRB shows a degree of willingness to consider the impact of practical reidentification barriers and recipient-specific capabilities.
Increasingly, organisations, industry bodies and other stakeholders are calling for the rules to be applied in a way that meaningfully protects data subjects while enabling responsible AI development and data-driven research, and that does not block innovation where there is no realistic risk of harm. The next step in this direction would be more practical guidance: operational criteria for anonymization, practical examples of what would and would not constitute “means reasonably likely to be used” to identify a data subject in different circumstances, consideration of whether contractual measures can be used to mitigate this risk with certain third parties, and worked examples showing when effective controls reduce residual identifiability risk to a sufficiently low level.