
Why a Face Without a Name Is Still Personal Data Under GDPR

Here's a belief that circulates constantly among investigators, compliance teams, and anyone who's had to sit through a GDPR briefing: "If we blur the face or strip the name, it's no longer personal data." It sounds reasonable. It has the ring of practical common sense. And it is, almost entirely, wrong.

TL;DR

A face image without a name is almost never truly anonymous under GDPR — it's pseudonymous at best, which means it still attracts full data protection obligations, especially when facial comparison technology can re-link it to a real person.

Recent EU court decisions have been quietly, methodically dismantling this assumption. The logic isn't complicated once you see it clearly — but the implications are significant enough that getting it wrong could expose an investigation, a dataset, or an entire workflow to serious legal challenge. So let's actually understand what the law says, why it says it, and what that means in practice for anyone working with face images professionally.


The Law Doesn't Ask "Is the Name There?" It Asks "Could Someone Find It?"

This is the conceptual pivot point. Everything else follows from it.

Under GDPR, data becomes "personal data" whenever any party — not just the person holding the file — has reasonable means to link it back to a living individual. The controller doesn't need to intend re-identification. They only need to be unable to rule it out with confidence. That's a very different threshold than most people assume, and it's one that face images almost always fail to clear.

Think about it this way: imagine a fingerprint database where every name field has been deleted. Nobody in forensics would argue that database is now outside the law. A fingerprint is a biological identifier — its re-linkage potential doesn't disappear because you removed a label. A face works exactly the same way. The geometry of someone's facial structure, the spacing of their features, the measurements that make one person visually distinct from seven billion others — that is the identifying information. The name was always secondary.

"The Court confirmed that whether data is 'personal data' depends on whether the recipient of the data has the legal means and practical possibility to identify the data subject." Skadden, Arps, Slate, Meagher & Flom LLP, analysis of the EU Court of Justice clarification on pseudonymised data

That phrase — "legal means and practical possibility" — is doing enormous work. It's asking not whether re-identification is certain, but whether it's reasonably achievable. In a world where facial comparison technology is widely available and improving constantly, the answer for face images is almost always yes.
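To see why face images clear the "reasonable means" bar so easily, it helps to look at the mechanics. Modern facial comparison reduces an image to a numeric embedding, and matching a nameless probe against a known gallery is just a nearest-neighbour search. The sketch below uses random synthetic vectors in place of real biometric templates (NumPy assumed; the gallery, index 417, and noise level are all illustrative), purely to show how trivially a "de-identified" image re-links to an identity:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Reference gallery: 1,000 embeddings whose identities are known elsewhere.
# Real systems store vectors like these; here they are random stand-ins.
gallery = rng.normal(size=(1000, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

# A "nameless" probe: the same person as gallery entry 417, re-encoded
# with slight noise. No name is attached to it anywhere.
probe = gallery[417] + rng.normal(scale=0.05, size=128)
probe /= np.linalg.norm(probe)

# Cosine similarity against the entire gallery in one matrix product.
scores = gallery @ probe
best_match = int(np.argmax(scores))

print(best_match)  # the probe re-links to a known identity in one step
```

The point of the toy example is not the algorithm but the threshold: re-identification here requires no name field at all, only the face vector and a gallery to compare it against. That is exactly the "practical possibility" the Court is describing.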


Three Categories, One Critical Distinction

Most practitioners treat "anonymised" and "de-identified" as interchangeable. They aren't, and this is where the legal exposure actually lives.

Here's the spectrum as GDPR actually defines it:

De-identified data has had obvious labels removed — names, ID numbers, email addresses. The underlying data is unchanged. This is what most people do when they think they're "anonymising" something. It offers essentially no GDPR protection on its own, because re-identification is often trivially easy with comparison tools.

Pseudonymised data, defined precisely under GDPR Article 4(5), means data processed so it cannot be attributed to a specific person without use of additional information — and that additional information must be kept separately, under strict technical safeguards. Pseudonymisation is explicitly recognised as a risk-reduction measure, but — and this is the part people miss — it doesn't take data outside GDPR's scope. It just earns you some credit for good practice. You're still processing personal data.

Anonymised data is the only category that exits GDPR entirely. True anonymisation means re-identification is irreversible, not just inconvenient. Regulators have consistently found that face images almost never qualify, for the simple reason that facial comparison makes re-linkage a reasonable means rather than a theoretical one. The bar for genuine anonymisation is extraordinarily high — and a face photo, by its nature, is built to be recognised.
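The separation that Article 4(5) demands can be made concrete. The sketch below is plain Python with illustrative names and file paths (nothing here reflects a real dataset or any particular tool): the working set the analyst handles carries only opaque tokens, while the token-to-name key table, the "additional information" in the Article's wording, is held separately under its own safeguards.

```python
import secrets

# Illustrative records; names and paths are invented for the example.
records = [
    {"name": "Alice Example", "image": "case01/probe_a.jpg"},
    {"name": "Bob Example",   "image": "case01/probe_b.jpg"},
]

key_table = {}    # re-linkage information: stored apart, access-controlled
working_set = []  # what the analyst actually handles day to day

for rec in records:
    token = secrets.token_hex(8)        # random token, not derivable from the name
    key_table[token] = rec["name"]      # the "additional information" of Art. 4(5)
    working_set.append({"subject": token, "image": rec["image"]})

# The working set contains no names. But because the key table (or a face
# comparison tool) can re-link it, it remains personal data under GDPR.
assert all("name" not in r for r in working_set)
```

Note what the sketch does not achieve: the face images themselves are untouched, so even if the key table were destroyed, comparison against any external gallery could re-link them. That is precisely why pseudonymisation earns risk-reduction credit but never an exit from the regulation.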

Why This Distinction Matters for Investigators

  • De-identification ≠ anonymisation — Removing names from image files does not reduce GDPR obligations; the face itself retains full re-identification potential via comparison.
  • Processing purpose determines classification — The moment a face image is processed specifically to identify someone, it crosses into Article 9 special category biometric data, attracting the highest protection tier regardless of what labels were stripped.
  • Methodology creates defensibility — Documented, proportionate, professionally grounded comparison methodology is precisely what separates legally defensible casework from exposure. How you process matters as much as what you store.


The Article 9 Trap Most People Don't See Coming

Here's where it gets particularly interesting — and where many well-intentioned practitioners walk straight into a problem they didn't know existed.

Face images aren't automatically classified as Article 9 "special category" biometric data. A photograph of someone's face stored as a general record — a staff headshot, say — is regular personal data under Article 6. That changes the moment the purpose of processing shifts to unique identification. And facial comparison, almost by definition, is processed for exactly that purpose.

So the same image file can occupy two completely different legal categories depending on what you do with it. A face photo in a general archive: Article 6 personal data. That same image run through a comparison workflow to identify who someone is: Article 9 special category biometric data, requiring explicit legal basis, stringent safeguards, and in many cases a Data Protection Impact Assessment. The file didn't change. The processing purpose did. (This is the kind of distinction that makes data protection lawyers genuinely useful to have around.)

For investigators using facial comparison tools professionally — whether for asset tracing, due diligence, or case verification — understanding how facial recognition biometrics are legally classified during active processing is foundational to building a defensible methodology. The technology question and the legal question aren't separate tracks. They're the same conversation.

Article 4(5)
GDPR's precise legal definition of pseudonymisation — which explicitly confirms that pseudonymised data remains personal data and stays within the regulation's full scope
Source: EU General Data Protection Regulation

The Myth That Keeps Getting Investigators Into Trouble

"As long as I don't store names, I'm outside privacy law." Up next: Gdpr Facial Comparison Vs Biometric Mass Collectio.

This belief is understandable. It comes from a reasonable instinct — remove the identifying label, remove the risk. The problem is that GDPR wasn't written for an era of index cards. It was written for an era of comparison algorithms, vector embeddings, and databases that can match a face across millions of records in seconds.

The EU's approach, reinforced by the Court of Justice, is explicitly future-aware in this respect. The "reasonable means" test is designed to account for the capabilities that actually exist in the world — not just what the current data holder possesses, but what any reasonably resourced party might access. Facial comparison technology clears that bar comfortably. Regulators know it. Courts are now saying it explicitly.

What this means practically: an investigator who stores unlabelled face images but uses them in a comparison workflow has not escaped GDPR by omitting the name column. The workflow itself — the act of processing for identification — is where the legal exposure sits. And "I didn't store the names" is not a defence that will satisfy a data protection authority examining how the comparison was conducted, documented, and justified.

Look, nobody's saying this is simple. Investigators have legitimate purposes for using facial comparison — proportionate, professionally grounded work that serves real accountability functions. GDPR isn't designed to make that impossible. But it does require that you know what category of data you're handling, that you have a lawful basis proportionate to that category, and that your methodology can be explained and defended if challenged. That's not an unreasonable ask. It's just not the same as "remove the name and carry on."

Key Takeaway

A face image without a name is pseudonymous, not anonymous — and under GDPR, pseudonymous data carries full legal obligations. The re-identification capacity of a face, not the presence of a label, is what keeps it firmly inside data protection law. How you process and link facial data determines your exposure, not what you chose to delete before you started.


So here's the question worth sitting with: the next time someone tells you that stripping the name field from an image dataset solves the GDPR problem, ask them this — if I can take that nameless photo and match it to a person using comparison software, at what point exactly did it stop being about that person? The law already knows the answer. The face was always the data.

Have you ever been told that "anonymising" or blurring faces makes GDPR a non-issue for investigation work — and did you believe it at the time? The answer probably says a lot about how that guidance was explained.
