
Blurring Names Doesn't Anonymise Faces Under GDPR | Podcast

Blurring a Name Doesn't Anonymise a Face: What GDPR Actually Says




Full Episode Transcript


A European court just ruled that the agency which pseudonymises a dataset — swaps names for codes, blurs identifiers — still bears full GDPR obligations for that data. Not reduced obligations. Full ones. The blur doesn't shrink your legal exposure. It shifts the entire burden onto you.

If you work anywhere near facial images, case files, or biometric evidence, this matters right now. An EU court examined whether pseudonymised data — data where direct identifiers get replaced with codes — counts as personal data under GDPR. The answer depends on who's holding it. For the original controller, the organization that created the pseudonymised dataset, the answer is an unqualified yes.



They can still re-identify the individual. So the question threading through this whole ruling is deceptively simple — who has the key? The court introduced what's called the "reasonably likely" standard for re-identification.

It doesn't ask whether someone could theoretically reverse the pseudonymisation. It asks whether re-identification is realistically probable given the technical safeguards in place. And that assessment is context-dependent.

The same dataset can be personal data for one party and non-personal for another. An investigator who archives case images and stores the pseudonymisation key locally? GDPR applies in full. A consultant who receives those images without any access to the key? Potentially not.




Same file. Different legal classification. Entirely based on access to re-identification tools.

Now, a lot of practitioners assume encryption solves this. It doesn't. Encryption isn't anonymisation.

It's pseudonymisation at best. The whole point of encryption is reversibility — you need to get the original information back. So an encrypted case file with a facial image inside?

Still regulated. Still biometric data. Still personal data under GDPR.



The part that caught most legal analysts off guard was the transparency obligation. The court said the original controller must inform data subjects about all foreseeable recipients of their data at the time of collection. Even if those recipients can't identify the person.

You can't dodge that duty by pseudonymising before you share. And the EU's AI Act now layers additional compliance requirements on top of GDPR for any system using biometric data for identification or categorization. Facial images without names still fall squarely into that high-risk category. Most people assume pseudonymisation reduces their obligations.


The Bottom Line

The court said the opposite. It creates new ones — documentation obligations. You must record why re-identification isn't reasonably likely for each recipient, and you must revisit that assessment as technology and datasets evolve.

It's not a one-time checkbox. So, plain and simple. Blurring a name on a facial image doesn't make it anonymous.

If you're the one who did the blurring, GDPR treats you as if the name's still there. And you owe the person in that image a full accounting of everywhere their data might go.

The forward-looking reality is this — every pseudonymised biometric file sitting in your archive is a compliance obligation waiting for a regulator to ask about it. Document your controls now. Full breakdown's in the show notes.
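The per-recipient documentation duty described in the episode could be tracked with something as lightweight as a structured record. Here is a minimal Python sketch of that idea; the class name, fields, and dates are illustrative assumptions, not a legal template or anything the ruling prescribes.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch: one way to record, per recipient, why
# re-identification is not "reasonably likely", and when that
# judgment must be revisited. Purely illustrative.

@dataclass
class RecipientAssessment:
    recipient: str          # who receives the pseudonymised data
    has_key_access: bool    # can they reach the re-identification key?
    safeguards: list[str]   # technical/organisational measures in place
    rationale: str          # why re-identification isn't reasonably likely
    assessed_on: date       # when the judgment was made
    review_due: date        # next scheduled reassessment

    def is_stale(self, today: date) -> bool:
        """The ruling frames this as an ongoing duty, not a one-time checkbox."""
        return today >= self.review_due

assessment = RecipientAssessment(
    recipient="External consultant",
    has_key_access=False,
    safeguards=["key stored on-premises only", "images shared without identifiers"],
    rationale="No access to the pseudonymisation key; no auxiliary dataset.",
    assessed_on=date(2025, 1, 10),
    review_due=date(2026, 1, 10),
)

print(assessment.is_stale(date(2025, 6, 1)))  # False: still within the review window
```

The point of the `review_due` field is the episode's "not a one-time checkbox" warning: the record itself forces a date at which the reasonably-likely assessment gets redone as technology and datasets evolve.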

Ready to try AI-powered facial recognition?

Match faces in seconds with CaraComp. Free 7-day trial.

Start Free Trial