Blurring Names Doesn't Anonymise Faces Under GDPR | Podcast
This episode is based on our article: Blurring Names Doesn't Anonymise Faces Under GDPR.
Full Episode Transcript
A European court just ruled that the agency which pseudonymises a dataset — swaps names for codes, blurs identifiers — still bears full G.D.P.R. obligations for that data. Not reduced obligations. Full ones. The blur doesn't shrink your legal exposure. It shifts the entire burden onto you.
If you work anywhere near facial images, case files, or biometric evidence, this matters right now. An E.U. court examined whether pseudonymised data — data where direct identifiers get replaced with codes — counts as personal data under G.D.P.R. The answer depends on who's holding it. For the original controller, the organization that created the pseudonymised dataset, the answer is an unqualified yes.
They can still re-identify the individual. So the question threading through this whole ruling is deceptively simple — who has the key? The court introduced what's called the "reasonably likely" standard for re-identification.
It doesn't ask whether someone could theoretically reverse the pseudonymisation. It asks whether re-identification is realistically probable given the technical safeguards in place. And that assessment is context-dependent.
The same dataset can be personal data for one party and non-personal for another. An investigator who archives case images and stores the pseudonymisation key locally? G.D.P.R. applies in full. A consultant who receives those images without any access to the key? Potentially not. Same file. Different legal classification. Entirely based on access to re-identification tools.
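To make the key-access point concrete, here's a minimal sketch of pseudonymisation with a retained key. The field names and structure are hypothetical, purely for illustration: the point is that whoever holds the mapping can reverse the coding, while a recipient of only the coded records cannot.

```python
import secrets

def pseudonymise(records):
    """Replace each name with a random code; return coded records plus the key."""
    key = {}  # code -> original name (the re-identification key)
    coded = []
    for rec in records:
        code = secrets.token_hex(4)
        key[code] = rec["name"]
        coded.append({"subject": code, "image": rec["image"]})
    return coded, key

def reidentify(coded, key):
    """Anyone holding the key restores the names, so the data stays personal."""
    return [{"name": key[rec["subject"]], "image": rec["image"]} for rec in coded]

records = [{"name": "Alice", "image": "img_001.png"}]
coded, key = pseudonymise(records)
# The controller keeps `key` and can reverse the coding at any time;
# a consultant who receives only `coded` sees random codes with no way back.
restored = reidentify(coded, key)
```

The same `coded` list is personal data in the controller's hands and, potentially, not in the consultant's — which is the court's context-dependent classification in miniature.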
Now, a lot of practitioners assume encryption solves this. It doesn't. Encryption isn't anonymisation. It's pseudonymisation at best. The whole point of encryption is reversibility — you need to get the original information back. So an encrypted case file with a facial image inside? Still regulated. Still biometric data. Still personal data under G.D.P.R.
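That reversibility can be shown in a few lines. This is a toy XOR cipher, for demonstration only and not a real encryption scheme: applying it twice with the same key returns the original bytes exactly, which is precisely why encrypted personal data remains personal data.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key.
    Applying it twice with the same key recovers the original data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"facial_image_of_subject"
key = b"secret"
ciphertext = xor_cipher(plaintext, key)   # unreadable without the key
recovered = xor_cipher(ciphertext, key)   # byte-for-byte identical to plaintext
```

Real schemes like AES are vastly stronger, but they share the same property: the holder of the key gets everything back, so encryption is key-gated pseudonymisation, not anonymisation.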
The part that caught most legal analysts off guard was the transparency obligation. The court said the original controller must inform data subjects about all foreseeable recipients of their data at the time of collection. Even if those recipients can't identify the person.
You can't dodge that duty by pseudonymising before you share. And the E.U.'s A.I. Act now layers additional compliance requirements on top of G.D.P.R. for any system using biometric data for identification or categorization. Facial images without names still fall squarely into that high-risk category. Most people assume pseudonymisation reduces their obligations.
The Bottom Line
The court said the opposite. It creates new ones — documentation obligations. You must record why re-identification isn't reasonably likely for each recipient, and you must revisit that assessment as technology and datasets evolve.
It's not a one-time checkbox. So, plain and simple. Blurring a name on a facial image doesn't make it anonymous.
If you're the one who did the blurring, G.D.P.R. treats you as if the name's still there. And you owe the person in that image a full accounting of everywhere their data might go.
The forward-looking reality is this — every pseudonymised biometric file sitting in your archive is a compliance obligation waiting for a regulator to ask about it. Document your controls now. Full breakdown's in the show notes.
