
"Verified" Doesn't Mean Matched: Why 5–6% of Passed Identity Checks Still Hide the Wrong Face | Podcast

"Verified" Doesn't Mean Matched: Why 5–6% of Passed Identity Checks Still Hide the Wrong Face

"Verified" Doesn't Mean Matched: Why 5–6% of Passed Identity Checks Still Hide the Wrong Face | Podcast

0:00-0:00

This episode is based on our article.

"Verified" Doesn't Mean Matched: Why 5–6% of Passed Identity Checks Still Hide the Wrong Face | Podcast

Full Episode Transcript


A verified identity profile just cleared every automated check. The credential is cryptographically authentic, government-issued, untampered. And there's still roughly a one-in-twenty chance the person behind it is a fraud.


According to industry data from Veriff, around five to six percent of all identity verification sessions involve someone actively trying to pose as somebody else. These aren't failed sessions. These are sessions that passed. If you work in investigations, compliance, or fraud prevention, that number should change how you look at every "verified" badge on your screen. So why do verified credentials still let the wrong face through?

The European Commission recently published a use-case manual for its E.U.D.I. Wallet — a digital identity system that lets citizens prove they're above a certain age without revealing their full birthdate or other personal details. It uses something called selective disclosure. That means the wallet shares only the minimum claim needed — "yes, this person is over eighteen" — and keeps everything else locked. Cryptographically, it's solid. The credential itself is tamper-proof and properly issued. But that proof applies to the digital object, not the human holding the phone.
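Selective disclosure can be illustrated with a minimal sketch. This is an illustrative toy, not the actual E.U.D.I. Wallet protocol: the class name and method are hypothetical, and a real wallet would use cryptographic proofs rather than a plain method call. The point is that the full birthdate never leaves the wallet; only the boolean claim does.

```python
from datetime import date

class WalletCredential:
    """Toy credential: stores the full birthdate privately and
    discloses only the minimal claim that was asked for."""

    def __init__(self, birthdate: date):
        self._birthdate = birthdate  # never shared directly

    def disclose_over_age(self, years: int, today: date) -> bool:
        # Compute the age internally; only the yes/no answer escapes.
        age = today.year - self._birthdate.year - (
            (today.month, today.day) < (self._birthdate.month, self._birthdate.day)
        )
        return age >= years

cred = WalletCredential(date(2003, 6, 15))
print(cred.disclose_over_age(18, today=date(2025, 1, 1)))  # True: only "over 18" leaves the wallet
```

Note that even a real, cryptographically enforced version of this proves a fact about the credential, not about who is holding the phone.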

Most people assume "verified" means the system confirmed the person's face matches the I.D. photo. That assumption makes sense — the word "verified" sounds absolute. In reality, verification confirms the credential is real. It doesn't confirm the person presenting it is the person pictured in it. A fraudster holding a legitimate credential is still a fraudster.



The Single-Image Problem

Now layer on the single-image problem. Most identity systems compare a live selfie against one reference photo — often taken five or even ten years earlier. Faces change. Lighting differs. One outdated photo simply isn't enough to generate a reliable likeness score. And who suffers most from that gap? According to research compiled by Patronscan, darker-skinned individuals and women experience significantly higher false match rates. The system returns equally high confidence scores for true matches and biased false positives. Without a manual comparison, you can't tell which is which.
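The mechanics behind that gap can be sketched with toy numbers. The embeddings and scores below are invented for illustration; real systems compare high-dimensional learned face embeddings, typically with cosine similarity. The sketch shows why one outdated reference photo yields a shakier score than several recent ones.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embeddings; real face embeddings have hundreds of dimensions.
live_selfie   = [0.9, 0.1, 0.4]
old_id_photo  = [0.7, 0.3, 0.6]                       # single reference, years out of date
recent_photos = [[0.88, 0.12, 0.42], [0.91, 0.09, 0.38]]

single_score = cosine_similarity(live_selfie, old_id_photo)
multi_score = sum(cosine_similarity(live_selfie, p) for p in recent_photos) / len(recent_photos)
print(single_score, multi_score)
```

The averaged score over multiple recent references is both higher and more stable than the single stale comparison, which is why a lone decade-old photo produces unreliable likeness scores.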

What about age estimation tools specifically? According to N.I.S.T. testing, those tools often need the challenge age set between twenty-nine and thirty-three just to keep false positives low. So a system claiming to verify an eighteen-year-old might carry a margin of error of fifteen years or more. That margin is invisible to anyone who only sees the word "verified" on their screen.
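The challenge-age logic can be made concrete with a small sketch. The ±12-year error margin below is a hypothetical round number chosen to match the scale of the N.I.S.T. figures above, not a measured value from any specific system.

```python
def passes_age_gate(estimated_age: float, challenge_age: float) -> bool:
    """The gate passes only if the estimated age clears the challenge age."""
    return estimated_age >= challenge_age

# Hypothetical margin: estimates can be off by up to 12 years either way.
ERROR_MARGIN = 12

# Naive gate set at the legal threshold (18): a 17-year-old whose age is
# over-estimated by the full margin (17 + 12 = 29) sails through.
print(passes_age_gate(17 + ERROR_MARGIN, challenge_age=18))   # True: false positive

# Gate raised to 30 (inside the 29-33 range): the same over-estimated
# minor is now rejected...
print(passes_age_gate(17 + ERROR_MARGIN, challenge_age=30))   # False

# ...but so is a genuine 25-year-old. The margin is hidden from anyone
# who only sees "verified".
print(passes_age_gate(25, challenge_age=30))                  # False
```

Raising the challenge age trades false positives for false negatives; the tradeoff is invisible at the "verified" badge level.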

Spoofing makes it worse. A video played in front of a camera or a three-D printed mask can fool age verification systems into false positives. Assuming no one bothered with a deepfake is a dangerous bet.


The Bottom Line

The credential proves the document is real. Only a human comparing faces proves the person is real.

A verified credential means the digital object hasn't been faked. It doesn't mean the face matches. And in roughly five out of every hundred sessions, it doesn't. Next time you see "verified" on a profile, treat it as the starting line, not the finish. The written version goes deeper into the limitations of face recognition systems.

Ready to try AI-powered facial recognition?

Match faces in seconds with CaraComp. Free 7-day trial.
