CaraComp Podcast

Facial Recognition Isn't on Trial. Your Explanation Is.


This episode is based on our article: "Facial Recognition Isn't on Trial. Your Explanation Is."

Full Episode Transcript


In Illinois right now, lawmakers are pushing a bill that would ban police from using facial recognition. At the same time, the T.S.A. scans travelers' faces at more than two hundred fifty airports across the country. Same technology. Opposite directions.


If you've ever walked through airport security or had your photo taken in public, this story is about you. Because the rules around who can scan your face, and why, depend entirely on where you're standing and who's doing the scanning. That's not a hypothetical. It's already the law. Illinois House Bill fifty-five twenty-one picked up two new co-sponsors after stalling in committee earlier this year. If it passes, police in one of the largest states in the country would lose access to a tool that federal agents use on millions of travelers every single day. Meanwhile, in January of twenty twenty-five, the D.H.S. Inspector General launched an audit of how the T.S.A. actually deploys that same facial comparison technology. So the question threading through all of this isn't whether facial recognition works. It's whether anyone can clearly explain the difference between how it's used — and whether that explanation holds up.

Start with Robert Williams. He's a forty-five-year-old Black man from Detroit. In twenty twenty, an investigator leaned almost entirely on a facial recognition match to obtain a warrant for his arrest. The algorithm got it wrong. Williams was not the person who committed the crime. He was arrested anyway. That case became a turning point — not because the technology failed in some exotic way, but because a detective treated a match like a conclusion instead of a starting point. One person. One shortcut. One wrongful arrest that changed the national conversation.

And that's exactly the distinction investigators on the ground say matters most. Retired Riverside Police Chief Tom Weitzel has called facial recognition one of the most important investigative tools to arrive in policing in half a century. But he and other law enforcement voices are quick to draw a line. A facial recognition hit is supposed to work like a fingerprint match or a D.N.A. result — it generates a lead, not a conviction. Detectives are supposed to take that lead and corroborate it with witness statements, surveillance footage, physical evidence. The technology names a possibility. The investigation confirms or eliminates it. That's the methodology. But legislatures and the public often collapse that whole process into one word — surveillance — and treat every use of the technology as identical.



Now zoom out. According to a C.S.I.S. analysis of state-level regulation, fifteen states had enacted laws governing how police use facial recognition by early twenty twenty-five. More than sixteen cities have passed outright bans, starting with San Francisco back in twenty nineteen. Washington State took a middle path — its law says police can't use a facial recognition result as the sole basis for establishing probable cause. They can use it. They just can't rely on it alone. That's a guardrail, not a wall. But no similar restriction applies to the T.S.A., because the T.S.A. isn't investigating crimes. According to the agency's own fact sheet, its facial comparison technology automates the manual I.D. check that agents already perform. It's credentialing — confirming you are who your boarding pass says you are. So one use is investigative targeting. The other is identity verification. Legally, they're treated as completely different activities. For the person whose face gets scanned, though, it can feel exactly the same.

And the notice problem makes that worse. Accounts collected by the Algorithmic Justice League allege that most passengers receive little or no clear information about their right to refuse a face scan at the airport. Signage at checkpoints often uses phrases like "biometric identity technology" instead of plainly saying "facial recognition." If you didn't know you could opt out, you probably didn't. That gap between what's technically voluntary and what feels mandatory — that's where public trust breaks down.

Opponents of the technology point to something real. People have been misidentified and held for hours, sometimes days, based on system errors. Those cases rarely make headlines the way a solved murder does. But in the past two years, New Jersey, Maryland, and Montana have all added disclosure requirements — meaning if police use facial recognition during an investigation, they now have to tell the defendant. That's a significant shift. For years, according to the American Bar Association, departments routinely kept facial recognition use hidden from the people it affected — including people who were charged. Defense attorneys couldn't challenge evidence they didn't know existed.


The Bottom Line

The real divide isn't between people who support the technology and people who oppose it. It's between people who can explain how they use it and people who can't. A categorical ban and unchecked deployment are two sides of the same problem — both skip the explanation.

So — facial recognition is spreading fast, but the rules depend on who's using it, where, and for what purpose. A match from an algorithm isn't proof of anything. It's a starting point. And the biggest risk right now isn't the technology itself — it's the gap between how it actually works and how it's understood by courts, lawmakers, and the rest of us. Whether you're building a case or just boarding a flight, the face scan already happened. What matters now is whether anyone can tell you exactly why. The full story's in the description if you want the deep dive.
