Australia Just Made Face-Matching Obsolete. Here's the New Bar Every ID System Must Clear.
This episode is based on our article of the same name.
Full Episode Transcript
Australia's tax office just put out a call for new facial liveness detection technology. Not because the old system broke. Because the people trying to fool it got better.
That system — called myID — serves about fourteen million people. It's how Australians prove who they are to their government online. And right now, the tools protecting it haven't been updated since 2021. In A.I. years, that's a lifetime. If you've ever unlocked your phone with your face, or verified your identity on an app by blinking at a camera, this is the same idea — just at national scale. The question running through this whole story is simple. When a country of fourteen million people raises the bar for what counts as proof that a real human is on the other end of a camera — does that become the new standard everywhere?
So what exactly is Australia asking for? According to Biometric Update, the Australian Taxation Office published what's called a Request for Information — basically, a formal call to the industry saying, "Show us what you've got." They want a cloud-delivered liveness detection system that can handle ten thousand identity checks every hour. Each one finishing in under a second. That's the kind of speed and volume you'd need in a system where millions of people are logging in to file taxes, access benefits, or prove who they are to a government agency.
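For a rough sense of what those numbers actually demand, here's a back-of-envelope capacity check. The figures come from the RFI as reported; the uniform-arrival model is a simplifying assumption, since real login traffic is bursty:

```python
# Back-of-envelope check on the RFI's throughput and latency figures.
# Assumes checks arrive uniformly and each occupies one worker for its
# full latency budget -- a simplification for illustration only.

checks_per_hour = 10_000   # required throughput from the RFI
max_latency_s = 1.0        # each check must finish in under a second

arrivals_per_second = checks_per_hour / 3600
# Little's law: average checks in flight = arrival rate x time in system
concurrent_checks = arrivals_per_second * max_latency_s

print(f"{arrivals_per_second:.2f} checks/second sustained")
print(f"~{concurrent_checks:.1f} checks in flight at any moment")
```

Under those assumptions the sustained load is under three checks per second, which makes the sub-second latency ceiling, not raw volume, the harder engineering constraint.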
Now, liveness detection is different from face matching. Face matching asks, "Do these two photos look like the same person?" Liveness detection asks a harder question — "Is there actually a living human being in front of this camera right now, or is someone holding up a photo, playing a video, wearing a mask, or injecting a deepfake into the feed?" That distinction matters more than it used to. Europol has flagged deepfakes as a tool that organized crime groups are expected to treat as routine. Not exotic. Routine. And face-swap technology has improved fast enough that the old defenses — the ones Australia deployed four years ago — can't keep up.
The new requirements Australia laid out are specific. Any liveness system they adopt has to meet a standard called I.S.O. thirty-one-oh-seven dash three, published in 2023. That's the international benchmark for presentation attack detection — P.A.D. for short. P.A.D. just means the system can catch someone trying to present a fake version of a face. A printed photo. A replayed video on a tablet. A three-D silicone mask. A deepfake streamed through software. And Australia isn't taking vendors at their word. They're requiring what's called Evaluation Assurance Level two certification, verified by a qualified third party. An independent lab has to confirm the system actually does what the vendor claims.
Why does that matter beyond Australia's borders? Because it creates a dividing line. On one side, you've got facial comparison tools that have been independently tested against spoofing attacks and certified to an international standard. On the other side, you've got tools that just say they're accurate. No independent testing. No certification. When those tools produce evidence that ends up in a courtroom, opposing counsel now has a very clear question to ask — "Was this tool certified to the same standard that a G-20 nation requires for its own citizens?" If the answer's no, that evidence just got a lot easier to challenge. And that doesn't just affect investigators or lawyers. It affects anyone whose face could end up in a database, which at this point is most of us.
There's another layer worth paying attention to. According to research published by Keyless, liveness detection on its own isn't enough. Systems that combine liveness checks with a second factor — like verifying the specific device being used — are significantly harder to fool than liveness alone. Australia's move toward that layered approach sets a precedent. A single check — even a good one — no longer clears the bar for high-stakes identity decisions. That's a shift from how a lot of systems still operate today.
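As an illustration of that layered idea, here is a minimal sketch. The names, threshold, and scoring are all hypothetical, not drawn from any real system; the point is only the decision logic: both the liveness check and the device check must pass independently, so a strong liveness score alone never clears the bar.

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    liveness_score: float   # 0.0-1.0 from a hypothetical PAD model
    device_trusted: bool    # e.g. request came from a previously bound device

# Hypothetical threshold; a real deployment would calibrate this against
# false-accept / false-reject rates measured by an independent lab.
LIVENESS_THRESHOLD = 0.90

def verify(result: CheckResult) -> bool:
    """Layered decision: both factors must pass; neither alone is enough."""
    return result.liveness_score >= LIVENESS_THRESHOLD and result.device_trusted

# A near-perfect liveness score on an unknown device still fails.
print(verify(CheckResult(liveness_score=0.97, device_trusted=False)))  # False
print(verify(CheckResult(liveness_score=0.97, device_trusted=True)))   # True
```

The design choice worth noticing is the logical AND: a deepfake good enough to beat the liveness model still has to originate from a device the system already trusts.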
The Bottom Line
The part most people miss is this — Australia didn't upgrade because something went wrong. They upgraded because the attackers improved. And that means every system that hasn't kept pace just fell behind a standard it didn't even know was coming.
So — a country that serves fourteen million people through digital identity just said face matching alone isn't good enough anymore. They want proof that a real person is actually there, tested by an independent lab, against an international standard. That bar is now visible to every courtroom, every regulator, and every vendor in the world. Whether you're building a case or just scanning your face to log in to an app, the definition of "verified" just changed. The full story's in the description if you want the deep dive.
