
Deepfake Injection Attacks Jumped 783% — and Single-Factor Biometrics Still Dominate KYC


An AI tool called JINKUSU CAM is actively bypassing Know Your Customer checks at Binance, Coinbase, Kraken, and OKX right now. Not theoretically. Not in a lab. Right now, in the real world, using real-time facial mesh tracking to map synthetic expressions onto live verification streams. Live Bitcoin News broke the details — and if you're in fraud investigation, identity verification, or compliance, you should feel deeply uncomfortable reading them.

TL;DR

Deepfake tools are now specifically engineered to defeat biometric KYC — injection attacks surged 783% in 2024 — and investigators who treat a single biometric match as proof of identity are walking into a trap regulators built for them.

Here's the maddening part. While JINKUSU CAM and tools like it are quietly dismantling first-generation identity verification, regulators across four continents are doubling down on biometrics as the answer to fraud. Egypt's Ministry of Interior just launched a biometric app for online government services. Punjab rolled out biometric vehicle verification. Paytm added biometric checks for UPI transactions. The world is adding more biometric gates — at the exact moment fraudsters have a master key.

That's not a security strategy. That's a coordinated march toward a cliff nobody mapped.


The Numbers That Should Terrify Every KYC Team

Let's talk about scale, because the scale here is genuinely staggering.

783%
increase in deepfake injection attacks against biometric verification systems in 2024 alone
Source: World Economic Forum Cybercrime Atlas, January 2026

That 783% figure comes from the World Economic Forum's January 2026 Cybercrime Atlas, which examined 17 face-swapping tools and eight camera injection tools in active circulation. The finding? Most of them could bypass standard biometric onboarding checks. Not some. Most. And then the year-on-year acceleration continued — Jumio recorded an 88% rise in injection attack attempts through 2025. This isn't a trend line going up gradually. This is a near-vertical spike.

At a single unnamed financial institution, analysts recorded 8,065 attempts to defeat liveness checks using AI-generated deepfake images between January and August 2025. Eight months. One institution. Over eight thousand attempts. Meanwhile, in Indonesia, one financial firm was hit with more than 1,100 coordinated deepfake attacks using over 1,000 fraudulent accounts spread across 45 mobile devices — showing that this isn't disorganized opportunism. It's scaled, methodical fraud infrastructure.

And the cost of entry? Embarrassingly low. Biometric Update's coverage of the Deepfake-as-a-Service market puts custom deepfakes at $10–$50 per clip. Pre-made synthetic identities run about $15. According to Sumsub, AI-generated fake IDs cost as little as $15 to produce. The barrier to mounting a sophisticated KYC attack is now lower than a monthly Netflix subscription.


The Core Problem Nobody Wants to Say Out Loud

Biometric verification systems ask one question: "Does this face match the record on file?" They don't ask the question that actually matters for fraud prevention: "Is there a real, live human being presenting this face right now?"

That distinction is everything. Injection attacks don't walk in front of a camera and try to fool it with a printed photo — that's 2018 fraud. Modern attacks bypass the camera entirely, injecting manipulated video directly at the API layer. The system never sees the real world. It sees only what the attacker feeds it, and what the attacker feeds it passes every check the system is designed to run.

"Synthetic faces and virtual camera tools with API injection techniques can defeat all components of eKYC system security, and criminals are actively combining these methods to get through live KYC verification at financial institutions worldwide." — Technical analysis via Facia AI
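One countermeasure implied by that quote is cryptographic provenance on the capture path itself: if every frame is signed at the moment of trusted capture, a stream injected at the API layer arrives without valid signatures, no matter how convincing the pixels are. Here is a minimal Python sketch of that idea — the capture-side key handling and SDK are hypothetical illustrations, not any vendor's actual protocol:

```python
import hashlib
import hmac
import os

# Hypothetical shared secret provisioned to the genuine capture SDK.
# Real deployments would use per-session keys in hardware-backed storage.
CAPTURE_KEY = os.urandom(32)

def sign_frame(frame_bytes: bytes, nonce: bytes) -> bytes:
    """Capture side: bind each frame to a server-issued nonce at the
    moment it leaves the trusted camera pipeline."""
    return hmac.new(CAPTURE_KEY, nonce + frame_bytes, hashlib.sha256).digest()

def verify_frame(frame_bytes: bytes, nonce: bytes, tag: bytes) -> bool:
    """Server side: a frame injected downstream of the camera never passed
    through the trusted capture path, so it carries no valid tag."""
    expected = hmac.new(CAPTURE_KEY, nonce + frame_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

nonce = os.urandom(16)
frame = b"\x00" * 1024                 # stand-in for raw camera frame bytes
tag = sign_frame(frame, nonce)

genuine_ok = verify_frame(frame, nonce, tag)            # True: trusted capture
injected_ok = verify_frame(b"\xff" * 1024, nonce, tag)  # False: injected frame
```

The point isn't this exact scheme — it's that provenance checks operate on *where* the data entered the pipeline, which is precisely the question pixel-level liveness detection never asks.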

Detection technology has improved, sure. Advanced frameworks achieve 97% accuracy under controlled lab conditions — impressive numbers that researchers are justifiably proud of. But lab conditions and real-world deployment are entirely different beasts. In the lab, the attacker doesn't get to iterate for months against your specific system. In the lab, injection attacks aren't rerouting the data stream before it reaches the detector. Real-world performance degrades sharply once attackers study the system, and attackers have every incentive to study the system because the payoff is account access at a major crypto exchange.

According to Signzy, Gartner predicts that by 2026, 30% of enterprises will no longer consider face biometric verification reliable when used as a standalone check. Think about that timeline. We're not talking about a distant theoretical future — we're talking about a threshold Gartner thinks we'll cross within the year. And yet compliance teams are still treating the selfie-plus-ID-upload as the finish line of their verification process.

Why This Should Change How You Work Right Now

  • Single biometric checks are now a weak signal — not a verification gate. Treating them as the latter means you've already lost before the investigation starts.
  • Injection attacks bypass the camera, not just fool it — standard liveness detection doesn't catch API-layer manipulation. Your detection framework needs to account for where data enters the pipeline, not just what it looks like.
  • Deepfake-as-a-Service has commoditized synthetic identity fraud — the skill floor just dropped to zero. Every investigator working remote onboarding cases should assume the baseline threat level is now what used to require nation-state resources.
  • Corroborating evidence matters more than it ever has — device fingerprinting, metadata consistency, behavioral signals during onboarding, and cross-photo facial comparison across multiple images catch what a single liveness check cannot.
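To make the "weak signal, not a gate" framing concrete, here's a toy Python sketch of combining those corroborating signals into a single risk score. The signal names and weights are illustrative assumptions — real thresholds would be tuned on actual case data:

```python
from dataclasses import dataclass

@dataclass
class OnboardingSignals:
    face_match: float          # 0..1 similarity from the biometric check
    device_trust: float        # 0..1 from device fingerprinting
    metadata_consistent: bool  # EXIF / stream metadata coherence
    behavior_normal: bool      # interaction timing during onboarding

def risk_score(s: OnboardingSignals) -> float:
    """Weighted sum of weaknesses; higher = riskier. The biometric match
    contributes, but can never clear an applicant on its own."""
    score = 0.0
    score += 0.35 * (1.0 - s.face_match)
    score += 0.25 * (1.0 - s.device_trust)
    score += 0.20 * (0.0 if s.metadata_consistent else 1.0)
    score += 0.20 * (0.0 if s.behavior_normal else 1.0)
    return score

clean = OnboardingSignals(0.97, 0.90, True, True)
# A deepfake can produce a near-perfect face match while every
# contextual signal around it is wrong:
spoofed = OnboardingSignals(0.99, 0.20, False, False)
```

Note what happens: the spoofed applicant has the *better* face match, yet scores far riskier, because the ensemble weighs the context the deepfake cannot fake.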


What Sharp Investigators Are Actually Doing

The investigators ahead of this problem aren't waiting for regulators to catch up. They've already internalized that facial biometrics are one input signal among many — and arguably the easiest one to spoof at scale.

Cross-photo facial comparison is where serious verification work happens now. Not "does this selfie match this document photo" — but systematic comparison across multiple images from different sources, checking Euclidean distance between facial landmarks, consistency in lighting physics across frames, expression coherence across a sequence. Deepfakes can defeat one detector. They struggle to defeat five simultaneously when those five are examining different aspects of image physics, metadata, and behavioral sequencing. According to Identity.com, there are over 2,000 face-swap tools globally — with at least 47 specifically designed to defeat KYC systems. Forty-seven tools purpose-built for this exact attack vector. That's not a niche threat.
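The pairwise logic described above can be sketched in a few lines of Python. The embeddings here are toy 3-d vectors purely for illustration (production systems use 128-d or 512-d embeddings from a trained face model), and the 0.6 threshold is an assumed placeholder:

```python
import math
from itertools import combinations

def euclidean(a, b):
    """Plain Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cross_photo_check(embeddings, threshold=0.6):
    """Compare every pair of source images, not just selfie-vs-document.
    A deepfake tuned to match one reference often drifts from the others,
    so a single outlier pair flags the whole identity for review."""
    pairs = combinations(range(len(embeddings)), 2)
    distances = {(i, j): euclidean(embeddings[i], embeddings[j])
                 for i, j in pairs}
    return all(d <= threshold for d in distances.values()), distances

# Toy embeddings from three sources of the "same" person.
doc_photo   = [0.10, 0.20, 0.30]
selfie      = [0.12, 0.19, 0.31]   # close to the document photo
prior_photo = [0.90, 0.10, 0.50]   # drifts badly from both others

consistent, dists = cross_photo_check([doc_photo, selfie, prior_photo])
```

Here `consistent` comes back `False`: the selfie matches the document photo cleanly, but the prior account photo disagrees with both — exactly the kind of inconsistency a single selfie-vs-ID check never surfaces.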

The Axios newsroom compromise — where a journalist was socially engineered via an AI deepfake trap — showed that even media-savvy professionals can be defeated by well-executed synthetic identity fraud. When journalists are getting burned, the assumption that compliance teams are routinely catching these attempts in real-time KYC flows starts to look extremely optimistic.

At CaraComp, the approach our investigators use doesn't treat any single biometric event as dispositive. Facial comparison across multiple source images, combined with behavioral metadata and liveness signals, provides the kind of layered scrutiny that a $15 synthetic identity simply cannot survive. The deepfake can fool the selfie check. It cannot simultaneously fool a multi-image comparison engine analyzing pixel-level consistency, a device fingerprint validator, and a behavioral anomaly flag — all running in parallel.

That's the practical difference between using biometrics as a truth source and using them as one instrument in a larger detection ensemble.


The Regulatory Contradiction Nobody Is Addressing

Here's a genuinely absurd situation playing out in parallel: institutions are being sued for collecting biometric data without consent — Coinbase faces BIPA litigation over its facial scan collection during KYC, with potential penalties reaching into the millions — while simultaneously being pushed by regulators to expand biometric-based verification as the solution to the very fraud those biometrics enable.

So the compliance mandate is: collect more biometrics, or get fined for collecting biometrics. And by the way, those biometrics can be defeated with a $15 synthetic identity. Fantastic system, everyone.

The companies measuring fraud effectiveness with outdated KPIs are also getting caught flat-footed here. Regula's research found that many organizations are still tracking fraud metrics designed for the previous generation of attacks — metrics that don't account for injection attack rates, synthetic identity proliferation, or the failure modes specific to AI-generated media. You can't optimize for a threat you're not measuring.

Key Takeaway

If you're still treating a single biometric match as the finish line for KYC, you're aligning your defenses with yesterday's threat model. The investigators who will actually stop deepfake-driven fraud in 2026 are the ones who treat biometrics as one weak signal in a broader, layered verification strategy — and who start measuring injection attacks and synthetic identities as first-class risks, not edge cases.

Ready for forensic-grade facial comparison?

2 free comparisons with full forensic reports. Results in seconds.

Run My First Search