Deepfakes Felony Law in South Dakota Raises the Bar for Photo Evidence | Podcast

This episode is based on our article:

Read the full article →

Full Episode Transcript


South Dakota's governor just signed a law making deepfake pornography a felony. Creating it, sharing it — both now carry serious prison time. And that single bill exposes a gap most people haven't noticed yet.



This isn't just one state cracking down

According to South Dakota Searchlight, the governor signed the bill on March 17, 2026. That same week, Brazil's mandatory age verification law went live. A few days earlier, Australia started requiring adult sites to verify users through facial scans or government I.D. Three countries, three laws, one message — governments are racing to regulate synthetic media and biometric identity before the technology outruns them. But can prosecutors actually prove a deepfake is a deepfake in court?

Start with the detection tools themselves. According to researchers published through N.I.H., current deepfake detection systems hover around eighty percent accuracy. That means roughly one in five fakes slips through. Worse, many of these tools can't explain their own reasoning. They spit out a confidence score — say, ninety-two percent likely fake — but they don't show the steps that got them there. A defense attorney doesn't need to prove an image is real. They just need to prove the tool that flagged it can't explain itself. And that's enough to create reasonable doubt.

Now zoom out. According to Gartner's projection cited by Deep Media, nearly a third of enterprises won't trust identity verification that relies on face biometrics alone by the end of this year. Not because biometrics don't work — because deepfakes have made single-layer verification unreliable. So governments are pushing harder. Australia's law forced platforms like Pornhub to either scan users' faces or block access entirely. Pornhub chose to block Australian users altogether. According to Yahoo Tech, Brazil's law triggered a two-hundred-fifty percent spike in V.P.N. sign-ups almost overnight. People didn't comply — they routed around the requirement.


The Bottom Line

So what does that leave investigators with? Manual visual comparison — just looking at two photos side by side — is becoming legally indefensible. Courts increasingly want documented methodology. They want to see records of how a match was made, not just that someone eyeballed it. A facial comparison showing measured distance analysis with explicit confidence thresholds holds up. An A.I. detector that says "probably fake" without showing its work doesn't.
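To make the contrast concrete, here is a minimal sketch of what "documented methodology" can look like in practice. It assumes hypothetical face embeddings (toy vectors standing in for the output of a real face-recognition model) and an assumed decision threshold; the point is not the numbers but that every comparison produces an auditable record — the measured distance, the threshold applied, and the resulting decision — instead of a bare confidence score.

```python
# Sketch only: hypothetical embeddings and threshold, illustrating an
# auditable distance-based comparison rather than any specific product's method.
import math

MATCH_THRESHOLD = 0.6  # assumed value; real thresholds come from validation data

def euclidean_distance(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def compare_faces(emb_a, emb_b, threshold=MATCH_THRESHOLD):
    """Return a record of the comparison: measured distance, the threshold
    applied, and the decision - an audit trail a court can examine."""
    dist = euclidean_distance(emb_a, emb_b)
    return {
        "distance": round(dist, 4),
        "threshold": threshold,
        "decision": "match" if dist <= threshold else "no match",
    }

# Toy embeddings standing in for model output
probe = [0.12, 0.48, 0.33]
candidate = [0.10, 0.50, 0.35]
print(compare_faces(probe, candidate))
```

The record is reproducible: given the same embeddings and threshold, anyone can rerun the comparison and arrive at the same decision, which is exactly what an unexplained "probably fake" score cannot offer under cross-examination.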

The real tension isn't between real and fake images. It's that lawmakers are criminalizing deepfakes faster than anyone can reliably detect them in a courtroom.

Governments worldwide are making synthetic media a serious crime. The detection tools aren't ready for cross-examination. And millions of users are dodging biometric mandates with V.P.N.s instead of complying. The professionals who'll come out ahead are the ones documenting their analysis process now — before a judge asks them to. The written version goes deeper — link's below.
