Deepfakes Felony Law in South Dakota Raises the Bar for Photo Evidence | Podcast
This episode is based on our article:
Read the full article → Deepfakes Felony Law in South Dakota Raises the Bar for Photo Evidence
Full Episode Transcript
South Dakota's governor just signed a law making deepfake pornography a felony. Creating it, sharing it — both now carry serious prison time. And that single bill exposes a gap most people haven't noticed yet.
This isn't just one state cracking down
This isn't just one state cracking down. According to South Dakota Searchlight, the governor signed the bill on March 17, 2026. That same week, Brazil's mandatory age verification law went live. A few days earlier, Australia started requiring adult sites to verify users through facial scans or government I.D. Three countries, three laws, one message — governments are racing to regulate synthetic media and biometric identity before the technology outruns them. But can prosecutors actually prove a deepfake is a deepfake in court?
Start with the detection tools themselves. According to researchers published through N.I.H., current deepfake detection systems hover around eighty percent accuracy. That means roughly one in five fakes slips through. Worse, many of these tools can't explain their own reasoning. They spit out a confidence score — say, ninety-two percent likely fake — but they don't show the steps that got them there. A defense attorney doesn't need to prove an image is real. They just need to prove the tool that flagged it can't explain itself. And that's enough to create reasonable doubt.
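The "one in five slips through" figure understates the courtroom problem, because what matters to a defense attorney is a different number: given that a tool flagged an image as fake, how likely is it actually fake? A minimal Bayes sketch shows why, assuming (these are illustrative assumptions, not figures from the article) that eighty percent accuracy applies as both sensitivity and specificity, and that one image in ten under examination is actually fake:

```python
# Sketch: what "80% accurate" can mean for a single flagged image.
# Assumptions (illustrative, not from the article): accuracy holds as both
# sensitivity and specificity, and 10% of examined images are fake.

def flag_precision(sensitivity, specificity, prevalence):
    """Probability that an image flagged 'fake' really is fake (Bayes' rule)."""
    true_positives = sensitivity * prevalence          # fakes correctly flagged
    false_positives = (1 - specificity) * (1 - prevalence)  # real images flagged
    return true_positives / (true_positives + false_positives)

p = flag_precision(0.80, 0.80, 0.10)
print(f"Chance a 'fake' flag is correct: {p:.0%}")  # roughly 31% under these assumptions
```

Under those assumed numbers, a "fake" flag is correct only about three times in ten, even though the tool is "80% accurate." That gap between headline accuracy and per-image reliability is exactly the opening cross-examination exploits.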
Now zoom out. According to Gartner's projection cited by Deep Media, nearly a third of enterprises won't trust identity verification that relies on face biometrics alone by the end of this year. Not because biometrics don't work — because deepfakes have made single-layer verification unreliable. So governments are pushing harder. Australia's law forced platforms like Pornhub to either scan users' faces or block access entirely. Pornhub chose to block Australian users altogether. According to Yahoo Tech, Brazil's law triggered a two-hundred-fifty percent spike in V.P.N. sign-ups almost overnight. People didn't comply — they routed around the requirement.
The Bottom Line
So what does that leave investigators with? Manual visual comparison — just looking at two photos side by side — is becoming legally indefensible. Courts increasingly want documented methodology. They want to see records of how a match was made, not just that someone eyeballed it. A facial comparison showing measured distance analysis with explicit confidence thresholds holds up. An A.I. detector that says "probably fake" without showing its work doesn't.
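The documented, measurable comparison described above can be sketched in a few lines: measure a distance between two face embeddings against a threshold declared in advance, and record every step rather than emitting a bare verdict. The 128-dimension embeddings and the 0.6 distance threshold here are illustrative assumptions, not values from the article or any specific tool:

```python
# Sketch of a documented facial-comparison methodology: measured distance,
# explicit pre-declared threshold, and a record of how the match was made.
# The 128-d embeddings and 0.6 threshold are illustrative assumptions.
import math

MATCH_THRESHOLD = 0.6  # declared before analysis, not tuned after the fact

def euclidean_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def compare_faces(embedding_a, embedding_b):
    """Return a report a court can examine, not just a yes/no verdict."""
    distance = euclidean_distance(embedding_a, embedding_b)
    return {
        "method": "Euclidean distance on face embeddings",
        "embedding_dimensions": len(embedding_a),
        "threshold": MATCH_THRESHOLD,
        "measured_distance": round(distance, 4),
        "match": distance <= MATCH_THRESHOLD,
    }

report = compare_faces([0.0] * 128, [0.1] * 128)
print(report)  # distance ~1.13, above threshold, so no match
```

The point of the report structure is the paragraph's point: every field is something an expert can defend under cross-examination, which a bare "probably fake" score is not.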
The real tension isn't between real and fake images. It's that lawmakers are criminalizing deepfakes faster than anyone can reliably detect them in a courtroom.
Governments worldwide are making synthetic media a serious crime. The detection tools aren't ready for cross-examination. And millions of users are dodging biometric mandates with V.P.N.s instead of complying. The professionals who'll come out ahead are the ones documenting their analysis process now — before a judge asks them to. The written version goes deeper — link's below.
Ready to try AI-powered facial recognition?
Match faces in seconds with CaraComp. Free 7-day trial.
Start Free Trial

More Episodes
27 Million Gamers Face Mandatory ID Checks for GTA 6 — Your Cases Are Next
A 0.78 Match Score on a Fake Face: How Facial Geometry Stops Deepfake Wire Scams

Deepfakes Force New Identity Rules — And Investigators' Evidence Is on the Line
