
'Prove It's Not a Deepfake': The Evidence Challenge Most Investigators Will Lose


This episode is based on our article: 'Prove It's Not a Deepfake': The Evidence Challenge Most Investigators Will Lose

Read the full article →

Full Episode Transcript


An N.B.C. News investigation searched the names of thirty-six well-known female celebrities on Google and Bing. On Google, thirty-four of those searches returned nonconsensual deepfake pornography right at the top of the results. On Bing, it was thirty-five out of thirty-six. Those images were never real. The people in them never consented. And the biggest search engines on the planet served them up like they were just another result.


That story alone would be enough to talk about. But it's a doorway into something much bigger — something that touches anyone who's ever taken a photo, appeared on a security camera, or sent a video to a friend. Because the same technology that generates fake images of celebrities can generate fake evidence in a courtroom. And right now, federal rulemakers are debating how to deal with that. The Advisory Committee on Evidence Rules is considering a proposed addition — Rule 901(c) — designed specifically for what they call "potentially fabricated or altered electronic evidence." If adopted, possibly by twenty-twenty-six, any photo or video challenged as a deepfake would require the person who submitted it to prove it's authentic — or watch it get thrown out. That's a complete reversal of how evidence has worked for decades. So the question running through today's episode is this: in a world where any image can be faked, who has to prove what's real?

Start with how things work now. Today, if a prosecutor or a plaintiff puts a photograph into evidence, the other side can object. But the bar for getting that photo admitted is pretty low. You basically need testimony that the image fairly and accurately represents what it claims to show. That standard was written for a world of film cameras and darkrooms — not a world where a laptop and a free app can generate a photorealistic image of something that never happened. Under the proposed rule, the process would work in two steps. First, the party challenging the evidence would need to present actual grounds for believing it's fabricated. A bare assertion — just saying "that could be a deepfake" — wouldn't be enough. But once that threshold is met, the burden flips. The side that submitted the evidence would then have to demonstrate its authenticity by a preponderance of the evidence. That's a higher standard than what courts currently require. It means "more likely than not this is real" — backed by documentation, not just someone's word.

And this isn't hypothetical. Courts have already bumped into this problem. According to an analysis published in the Berkeley Technology Law Journal, the Kyle Rittenhouse trial and the Huang versus Tesla case both involved challenges to video evidence on the grounds that it might have been altered or manipulated. In both instances, the courts had to improvise — because no clear rule existed for handling that kind of challenge. A separate report from the University of Colorado noted that in Alameda County, California, a court actually threw out testimony tied to deepfake material. These aren't edge cases anymore. They're early signals.

Now, what does "proving it's real" actually look like in practice? It means provenance — a full record of what was captured, when it was captured, where, and by whom. Timestamped metadata. A documented chain of custody showing who handled the file and what happened to it at every step. The Scientific Working Group on Digital Evidence — S.W.G.D.E. — already publishes best practices for exactly this kind of documentation. But according to the article's reporting, solo investigators routinely skip those steps. No log. No timestamp trail. No structured workflow. That gap between the standard and the practice is where evidence gets excluded. And it's not just a problem for detectives or forensic examiners. If you've ever screenshotted a threatening message, saved a video from social media, or forwarded an image to a lawyer — you've handled digital evidence. And none of it came with a provenance trail.
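
To make that concrete, here is a minimal sketch, in Python, of what a basic provenance record could look like: hash the file the moment it's received, then append a timestamped entry every time someone handles it. The file names, field names, and log format are illustrative assumptions for this episode, not taken from the S.W.G.D.E. best-practice documents themselves.

```python
# Minimal, hypothetical sketch of a provenance / chain-of-custody record.
# Field names and the log format are illustrative assumptions, not an
# official specification.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def file_sha256(path: Path) -> str:
    """Hash the file so later copies can be checked against the original."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def log_custody_event(evidence: Path, handler: str, action: str,
                      log_path: Path = Path("custody_log.jsonl")) -> dict:
    """Append one timestamped entry: who touched the file, what they did, and its hash."""
    entry = {
        "file": evidence.name,
        "sha256": file_sha256(evidence),
        "handler": handler,
        "action": action,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    with log_path.open("a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return entry


if __name__ == "__main__":
    # Example: a screenshot saved from a phone, logged at the moment it is received.
    sample = Path("threat_screenshot.png")
    sample.write_bytes(b"placeholder image bytes")  # stand-in so the sketch runs end to end
    print(log_custody_event(sample, handler="J. Doe", action="received from complainant"))
```

The point isn't the specific code; it's that each hash-plus-timestamp entry gives you something to point to when someone asks whether the file changed between the day it was captured and the day it reached a courtroom.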


The Bottom Line

There's also a cost problem that doesn't get enough attention. According to guidance from the National Association for Presiding Judges, litigants may need to hire digital forensics experts just to verify — or debunk — a single piece of evidence. That's expensive. Smaller law firms, under-resourced prosecutors, legal aid organizations — they don't have budget lines for that. So the rule designed to protect the integrity of evidence could also widen the gap between who can afford to prove their case and who can't. Some judges may delay adopting the new framework altogether, which could create two or three years of inconsistency — where the rules in one courtroom don't match the rules in another.

The deepfake problem isn't just that fake evidence might get in. It's that real evidence might get thrown out. Any party can now wave the word "deepfake" at a photograph and force the other side to spend time and money proving it's genuine. The technology doesn't just create fakes. It poisons trust in everything that's real.

So — the short version. Deepfakes aren't just a social media problem or a celebrity problem. They've reached the courtroom. Federal rulemakers are drafting new standards that would force anyone submitting a photo or video to prove it hasn't been fabricated — and that shift could take effect within two years. Whether you investigate cases for a living or you just have a phone full of photos you assume are real, the rules around what counts as proof are changing underneath all of us. The full story's in the description if you want the deep dive.
