CaraComp
ICE's New 'Google Maps' for People: Confidence Score, Wrong Neighborhood, Real Consequences

This episode is based on our article: ICE's New 'Google Maps' for People: Confidence Score, Wrong Neighborhood, Real Consequences.
Full Episode Transcript


An I.C.E. official sat before Congress, described a tool the agency uses to find people, and compared it to Google Maps. Not Google Maps for directions. Google Maps for people. The official said the system shows neighborhoods where a person might be located. Then, under oath, admitted something else — the tool could be wrong, even when it expressed high confidence.


That tool is part of a sprawling surveillance system built by Palantir for the Department of Homeland Security. And thirty members of Congress — led by Representatives Dan Goldman and Nydia Velázquez and Senator Ron Wyden — just sent a letter demanding answers about how it works, who it tracks, and what limits exist. Their deadline is April 24, 2026.

If you've ever had your photo taken in public, if you carry a phone, if you've ever been on a video call — this story is about what happens when a government builds a confidence score around your identity and your location, and then acts on it in the field. That knot you might feel hearing this isn't irrational. These systems were originally built for immigration enforcement. According to reporting from Biometric Update, facial recognition, biometric scanning, and social media monitoring — once justified for tracking noncitizens — are now being used to identify and investigate U.S. citizens.

So the question threading through all of this is simple. When a machine says it's confident about who you are and where you live, what burden of proof protects you from becoming a target?

Start with the system itself. D.H.S. uses a platform called FALCON. Palantir built it for I.C.E.'s Homeland Security Investigations division. FALCON doesn't just search one database. It integrates and searches across dozens of government and commercial datasets at once. It links with forensic phone tools. It has a mobile app that supports G.P.S. tracking, secure messaging, and real-time field interview reporting. An agent in the field can pull up a person's connected identifiers — addresses, phone numbers, devices — and generate leads on the spot. That's a fundamental shift from how biometric systems used to work. The old model was a centralized database lookup. You submitted a photo, waited for results, then an analyst reviewed the match. This is different. This is real-time, field-actionable matching — probabilistic targeting at scale. For anyone who's ever wondered whether the camera at the grocery store or the airport could somehow follow you home, this is the architecture that makes that technically possible.
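FALCON's internals are not public, so take this only as a hypothetical sketch of what "searches across dozens of datasets at once and links identifiers" means architecturally. Every name, source, and field below is invented for illustration; the point is just that one query fans out to many sources and the hits get merged into a single linked profile:

```python
# Purely hypothetical sketch of a federated identifier search.
# FALCON's actual design is not public; all names and fields are invented.
from collections import defaultdict

# Stand-ins for separate government and commercial datasets.
MOCK_SOURCES = {
    "dmv":     [{"name": "J. Doe", "address": "123 Elm St"}],
    "telecom": [{"name": "J. Doe", "phone": "555-0100"}],
    "utility": [{"name": "J. Doe", "address": "123 Elm St", "account": "A-77"}],
}

def federated_lookup(name):
    """Query every source at once and merge hits into one linked profile."""
    profile = defaultdict(set)
    for source, records in MOCK_SOURCES.items():
        for rec in records:
            if rec.get("name") == name:
                for field, value in rec.items():
                    if field != "name":
                        profile[field].add(value)
                profile["sources"].add(source)
    # Sort values so the merged profile is deterministic.
    return {field: sorted(values) for field, values in profile.items()}

print(federated_lookup("J. Doe"))
```

A single call returns the person's address, phone, account number, and which datasets they appeared in — the "connected identifiers" an agent could pull up in the field, without ever visiting any one database directly.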

Now look at the money. D.H.S. put a one-billion-dollar ceiling on a single blanket purchase agreement with Palantir. That agreement went into effect in February. A billion dollars. That's not a pilot program someone might cancel next quarter. That's infrastructure. It signals that the federal government is building this capability to last.


Then there's the tool the I.C.E. official compared to Google Maps — an application called ELITE. ELITE doesn't give agents a specific address and say, "This person is here." It gives them a neighborhood. An area. A zone of probability. And it attaches a confidence score. That distinction matters enormously. A confidence score is not an identity confirmation. It's a probability. But when agents see a high number on a screen, the human instinct is to trust it.

What does it mean for the people who happen to live in that neighborhood? If the system is wrong — and the official acknowledged it can be — those people become collateral in someone else's search. For investigators, this raises hard questions about probable cause and whether a confidence score meets the legal threshold for a warrant. For the rest of us, it means a machine's best guess about your block could put federal agents on your street.
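The gap between "high confidence" and "right person" is plain base-rate arithmetic. Here's an illustrative Bayes'-rule calculation — the error rates and gallery size are hypothetical numbers chosen for the example, not ELITE's actual performance:

```python
# Hypothetical numbers -- purely illustrative, not ELITE's real error rates.
def posterior_match_probability(sensitivity, false_positive_rate, prior):
    """P(correct identity | system reports a match), via Bayes' rule."""
    p_match = sensitivity * prior + false_positive_rate * (1 - prior)
    return (sensitivity * prior) / p_match

# Suppose the system flags the right person 99% of the time (sensitivity),
# wrongly flags an innocent person 1% of the time (false positive rate),
# and the person sought is 1 face in a gallery of 100,000.
prior = 1 / 100_000
p = posterior_match_probability(sensitivity=0.99,
                                false_positive_rate=0.01,
                                prior=prior)
print(f"P(right person | match reported) = {p:.4f}")  # roughly 0.001
```

Under these toy numbers, a system that sounds "99% accurate" produces a match that's the right person about one time in a thousand, because innocent faces vastly outnumber the one being sought. That's why a confidence score, by itself, isn't an identity confirmation.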

The congressional letter gets specific about what lawmakers want to know. They're asking whether any D.H.S. analytics tools collect or retain personally identifiable information belonging to U.S. citizens. They want to know what legal authorities D.H.S. relies on to collect and keep that data. And they're asking what safeguards exist to limit how long it's stored and who can access it. Those aren't abstract policy questions. They're asking: does the government have a file on you, and if so, who's watching the people watching you?

Palantir pushes back on this framing. The company's human-rights policy states that customers own their data — Palantir doesn't own, collect, store, or sell personal information outside necessary internal business practices. C.E.O. Alex Karp has argued that critics of I.C.E. should actually want more Palantir-style controls, not fewer, because the platform includes audit logs and permissions that can restrain government work. That's a real argument. Audit trails matter. But a locked door still opens for somebody.


The Bottom Line

The shift everyone should notice isn't about accuracy. It's about speed. When matching moves from a database lookup to field-actionable lead generation, the entire risk model changes. Software that makes enforcement faster, cleaner, and more searchable moves the political fight from whether the state can act to how many names fit on the screen.

So — the short version. The federal government spent a billion dollars on a surveillance platform that lets agents search dozens of databases at once, track people by phone and location, and generate confidence scores about where someone might be — scores that can be wrong even when they look certain. Thirty members of Congress are demanding to know whether that system is collecting data on American citizens, and what rules govern it. Whether you've ever thought about biometrics or not, your photo, your phone, and your neighborhood are already the kind of data these systems consume. Knowing that doesn't have to make you afraid. But it should make you pay attention. The full story's in the description if you want the deep dive.
