CaraComp Podcast

Why Must 1.4 Million Women Scan Their Faces to Hand Out Rice?

This episode is based on our article of the same name.

Full Episode Transcript


In India, about 1.4 million women — most of them earning less than two dollars a day — now have to scan their own faces on a government app before they're allowed to hand out bags of rice to pregnant mothers and malnourished children. If the scan fails, the food doesn't move. And the worker gets a warning letter.


These women are called Anganwadi workers

These women are called Anganwadi workers. They run India's network of rural nutrition centers — feeding some of the poorest families in the world under a program called POSHAN 2.0. Last July, India's Ministry of Women and Child Development told every one of them that distributing take-home rations now requires facial recognition and biometric verification — for the worker and the person receiving the food. No successful face scan, no rations. That rule applies even when the worker personally knows every single beneficiary by name.

And if you've ever tried to use a finicky app on a weak cell signal, you already know where this is heading. According to field reports, the system is roughly ninety percent ineffective in areas with poor rural connectivity. So the question at the center of a case now before the Karnataka High Court is simple: why does handing out rice require a face scan that almost never works?

The challenge was filed on April 23, 2026, demanding that the Indian government justify the mandate. And the details from the ground tell you why.

Each verification — the biometric check tied to India's national I.D. system, Aadhaar, plus a one-time password, plus a facial scan with liveness detection — takes about twenty minutes per person. One beneficiary. Twenty minutes. Anganwadi workers say the extra authentication adds two to three hours to their daily workload. That's two to three hours they used to spend actually visiting homes, checking on children's growth, talking to new mothers. Now they're standing in front of a phone, waiting for a loading screen.

The system demands three things to go right

And the system demands three things to go right at the same time: the beneficiary needs a smartphone or an Aadhaar-linked mobile number, the government's Poshan app has to be functioning, and the facial scan has to match. In rural India, all three fail regularly. When they do, the worker can't override it. She can be standing face to face with a pregnant woman she's known for years, and the app says no, and the food stays locked. That's not a glitch. That's the design.

As of last August, about three quarters of beneficiaries had completed the biometric enrollment. Which means roughly one in four — the hardest to reach, the least connected — still hadn't. Those are the families most likely to need the rations. And they're the ones the system is most likely to exclude.

The All India Federation of Anganwadi Workers and Helpers has publicly condemned the mandate. They argue it violates India's National Food Security Act, which guarantees access to nutrition. India's own Aadhaar Act actually requires that alternative means be provided whenever biometric authentication fails or isn't available. The mandate ignores that requirement. Workers who can't get the app to cooperate don't get a workaround. They get show-cause notices — formal warnings that can cost them their jobs.


What did the government say this was for?

What did the government say this was for? Preventing duplication and leakages. Fraud, in other words. But the Pulitzer Center spoke with more than fifteen Anganwadi workers across three Indian states, and none of them reported evidence that the system had caught significant fraud. The Internet Freedom Foundation ran a proportionality analysis and reached a blunt conclusion: a system built to catch marginal fraud is excluding the very people it was designed to protect.

The constitutional challenge rests on two pillars of Indian law. Article Fourteen — the right to equality. Article Twenty-One — the right to life and dignity. When a biometric system becomes the only gateway to food — which is a constitutional right in India — and that system predictably fails for the most disadvantaged users, the court has to ask whether the technology serves the right or blocks it.

This isn't the first time a government has tried mandatory facial recognition on a vulnerable population and had courts push back. In 2019, Sweden fined a municipality for using facial recognition to track student attendance. The regulator called it disproportionate. The Netherlands struck down an entire welfare-fraud detection system for violating privacy and disproportionately targeting vulnerable groups. India's case is larger in scale — 1.4 million workers, tens of millions of beneficiaries — but the legal logic is the same. When the cost of a false rejection is someone going hungry, the standard for deploying that technology has to be higher.


The Bottom Line

And there's a layer beneath the ration distribution that makes this even more significant. The facial recognition isn't just verifying that food reached the right person. It's functioning as a mandatory check-in. Beneficiaries who don't periodically authenticate can be deleted from the welfare rolls entirely — regardless of whether they actually received food. That turns a nutrition program into a surveillance system. Accountability packaged as efficiency.

The government framed this as a tool to fight fraud. But the fraud it was built to catch hasn't materialized, and the harm it's causing is live and documented. When the cure is worse than the disease, the question isn't whether the technology works — it's who it's really working for.

India made face scans mandatory for distributing food to its poorest families. The system fails most of the time in the places that need it most. And the workers and mothers paying the price had no say in the decision. Whether you build biometric systems or you just unlock your phone with your face every morning, this case draws a line. A face scan that stands between a mother and a meal isn't verification. It's a gate. The full story's in the description if you want the deep dive.