Introduction
India, with over 14 lakh anganwadis serving roughly 100 million women and children, increasingly relies on digital welfare tools such as Face Recognition Software (FRS). However, technological determinism risks undermining equity, dignity, and inclusive governance.
Challenges of FRS in Welfare
- Violation of Dignity and Presumption of Guilt: FRS treats women and children as potential fraudsters rather than as beneficiaries with rights, contravening Article 21 (Right to Life with Dignity) and the natural-justice principle of “innocent until proven guilty.”
- Exclusion Errors and Denial of Rights: Network failures, device limitations, and mismatched biometrics result in denial of Take-Home Rations (THR). Similar exclusion was seen in Aadhaar-based PDS, where Jharkhand (2017–18) saw reported starvation deaths linked to biometric authentication failures.
- Governance Deficit and Lack of Consultation: FRS was introduced without dialogue with Anganwadi Workers (AWWs) or communities, contravening the participatory governance principles emphasized by the Second Administrative Reforms Commission (ARC).
- Technological Myopia: FRS ignores the real challenges of poor ration quality, irregular supply, a stagnant budget (₹8 per child per day since 2018), and corruption in contracts, risking the substitution of “techno-solutionism” for systemic reform.
- Surveillance and Privacy Risks: Mandatory face scans blur the line between welfare and policing, especially since FRS is typically used in criminal investigations. The rollout lacks clear safeguards under the Digital Personal Data Protection Act, 2023, raising fears of profiling and misuse.
- Global Standards and Concerns: San Francisco and several EU member states restrict or ban FRS for civil purposes over accuracy and ethical concerns. The UN Special Rapporteur on Extreme Poverty (2019) warned against digital welfare turning into a “digital welfare dystopia.”
Way Forward: Towards Inclusive Technological Governance
- Principle of Proportionality (Puttaswamy judgment, 2017): Tech use must be necessary, least intrusive, and rights-compatible.
- Community-Based Verification: Empower SHGs, local panchayats, and women’s collectives for monitoring—upholding decentralization under NFSA 2013 and SC orders (2004).
- Technology as Enabler, Not Gatekeeper: Introduce offline-first solutions, grievance redressal, and override options for AWWs.
- Transparency and Accountability: Publish fraud data, audit algorithms, and ensure social audits under MGNREGA-type mechanisms.
- Capacity Building: Train AWWs in digital tools, upgrade infrastructure, and ensure adequate devices and connectivity.
- Ethical Tech Charter: Adopt UNESCO’s AI Ethics Recommendation (2021) to ensure fairness, accountability, and human rights compliance in welfare tech.
Conclusion
As Amartya Sen’s Development as Freedom reminds us, technology must expand capabilities, not restrict them. Welfare delivery should prioritize dignity, equity, and inclusion, ensuring that machines serve citizens, not citizens machines.


